WO2018103371A1 - Processing method in video recording and apparatus - Google Patents

Processing method in video recording and apparatus

Info

Publication number
WO2018103371A1
WO2018103371A1 · PCT/CN2017/098515
Authority
WO
WIPO (PCT)
Prior art keywords
camera
imaging
main camera
main
image data
Prior art date
Application number
PCT/CN2017/098515
Other languages
French (fr)
Chinese (zh)
Inventor
高鹏
Original Assignee
ZTE Corporation (中兴通讯股份有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ZTE Corporation (中兴通讯股份有限公司)
Publication of WO2018103371A1 publication Critical patent/WO2018103371A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/73Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene

Definitions

  • the present disclosure relates to, but is not limited to, the field of imaging technology, and in particular, a processing method and apparatus in imaging.
  • High-speed photography is one such technique, sometimes called "bullet time." It captures fast-moving objects with a high-frame-rate camera; because of the high frame rate, a large amount of motion detail is recorded. When played back at the usual 24 FPS (frames per second), a visual slow-motion effect is created.
  • A mobile terminal can already reach 240 frames per second, which is sufficient for shooting everyday motion scenes and achieves a good slow-motion effect on playback. Moreover, the mobile terminal can ensure a high frame rate in different environments by using a fixed imaging sensor and a fixed exposure time during high-speed imaging.
  • However, in a dimly lit scene, because the exposure time cannot be adjusted, the brightness of the image data acquired by the mobile terminal's imaging sensor is low. This directly leads to insufficient brightness in the generated image data, degrading image quality and seriously affecting the user experience.
  • This document provides a processing method and device for video capture that can improve the image quality of high-speed imaging.
  • An embodiment of the present disclosure provides a processing method in imaging, where the method includes:
  • determining a frame rate; determining a first exposure time and a second exposure time according to the frame rate, the first exposure time being the exposure time of the main camera and the auxiliary camera; determining, according to the second exposure time, an interval by which the auxiliary camera's startup is delayed relative to the main camera;
  • starting the main camera to capture, delaying the interval time, starting the auxiliary camera to capture, and storing the image data obtained by the main camera and the auxiliary camera respectively in imaging order.
  • the determining a frame rate includes:
  • a frame rate selection operation is received, and the frame rate used for the current imaging is determined based on the frame rate selection operation.
  • the determining the first exposure time and the second exposure time according to the frame rate includes:
  • twice the reciprocal of the frame rate is determined as the first exposure time, and the reciprocal of the frame rate is determined as the second exposure time.
  • the determining, according to the second exposure time, of the interval by which the auxiliary camera's startup is delayed relative to the main camera includes:
  • a delay time of the main camera is determined, and the difference between the second exposure time and the delay time is determined as the interval time.
  • the storing the image data obtained by the main camera and the auxiliary camera in the imaging sequence respectively includes:
  • the ith frame of image data obtained by the main camera is stored and marked with a first mark, and the ith frame of image data obtained by the auxiliary camera is stored and marked with a second mark; then the (i+1)th frame of image data obtained by the main camera is stored and marked with the first mark, the (i+1)th frame of image data obtained by the auxiliary camera is stored and marked with the second mark, and so on;
  • where i = 1, 2, ..., N, and N is a positive integer.
  • the method further includes:
  • the stored image data is cropped to obtain coincidence image data of the main camera and the secondary camera.
  • the cropping the stored image data to obtain the coincidence image data of the main camera and the auxiliary camera includes:
  • the stored ith frame of image data obtained by the main camera is cropped according to the first crop frame, removing the image data that does not coincide with the ith frame of image data from the auxiliary camera; the stored ith frame of image data obtained by the auxiliary camera is cropped according to the second crop frame, removing the image data that does not coincide with the ith frame of image data from the main camera;
  • the image data retained from the main camera and the image data retained from the auxiliary camera together constitute the coincident image data;
  • where i = 1, 2, ..., N, and N is a positive integer.
  • the determining of the first crop box and the second crop box includes:
  • the first crop box is determined according to the screen size and the coincidence data width of the image captured by the main camera; the second crop box is determined according to the screen size and the coincidence data width of the image captured by the auxiliary camera.
  • the determination of the coincidence data width of the main camera includes: determining the sum of the distance from the imaging position of the subject on the main camera to the right edge of the main camera's imaging screen and the distance from the imaging position of the subject on the auxiliary camera to the left edge of the auxiliary camera's imaging screen as the coincidence data width of the main camera's image;
  • the determination of the coincidence data width of the auxiliary camera includes: determining the sum of the distance from the imaging position of the subject on the auxiliary camera to the left edge of the auxiliary camera's imaging screen and the distance from the imaging position of the subject on the main camera to the right edge of the main camera's imaging screen as the coincidence data width of the auxiliary camera's image.
  • the method further includes:
  • the coincidence imaging data is encoded and displayed.
  • the embodiment of the present disclosure further provides a processing device in imaging, the device comprising:
  • a first determining module configured to: determine a frame rate
  • the second determining module is configured to: determine, according to the frame rate, a first exposure time and a second exposure time, where the first exposure time is an exposure time of the main camera and the auxiliary camera;
  • a third determining module configured to: determine, according to the second exposure time, the interval by which the auxiliary camera's startup is delayed relative to the main camera;
  • a startup module configured to: start the main camera to capture, delay the interval time, start the auxiliary camera to capture, and store the image data obtained by the main camera and the auxiliary camera respectively in imaging order.
  • the apparatus further includes: a cropping module
  • the cropping module is configured to: crop the stored image data to obtain coincidence image data of the main camera and the auxiliary camera.
  • the device further includes: a display module;
  • the display module is configured to: encode and display the coincidence imaging data.
  • the cropping module is further configured to: determine a first cropping frame according to the screen size and the coincidence data width of the image captured by the main camera, and determine a second cropping frame according to the screen size and the coincidence data width of the image captured by the auxiliary camera.
  • the cropping module is further configured to:
  • determine the sum of the distance from the imaging position of the subject on the main camera to the right edge of the main camera's imaging screen and the distance from the imaging position of the subject on the auxiliary camera to the left edge of the auxiliary camera's imaging screen as the coincidence data width of the main camera's image; or determine that same sum as the coincidence data width of the auxiliary camera's image.
  • Embodiments of the present disclosure also provide a computer readable storage medium storing computer executable instructions that, when executed, implement a processing method in the above imaging.
  • The processing method and apparatus determine a frame rate; determine a first exposure time and a second exposure time according to the frame rate, where the first exposure time is the exposure time of the main camera and the auxiliary camera; determine, according to the second exposure time, the interval by which the auxiliary camera's startup is delayed relative to the main camera; start the main camera, delay the interval, start the auxiliary camera, and store the image data obtained by the main camera and the auxiliary camera in imaging order.
  • The main camera and the auxiliary camera increase imaging brightness through the longer first exposure time, while the second exposure time serves as the interval between starting the main camera and the auxiliary camera, so that high-speed imaging is still achieved; this improves the image quality of high-speed imaging.
  • FIG. 1 is a schematic flowchart showing an implementation process of a processing method in imaging according to an embodiment of the present disclosure
  • FIG. 2 is a schematic diagram of a processing device in imaging according to an embodiment of the present disclosure
  • FIG. 3 is a schematic diagram of an implementation process for improving image quality of high speed imaging according to an embodiment of the present disclosure
  • FIG. 4 is a timing diagram of starting shooting of a main camera and a secondary camera according to an embodiment of the present disclosure
  • FIG. 5 is a schematic diagram of the use of a buffer according to an embodiment of the present disclosure.
  • FIG. 6 is a schematic diagram of determining a coincidence data width of a main camera and a coincidence data width of a secondary camera according to an embodiment of the present disclosure
  • FIG. 7 is another schematic diagram of determining a coincidence data width of a main camera and a coincidence data width of a secondary camera according to an embodiment of the present disclosure.
  • an embodiment of the present disclosure provides an implementation process of a processing method in imaging, which may include the following steps:
  • Step 101 Determine a frame rate.
  • the mobile terminal receives a frame rate selection operation of the user, and determines a frame rate used by the current camera according to a frame rate selection operation input by the user.
  • the user can select the frame rate according to the brightness of the current camera environment.
  • the frame rate that can be selected by the user may be multiple, for example, 240FPS, 120FPS, and 60FPS.
  • the mobile terminal may be a mobile phone, a personal digital assistant (PDA), a tablet (PAD), or another device with a camera function.
  • Step 102 Determine a first exposure time and a second exposure time according to the frame rate, where the first exposure time is an exposure time of a primary camera and a secondary camera.
  • the determining the first exposure time according to the frame rate may include:
  • the mobile terminal determines twice the reciprocal of the frame rate as the first exposure time.
  • As the first exposure time is the exposure time of the main camera and the auxiliary camera, the longer the exposure time, the brighter the image obtained by the main camera or the auxiliary camera.
  • the determining the second exposure time according to the frame rate comprises:
  • the mobile terminal determines the reciprocal of the frame rate as the second exposure time.
  • the second exposure time can be used to ensure a high frame rate of the main camera and the auxiliary camera image.
  • Step 103 Determine, according to the second exposure time, the interval by which the auxiliary camera's startup is delayed relative to the main camera.
  • the determining, according to the second exposure time, of the interval by which the auxiliary camera's startup is delayed relative to the main camera may include:
  • the mobile terminal determines a delay time of the main camera, and determines a difference between the second exposure time and the delay time as an interval time.
  • the delay time may be the time interval from when the main camera is started to when the main camera begins to capture.
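The exposure and interval relationships in steps 102-103 can be sketched numerically. The function and variable names below are illustrative, not from the patent:

```python
def exposure_and_interval(frame_rate_fps, delay_time_s):
    """Per steps 102-103: the first exposure time is twice the
    reciprocal of the frame rate, the second exposure time is the
    reciprocal of the frame rate, and the interval by which the
    auxiliary camera's start is delayed equals the second exposure
    time minus the main camera's startup delay."""
    first_exposure = 2.0 / frame_rate_fps   # exposure time of both cameras
    second_exposure = 1.0 / frame_rate_fps  # guarantees the target frame rate
    interval = second_exposure - delay_time_s
    return first_exposure, second_exposure, interval

# At 240 FPS with an assumed 2 ms startup delay:
fe, se, iv = exposure_and_interval(240, 0.002)
```

At 240 FPS this gives a first exposure of 1/120 s and a second exposure of 1/240 s; the interval is 1/240 s minus the startup delay.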
  • Step 104 Start the main camera imaging, delay the interval time, start the auxiliary camera imaging, and store the imaging data obtained by the main camera and the auxiliary camera respectively according to the imaging sequence.
  • the storing the image data obtained by the main camera and the auxiliary camera in the imaging sequence may include:
  • the mobile terminal stores the ith frame of image data obtained by the main camera and marks it with the first mark, stores the ith frame of image data obtained by the auxiliary camera and marks it with the second mark; it then stores the (i+1)th frame of image data obtained by the main camera, marking it with the first mark, and the (i+1)th frame of image data obtained by the auxiliary camera, marking it with the second mark, and so on, where i = 1, 2, ..., N, and N is a positive integer.
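The alternating store-and-mark scheme of step 104 can be sketched as follows. The list-based buffer and string frames are assumptions for illustration; marks 0 and 1 are the example marks used later in the text:

```python
MAIN_MARK, AUX_MARK = 0, 1  # first and second marks from the text

def store_interleaved(main_frames, aux_frames):
    """Store frame i from the main camera (mark 0), then frame i
    from the auxiliary camera (mark 1), for i = 1..N."""
    buffer = []
    for main_frame, aux_frame in zip(main_frames, aux_frames):
        buffer.append((MAIN_MARK, main_frame))
        buffer.append((AUX_MARK, aux_frame))
    return buffer

stored = store_interleaved(["m1", "m2"], ["a1", "a2"])
# stored == [(0, "m1"), (1, "a1"), (0, "m2"), (1, "a2")]
```

The marks let a later reader recover which camera produced each frame even though the two streams share one buffer.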
  • the method may further include: cutting the stored imaging data to obtain overlapping imaging data of the primary camera and the secondary camera.
  • the cropping the stored image data to obtain the coincidence image data of the main camera and the auxiliary camera may include:
  • the mobile terminal crops the stored ith frame of image data obtained by the main camera according to the first cropping frame, removing the image data that does not coincide with the ith frame of image data from the auxiliary camera; it crops the stored ith frame of image data obtained by the auxiliary camera according to the second cropping frame, removing the image data that does not coincide with the ith frame of image data from the main camera;
  • the image data retained from the main camera and the image data retained from the auxiliary camera together constitute the coincident image data, where i = 1, 2, ..., N, and N is a positive integer.
  • the determining process of the first cropping frame and the second cropping frame may include:
  • the mobile terminal determines the first cropping frame according to the screen size and the coincidence data width of the camera captured by the main camera;
  • the second cropping frame is determined according to the screen size and the coincidence data width of the sub camera.
  • the determining process of the coincidence data width of the main camera may include:
  • the mobile terminal determines the sum of the distance from the imaging position of the subject on the main camera to the right edge of the main camera's imaging screen and the distance from the imaging position of the subject on the auxiliary camera to the left edge of the auxiliary camera's imaging screen as the coincidence data width of the main camera's image;
  • the determining process of the coincidence data width of the auxiliary camera may include:
  • the mobile terminal determines the sum of the distance from the imaging position of the subject on the auxiliary camera to the left edge of the auxiliary camera's imaging screen and the distance from the imaging position of the subject on the main camera to the right edge of the main camera's imaging screen as the coincidence data width of the auxiliary camera's image.
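Since both widths are defined as the same sum of two edge distances, the computation reduces to one addition. The function below is a trivial illustrative sketch with assumed names:

```python
def coincidence_width(dist_main_to_right_edge, dist_aux_to_left_edge):
    """Width of the overlapping region: the distance from the
    subject's imaging position on the main camera to the right edge
    of its screen, plus the distance from the subject's imaging
    position on the auxiliary camera to the left edge of its
    screen. The text defines the same sum for both cameras."""
    return dist_main_to_right_edge + dist_aux_to_left_edge

# e.g. 3.0 units of main-camera margin plus 2.5 of auxiliary margin
assert coincidence_width(3.0, 2.5) == 5.5
```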
  • the method further includes: encoding the coincidence imaging data; and displaying the encoded coincidence imaging data.
  • the encoding the coincidence imaging data may include:
  • the coincident image data of the main camera and the auxiliary camera are compressed separately based on a high-compression-ratio parameter of the video encoding; the techniques used for the video encoding may include, but are not limited to, H264.
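As one concrete possibility, not specified by the text, H.264 encoding with a high-compression-ratio setting could be driven through ffmpeg. The command is only assembled here, not executed, and the file names are placeholders:

```python
def h264_encode_command(raw_path, out_path, fps, crf=28):
    """Assemble an ffmpeg command line for H.264 encoding.

    A higher CRF value means a higher compression ratio (lower
    bitrate); 28 is an illustrative 'high compression' choice.
    """
    return [
        "ffmpeg",
        "-framerate", str(fps),
        "-i", raw_path,        # sequence of coincident frames
        "-c:v", "libx264",     # H.264 encoder
        "-crf", str(crf),      # constant rate factor: compression knob
        out_path,
    ]

cmd = h264_encode_command("coincident_%04d.png", "slowmo.mp4", 240)
```

Running the assembled command would compress the coincident frame sequence into an H.264 stream ready for display.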
  • the embodiment of the present disclosure further provides a processing device in the imaging.
  • the device may include: a first determining module 21, a second determining module 22, a third determining module 23, and a startup module 24; wherein
  • the first determining module 21 is configured to: determine a frame rate
  • the second determining module 22 is configured to: determine, according to the frame rate, a first exposure time and a second exposure time, where the first exposure time is an exposure time of the main camera and the auxiliary camera;
  • the third determining module 23 is configured to: determine, according to the second exposure time, the interval by which the auxiliary camera's startup is delayed relative to the main camera;
  • the startup module 24 is configured to: start the main camera imaging, delay the interval time, start the auxiliary camera imaging, and store the imaging data obtained by the main camera and the secondary camera respectively according to the imaging sequence.
  • the first determining module 21 is configured to: receive a frame rate selection operation of the user, and determine a frame rate used by the current camera according to a frame rate selection operation input by the user.
  • the user can select the frame rate according to the brightness of the current camera environment.
  • the frame rate that can be selected by the user may be multiple, for example, 240FPS, 120FPS, and 60FPS.
  • the second determining module 22 is configured to determine twice the reciprocal of the frame rate as the first exposure time.
  • the second determining module 22 is configured to determine the reciprocal of the frame rate as the second exposure time.
  • the third determining module 23 is further configured to: determine a delay time of the main camera; optionally, the third determining module 23 is configured to: determine the difference between the second exposure time and the delay time as the interval time.
  • the device may further include a cropping module configured to: trim the stored image data to obtain coincidence image data of the main camera and the auxiliary camera.
  • the cropping module may be configured to: crop the stored ith frame of image data obtained by the main camera according to the first crop frame, removing the image data that does not coincide with the ith frame of image data from the auxiliary camera; and crop the stored ith frame of image data obtained by the auxiliary camera according to the second crop frame, removing the image data that does not coincide with the ith frame of image data from the main camera;
  • the image data retained from the main camera and the image data retained from the auxiliary camera together constitute the coincident image data, where i = 1, 2, ..., N, and N is a positive integer.
  • the cropping module may be further configured to: determine the first cropping frame according to the screen size and the coincidence data width of the image captured by the main camera;
  • the second cropping frame is determined according to the screen size and the coincidence data width of the sub camera.
  • the cropping module may be further configured to: determine the sum of the distance from the imaging position of the subject on the main camera to the right edge of the main camera's imaging screen and the distance from the imaging position of the subject on the auxiliary camera to the left edge of the auxiliary camera's imaging screen as the coincidence data width of the main camera's image; or determine that same sum as the coincidence data width of the auxiliary camera's image.
  • the device may further include a display module configured to: encode the coincidence imaging data; and display the encoded coincidence imaging data.
  • the display module may be configured to: compress the coincidence imaging data of the main camera and the secondary camera respectively according to a high compression ratio parameter of the video coding technology; wherein the video coding technology may include, but is not limited to, H264.
  • the first determining module 21, the second determining module 22, the third determining module 23, the startup module 24, the cropping module, and the display module may all be implemented by a central processing unit (CPU), a microprocessor (MPU), a digital signal processor (DSP), or a field-programmable gate array (FPGA) located in the mobile terminal.
  • FIG. 3 is a schematic diagram of an implementation process for improving the image quality of a high-speed camera according to an embodiment of the present disclosure. As shown in FIG. 3, the following steps may be included:
  • Step 301 The mobile terminal enters high-speed imaging, and determines whether the frame rate selection operation of the user is received. If the frame rate selection operation of the user is received, step 302 is performed; otherwise, the shooting is directly performed.
  • the user can select the frame rate according to the brightness of the current camera environment.
  • the frame rate that can be selected by the user may be multiple, for example, 240FPS, 120FPS, and 60FPS.
  • For example, the user can first select a frame rate of 240 FPS for shooting; if the brightness of the picture displayed in step 307 is still insufficient, the user can reselect 120 FPS; if the brightness obtained at 120 FPS is still insufficient, the user can then select 60 FPS for shooting.
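This 240 → 120 → 60 FPS fallback can be mirrored by a hypothetical selection loop; `is_bright_enough` stands in for the user's judgement of the displayed picture and is not part of the patent:

```python
SELECTABLE_FPS = [240, 120, 60]  # the example rates from the text

def choose_frame_rate(is_bright_enough):
    """Try the rates from highest to lowest, stopping at the first
    whose resulting picture is judged bright enough."""
    for fps in SELECTABLE_FPS:
        if is_bright_enough(fps):
            return fps
    return SELECTABLE_FPS[-1]  # lowest rate gives the longest exposure

# Brightness grows with exposure time (1/fps or 2/fps), so lower
# frame rates are tried when the picture is too dark.
chosen = choose_frame_rate(lambda fps: fps <= 120)
```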
  • Steps 302-303 The mobile terminal determines the frame rate used by the current camera according to the received frame rate selection operation input by the user.
  • the mobile terminal may determine twice the reciprocal of the frame rate as the first exposure time and the reciprocal of the frame rate as the second exposure time, where the first exposure time may be the exposure time of the primary camera and the secondary camera. Thereafter, step 304 is performed.
  • Step 304 The mobile terminal determines a delay time of the main camera, and determines a difference between the second exposure time and the delay time as an interval time. Thereafter, step 305 is performed.
  • Step 305 The mobile terminal (for example, its controller) starts the main camera, delays the interval time, starts the auxiliary camera, and stores the image data obtained by the main camera and the auxiliary camera respectively in imaging order.
  • the mobile terminal can store one frame of image data obtained by the main camera and mark it with the first mark, then store one frame of image data obtained by the auxiliary camera and mark it with the second mark; it then continues with the next frame obtained by the main camera, marked with the first mark, and the next frame obtained by the auxiliary camera, marked with the second mark, and so on, alternately storing one frame from each camera. Thereafter, step 306 is performed.
  • Step 306 The mobile terminal crops the stored ith frame of image data obtained by the main camera according to the first cropping frame, removing the image data that does not coincide with the ith frame of image data from the auxiliary camera; it crops the stored ith frame of image data obtained by the auxiliary camera according to the second cropping frame, removing the image data that does not coincide with the ith frame of image data from the main camera. The image data retained from the main camera and the image data retained from the auxiliary camera together constitute the coincident image data.
  • the determining process of the first cropping frame and the second cropping frame may include: determining, by the mobile terminal, the first cropping frame according to the screen size and the coincident data width captured by the main camera; and the screen size and the coincidence data according to the auxiliary camera Width, determine the second crop box.
  • the determining process of the coincidence data width of the main camera may include: the mobile terminal determines the sum of the distance from the imaging position of the subject on the main camera to the right edge of the main camera's imaging screen and the distance from the imaging position of the subject on the auxiliary camera to the left edge of the auxiliary camera's imaging screen as the coincidence data width of the main camera's image;
  • the determining process of the coincidence data width of the auxiliary camera may include: the mobile terminal determines that same sum as the coincidence data width of the auxiliary camera's image.
  • Step 307 The mobile terminal compresses the coincidence imaging data of the main camera and the secondary camera respectively according to the high compression ratio parameter of the video encoding, and displays the compressed coincidence imaging data.
  • FIG. 4 is a timing diagram of the main camera and the auxiliary camera starting the imaging according to the embodiment of the present disclosure. As shown in FIG. 4, the following steps may be included:
  • Step 1 The controller of the mobile terminal starts the main camera.
  • Step 2 The controller of the mobile terminal receives the first frame of image data obtained by the main camera and previews the image formed from it; there is a certain delay time from starting the main camera to obtaining this first preview image data.
  • Step 3 The controller of the mobile terminal delays by the interval time and then starts the secondary camera; the interval time is the difference between the second exposure time and the delay time, and the second exposure time is the reciprocal of the frame rate.
  • Step 4 The controller of the mobile terminal receives the first frame of image data obtained by the secondary camera and previews the image formed from it; there is a certain delay time from starting the secondary camera to obtaining this first preview image data.
  • Step 5 The controller of the mobile terminal receives the second frame of image data obtained by the main camera.
  • Step 6 The controller of the mobile terminal stores the first frame of image data obtained by the main camera to a buffer and marks it with a first mark, such as 0; it stores the first frame of image data obtained by the auxiliary camera to the buffer and marks it with a second mark, such as 1.
  • Step 7 The controller of the mobile terminal receives the second frame of image data obtained by the secondary camera.
  • Step 8 The controller of the mobile terminal stores the second frame of image data obtained by the main camera to the buffer, marking it with the first mark, such as 0; it stores the second frame of image data obtained by the auxiliary camera to the buffer, marking it with the second mark, such as 1. Each subsequent frame of image data obtained by the main camera and the secondary camera is stored in the buffer in the same way.
  • Step 6 need not occur in a fixed order relative to step 5 or step 7, as long as it is performed after step 3 and step 4.
  • FIG. 5 is a schematic diagram of the use of the buffer according to the embodiment of the present disclosure.
  • the buffer is configured to support the writing and reading of image data for the main camera and the auxiliary camera;
  • the writing may proceed as follows:
  • the pointer Pin starts to point to the head of the buffer. Whenever one frame of camera data is stored in the buffer, the pointer Pin is incremented by one.
  • buffer slots with even indices, such as 0 and 2, store the image data obtained by the main camera, marked with the first mark, such as the number 0.
  • buffer slots with odd indices, such as 1 and 3, store the image data obtained by the auxiliary camera, marked with the second mark, such as the number 1.
  • the reading may proceed as follows:
  • the pointer Pout initially points to the head of the buffer, and is incremented by one for each frame of image data read out of the buffer.
  • Pin identifies the location at which image data is written, and Pout identifies the location from which image data is read out; since writing and reading proceed concurrently, a synchronization mechanism must be used.
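A sketch of the Fig. 5 buffer with the Pin/Pout pointers, even/odd slot marking, and the required synchronization (here a simple lock; the patent does not specify the mechanism):

```python
import threading

class FrameBuffer:
    """Even write positions hold main-camera frames (mark 0), odd
    positions hold auxiliary-camera frames (mark 1). Pin marks where
    the next frame is written, Pout where the next frame is read."""

    def __init__(self, capacity):
        self.slots = [None] * capacity
        self.pin = 0   # write position, incremented per stored frame
        self.pout = 0  # read position, incremented per consumed frame
        self.lock = threading.Lock()  # synchronizes writer and reader

    def write(self, frame):
        with self.lock:
            mark = 0 if self.pin % 2 == 0 else 1  # even: main, odd: auxiliary
            self.slots[self.pin % len(self.slots)] = (mark, frame)
            self.pin += 1

    def read(self):
        with self.lock:
            if self.pout == self.pin:
                return None  # buffer empty
            item = self.slots[self.pout % len(self.slots)]
            self.pout += 1
            return item

buf = FrameBuffer(8)
buf.write("main_f1")
buf.write("aux_f1")
# buf.read() -> (0, "main_f1"), then (1, "aux_f1")
```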
  • FIG. 6 is a schematic diagram of determining a coincidence data width of a main camera and a coincidence data width of a secondary camera according to an embodiment of the present disclosure. As shown in FIG. 6, the determining manner may include:
  • FIG. 7 is another schematic diagram of determining a coincidence data width of a main camera and a coincidence data width of a secondary camera according to an embodiment of the present disclosure. As shown in FIG. 7, the determining manner may include:
  • where w denotes the width of the camera image.
  • the processing may be performed according to the first cropping frame and the second cropping frame.
  • the determining process of the first cropping frame may include:
  • a coordinate system is established on the screen captured by the main camera, with the coordinate origin at the lower-left corner of the screen. The width of the first crop frame is w−d and its height is h, so the first crop frame is represented by the coordinates (d, 0), (w, 0), (d, h), (w, h).
  • the determining process of the second cropping frame may include:
  • a coordinate system is established on the screen captured by the auxiliary camera, with the coordinate origin at the lower-left corner of the screen. The width of the second crop frame is w−d and its height is h, so the second crop frame is represented by the coordinates (0, 0), (w−d, 0), (0, h), (w−d, h).
  • if the image data is scaled by a factor ratio, the first crop frame and the second crop frame may be adjusted accordingly:
  • the coordinates of the first cropping frame become: (d×ratio, 0), (w×ratio, 0), (d×ratio, h×ratio), (w×ratio, h×ratio); the coordinates of the second cropping frame become: (0, 0), (w×ratio − d×ratio, 0), (0, h×ratio), (w×ratio − d×ratio, h×ratio).
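The crop-frame coordinates above can be computed as follows; the concrete screen dimensions in the example are assumptions for illustration:

```python
def crop_frames(w, h, d, ratio=1.0):
    """Corner coordinates of the two crop frames, per the formulas
    above (origin at the lower-left corner of each camera's screen).

    w: width of the camera image, h: its height, d: width of the
    non-coincident strip; `ratio` applies the optional scaling.
    """
    first = [(d, 0), (w, 0), (d, h), (w, h)]          # main camera
    second = [(0, 0), (w - d, 0), (0, h), (w - d, h)]  # auxiliary camera
    scale = lambda pts: [(x * ratio, y * ratio) for x, y in pts]
    return scale(first), scale(second)

f, s = crop_frames(w=1920, h=1080, d=120)
# f == [(120, 0), (1920, 0), (120, 1080), (1920, 1080)]
# s == [(0, 0), (1800, 0), (0, 1080), (1800, 1080)]
```

Both frames have width w−d, so the two cropped streams cover the same overlapping region of the scene.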
  • Embodiments of the present disclosure also provide a computer readable storage medium storing computer executable instructions that, when executed, implement a processing method in the above imaging.
  • Computer storage media include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storing information, such as computer-readable instructions, data structures, program modules, or other data.
  • Computer storage media include, but are not limited to, random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical disc storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and that can be accessed by a computer.
• communication media typically embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and can include any information delivery media.
• the processing method and apparatus in imaging determine a frame rate; determine a first exposure time and a second exposure time according to the frame rate, the first exposure time being the exposure time of the main camera and the auxiliary camera; determine, according to the second exposure time, the interval by which the auxiliary camera delays its start relative to the main camera; and start the main camera, delay for the interval, start the auxiliary camera, and store the imaging data obtained by the main camera and the auxiliary camera respectively in imaging order.
• the main camera and the auxiliary camera increase the imaging brightness through the first exposure time, and the second exposure time serves as the interval between the starts of the main camera and the auxiliary camera, so that high-speed imaging can be realized and the imaging quality of high-speed imaging can be improved.
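The crop-frame coordinate arithmetic described in the list above can be sketched as follows. This is an illustrative reading of the stated coordinates only, not a reference implementation from the disclosure; the names `w`, `h`, `d`, and `ratio` follow the text, and the sample picture size is hypothetical.

```python
def crop_frames(w, h, d, ratio=1.0):
    """Return (first_frame, second_frame) as corner lists, scaled by `ratio`.

    w, h: width and height of each camera's picture; d: horizontal offset of
    the overlapping region in the main camera's picture. The coordinate
    origin is the lower left corner of the picture, as in the text.
    """
    first = [(d * ratio, 0), (w * ratio, 0),
             (d * ratio, h * ratio), (w * ratio, h * ratio)]
    second = [(0, 0), ((w - d) * ratio, 0),
              (0, h * ratio), ((w - d) * ratio, h * ratio)]
    return first, second

# Hypothetical 1920x1080 pictures with a 120-pixel offset: both crop frames
# end up with the same width, w - d = 1800, as the coordinates require.
first, second = crop_frames(w=1920, h=1080, d=120)
```

Note that both crop frames have identical dimensions (w-d by h), which is what allows the two cropped frame sequences to be interleaved into one video.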

Abstract

A processing method in video recording, comprising: determining a frame rate; determining a first exposure time and a second exposure time according to the frame rate, the first exposure time being the exposure time of a main camera and an auxiliary camera; determining, according to the second exposure time, an interval by which the auxiliary camera delays its start relative to the main camera; and starting video recording by the main camera, delaying for the interval, starting video recording by the auxiliary camera, and storing the video recording data obtained by the main camera and the auxiliary camera respectively in recording order.

Description

Processing method and device in imaging
Technical Field
The present disclosure relates to, but is not limited to, the field of imaging technology, and in particular, to a processing method and apparatus in imaging.
Background Art
With the continuous advancement and rapid development of imaging technology in the field of mobile terminals, the imaging hardware and software of mobile terminals have become increasingly complete and powerful, giving rise to a variety of entertaining imaging applications. High-speed imaging is one of them; in movie scenes it is also called "bullet time." High-speed imaging uses a high-frame-rate camera to shoot fast-moving objects. Because the frame rate is high, a large amount of motion detail can be captured; when played back at the usual 24 FPS (frames per second), this visually creates a slow-motion effect.
Summary of the Invention
The following is an overview of the subject matter described in detail herein. This overview is not intended to limit the scope of the claims.
Mobile terminals can already reach 240 frames per second. Shooting everyday motion scenes at this frame rate is entirely sufficient and achieves a fairly good slow-motion effect during playback. Moreover, by using a fixed imaging sensor and a fixed exposure time parameter during high-speed imaging, a mobile terminal can ensure a high frame rate in different environments.
However, it is still difficult to achieve true bullet-time-level high-speed imaging on a mobile terminal. Since the actual frame rate = 1 / (exposure time + data read time), reaching a frame rate such as 240 FPS leaves about 4.2 milliseconds for the sum of the exposure time and the data read time. Because this sum also includes the data read time, which must be kept at a stable value, the exposure time left to the imaging sensor in the mobile terminal when generating each frame of data is lower than the total time; a higher frame rate therefore requires a lower exposure time.
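To make the frame-rate arithmetic above concrete, it can be evaluated directly. This is an illustrative sketch only; the 2 ms read time used in the example is a hypothetical value, not one given in the disclosure.

```python
def max_exposure_ms(target_fps, read_time_ms):
    """Per-frame exposure budget, from: actual frame rate = 1 / (exposure + read time)."""
    frame_budget_ms = 1000.0 / target_fps  # total time available per frame, in ms
    return frame_budget_ms - read_time_ms

# At 240 FPS the per-frame budget is 1000/240 ~= 4.17 ms; with a hypothetical
# 2 ms read time, only about 2.17 ms remain for exposure.
budget = max_exposure_ms(240, 2.0)
```

This is why a single fixed-exposure camera cannot both raise brightness and keep the target frame rate, which motivates the two-camera scheme below.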
In addition, in scenes where the light is dim, since the exposure time cannot be adjusted, the brightness of the image data acquired by the imaging sensor on the mobile terminal is low, which directly results in insufficient brightness of the generated image data, affecting image quality and seriously degrading the user experience.
This document provides a processing method and apparatus in imaging that can improve the imaging quality of high-speed imaging.
An embodiment of the present disclosure provides a processing method in imaging, the method including:
determining a frame rate;
determining a first exposure time and a second exposure time according to the frame rate, the first exposure time being the exposure time of a main camera and an auxiliary camera;
determining, according to the second exposure time, an interval by which the auxiliary camera delays its start relative to the main camera;
starting imaging by the main camera, delaying for the interval, starting imaging by the auxiliary camera, and storing the imaging data obtained by the main camera and the auxiliary camera respectively in imaging order.
In an exemplary embodiment, the determining a frame rate includes:
receiving a frame rate selection operation, and determining the frame rate used for the current imaging according to the frame rate selection operation.
In an exemplary embodiment, the determining a first exposure time and a second exposure time according to the frame rate includes:
determining twice the reciprocal of the frame rate as the first exposure time;
determining the reciprocal of the frame rate as the second exposure time.
In an exemplary embodiment, the determining, according to the second exposure time, an interval by which the auxiliary camera delays its start relative to the main camera includes:
determining a delay time of the main camera, and determining the difference between the second exposure time and the delay time as the interval.
In an exemplary embodiment, the storing the imaging data obtained by the main camera and the auxiliary camera respectively in imaging order includes:
storing the i-th frame of imaging data obtained by the main camera and marking it with a first marker, then storing the i-th frame of imaging data obtained by the auxiliary camera and marking it with a second marker; then continuing to store the (i+1)-th frame of imaging data obtained by the main camera and marking it with the first marker, then storing the (i+1)-th frame of imaging data obtained by the auxiliary camera and marking it with the second marker, and so on;
where i = 1, 2, ... N, and N is a positive integer.
In an exemplary embodiment, after the storing the imaging data obtained by the main camera and the auxiliary camera respectively in imaging order, the method further includes:
cropping the stored imaging data to obtain overlapping imaging data of the main camera and the auxiliary camera.
In an exemplary embodiment, the cropping the stored imaging data to obtain overlapping imaging data of the main camera and the auxiliary camera includes:
cropping, according to a first crop frame, the stored i-th frame of imaging data obtained by the main camera, cropping out the imaging data that does not overlap the i-th frame of imaging data of the auxiliary camera;
and cropping, according to a second crop frame, the stored i-th frame of imaging data obtained by the auxiliary camera, cropping out the imaging data that does not overlap the i-th frame of imaging data of the main camera;
the imaging data obtained by the main camera and the imaging data obtained by the auxiliary camera that are not cropped out constitute the overlapping imaging data;
where i = 1, 2, ... N, and N is a positive integer.
In an exemplary embodiment, the determining of the first crop frame and the second crop frame includes:
determining the first crop frame according to the picture size of the main camera's imaging and an overlap data width;
determining the second crop frame according to the picture size of the auxiliary camera's imaging and an overlap data width.
In an exemplary embodiment, the determining of the overlap data width of the main camera includes:
determining, as the overlap data width of the main camera's imaging, the sum of the distance from the imaging position of the subject on the main camera to the right edge of the main camera's imaging picture and the distance from the imaging position of the subject on the auxiliary camera to the left edge of the auxiliary camera's imaging picture;
or,
the determining of the overlap data width of the auxiliary camera includes:
determining, as the overlap data width of the auxiliary camera's imaging, the sum of the distance from the imaging position of the subject on the auxiliary camera to the left edge of the auxiliary camera's imaging picture and the distance from the imaging position of the subject on the main camera to the right edge of the main camera's imaging picture;
or,
the determining of the overlap data width of the main camera includes: determining, as the overlap data width of the main camera's imaging, the sum of the distance from the imaging position of the subject on the main camera to the right edge of the main camera's imaging picture and the distance from the imaging position of the subject on the auxiliary camera to the left edge of the auxiliary camera's imaging picture; and the determining of the overlap data width of the auxiliary camera includes: determining, as the overlap data width of the auxiliary camera's imaging, the sum of the distance from the imaging position of the subject on the auxiliary camera to the left edge of the auxiliary camera's imaging picture and the distance from the imaging position of the subject on the main camera to the right edge of the main camera's imaging picture.
In an exemplary embodiment, after the obtaining the overlapping imaging data of the main camera and the auxiliary camera, the method further includes:
encoding and displaying the overlapping imaging data.
An embodiment of the present disclosure further provides a processing apparatus in imaging, the apparatus including:
a first determining module, configured to determine a frame rate;
a second determining module, configured to determine a first exposure time and a second exposure time according to the frame rate, the first exposure time being the exposure time of a main camera and an auxiliary camera;
a third determining module, configured to determine, according to the second exposure time, an interval by which the auxiliary camera delays its start relative to the main camera;
a startup module, configured to start imaging by the main camera, delay for the interval, start imaging by the auxiliary camera, and store the imaging data obtained by the main camera and the auxiliary camera respectively in imaging order.
In an exemplary embodiment, the apparatus further includes a cropping module;
the cropping module is configured to crop the stored imaging data to obtain the overlapping imaging data of the main camera and the auxiliary camera.
In an exemplary embodiment, the apparatus further includes a display module;
the display module is configured to encode and display the overlapping imaging data.
In an exemplary embodiment, the cropping module is further configured to determine the first crop frame according to the picture size of the main camera's imaging and the overlap data width, and to determine the second crop frame according to the picture size of the auxiliary camera's imaging and the overlap data width.
In an exemplary embodiment, the cropping module is further configured to:
determine, as the overlap data width of the main camera's imaging, the sum of the distance from the imaging position of the subject on the main camera to the right edge of the main camera's imaging picture and the distance from the imaging position of the subject on the auxiliary camera to the left edge of the auxiliary camera's imaging picture; or,
determine, as the overlap data width of the auxiliary camera's imaging, the sum of the distance from the imaging position of the subject on the auxiliary camera to the left edge of the auxiliary camera's imaging picture and the distance from the imaging position of the subject on the main camera to the right edge of the main camera's imaging picture; or,
determine, as the overlap data width of the main camera's imaging, the sum of the distance from the imaging position of the subject on the main camera to the right edge of the main camera's imaging picture and the distance from the imaging position of the subject on the auxiliary camera to the left edge of the auxiliary camera's imaging picture; and determine, as the overlap data width of the auxiliary camera's imaging, the sum of the distance from the imaging position of the subject on the auxiliary camera to the left edge of the auxiliary camera's imaging picture and the distance from the imaging position of the subject on the main camera to the right edge of the main camera's imaging picture.
An embodiment of the present disclosure further provides a computer readable storage medium storing computer executable instructions that, when executed, implement the processing method in imaging described above.
The processing method and apparatus in imaging provided by the embodiments of the present disclosure determine a frame rate; determine a first exposure time and a second exposure time according to the frame rate, the first exposure time being the exposure time of the main camera and the auxiliary camera; determine, according to the second exposure time, the interval by which the auxiliary camera delays its start relative to the main camera; and start the main camera, delay for the interval, start the auxiliary camera, and store the imaging data obtained by the main camera and the auxiliary camera respectively in imaging order. In this way, the main camera and the auxiliary camera increase the imaging brightness through the first exposure time, and the second exposure time serves as the interval between the starts of the main camera and the auxiliary camera, so that high-speed imaging can be realized and the imaging quality of high-speed imaging can be improved.
Other aspects will become apparent upon reading and understanding the drawings and the detailed description.
Brief Description of the Drawings
FIG. 1 is a schematic flowchart of an implementation of a processing method in imaging according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a processing apparatus in imaging according to an embodiment of the present disclosure;
FIG. 3 is a schematic flowchart of an implementation for improving the imaging quality of high-speed imaging according to an embodiment of the present disclosure;
FIG. 4 is a timing diagram of the main camera and the auxiliary camera starting imaging according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of the use of a buffer according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of determining the overlap data width of the main camera and the overlap data width of the auxiliary camera according to an embodiment of the present disclosure;
FIG. 7 is another schematic diagram of determining the overlap data width of the main camera and the overlap data width of the auxiliary camera according to an embodiment of the present disclosure.
Preferred Embodiments of the Present Disclosure
Embodiments of the present disclosure are described below with reference to the accompanying drawings.
In order to understand the features and technical content of the embodiments of the present disclosure in more detail, the implementation of the embodiments of the present disclosure is described in detail below with reference to the accompanying drawings.
As shown in FIG. 1, an embodiment of the present disclosure provides an implementation flow of a processing method in imaging, which may include the following steps:
Step 101: Determine a frame rate.
Optionally, the mobile terminal receives a frame rate selection operation from the user and determines the frame rate used for the current imaging according to the frame rate selection operation input by the user. The user can select the frame rate according to the brightness of the current imaging environment. There may be multiple frame rate values available for the user to select, for example 240 FPS, 120 FPS, and 60 FPS. The mobile terminal may be a terminal with an imaging function, such as a mobile phone, a PDA (Personal Digital Assistant), or a PAD (Portable Android Device).
Step 102: Determine a first exposure time and a second exposure time according to the frame rate, the first exposure time being the exposure time of the main camera and the auxiliary camera.
Here, the determining the first exposure time according to the frame rate may include:
the mobile terminal determining twice the reciprocal of the frame rate as the first exposure time.
The first exposure time serves as the exposure time of the main camera and the auxiliary camera; the longer the exposure time, the higher the brightness of the image obtained by the main camera or the auxiliary camera can be.
Here, the determining the second exposure time according to the frame rate includes:
the mobile terminal determining the reciprocal of the frame rate as the second exposure time.
The second exposure time can be used to ensure high-frame-rate imaging by the main camera and the auxiliary camera.
Step 103: Determine, according to the second exposure time, the interval by which the auxiliary camera delays its start relative to the main camera.
Here, the determining, according to the second exposure time, of the interval by which the auxiliary camera delays its start relative to the main camera may include:
the mobile terminal determining the delay time of the main camera, and determining the difference between the second exposure time and the delay time as the interval.
The delay time may be the time interval from the start of the main camera to the moment the main camera begins imaging.
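Steps 102 and 103 amount to simple arithmetic on the chosen frame rate. The sketch below is illustrative only (the disclosure provides no code), and the 1 ms delay time in the example is a hypothetical value.

```python
def exposure_and_interval(frame_rate_fps, delay_time_s):
    """Derive the two exposure times and the auxiliary camera's start interval.

    first exposure  = 2 / frame rate  (raises brightness; shared by both cameras)
    second exposure = 1 / frame rate  (preserves the target frame rate)
    interval        = second exposure - main camera delay time
    """
    first_exposure = 2.0 / frame_rate_fps
    second_exposure = 1.0 / frame_rate_fps
    interval = second_exposure - delay_time_s
    return first_exposure, second_exposure, interval

# Example: 240 FPS with a hypothetical 1 ms startup delay of the main camera.
first, second, interval = exposure_and_interval(240, 0.001)
```

Because each camera exposes for twice the frame period but the two cameras are offset by one frame period, their interleaved frames still arrive at the target rate while each individual frame receives double the exposure.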
Step 104: Start imaging by the main camera, delay for the interval, start imaging by the auxiliary camera, and store the imaging data obtained by the main camera and the auxiliary camera respectively in imaging order.
Here, the storing the imaging data obtained by the main camera and the auxiliary camera respectively in imaging order may include:
the mobile terminal storing the i-th frame of imaging data obtained by the main camera and marking it with a first marker, then storing the i-th frame of imaging data obtained by the auxiliary camera and marking it with a second marker; then continuing to store the (i+1)-th frame of imaging data obtained by the main camera and marking it with the first marker, then storing the (i+1)-th frame of imaging data obtained by the auxiliary camera and marking it with the second marker, and so on, where i = 1, 2, ... N, and N is a positive integer.
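The interleaved storage order described above (main frame i, auxiliary frame i, main frame i+1, auxiliary frame i+1, ...) can be sketched as follows. The `Frame` record and the marker strings are illustrative stand-ins for the first and second markers in the text, not names from the disclosure.

```python
from dataclasses import dataclass

MAIN_MARK, AUX_MARK = "main", "aux"  # stand-ins for the first and second markers

@dataclass
class Frame:
    index: int   # i = 1, 2, ... N
    mark: str    # which camera produced the frame
    data: bytes  # the raw frame payload

def store_in_imaging_order(main_frames, aux_frames):
    """Interleave the two streams in imaging order: main i, aux i, main i+1, ..."""
    buffer = []
    for i, (m, a) in enumerate(zip(main_frames, aux_frames), start=1):
        buffer.append(Frame(i, MAIN_MARK, m))
        buffer.append(Frame(i, AUX_MARK, a))
    return buffer
```

Marking each frame with its source camera is what later allows the cropping step to apply the correct crop frame (first or second) to each stored frame.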
Here, after the storing the imaging data obtained by the main camera and the auxiliary camera respectively in imaging order, the method may further include: cropping the stored imaging data to obtain the overlapping imaging data of the main camera and the auxiliary camera.
Here, the cropping the stored imaging data to obtain the overlapping imaging data of the main camera and the auxiliary camera may include:
the mobile terminal cropping, according to the first crop frame, the stored i-th frame of imaging data obtained by the main camera, cropping out the imaging data that does not overlap the i-th frame of imaging data of the auxiliary camera;
and cropping, according to the second crop frame, the stored i-th frame of imaging data obtained by the auxiliary camera, cropping out the imaging data that does not overlap the i-th frame of imaging data of the main camera;
the imaging data obtained by the main camera and the imaging data obtained by the auxiliary camera that are not cropped out constitute the overlapping imaging data;
where i = 1, 2, ... N, and N is a positive integer.
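A minimal sketch of this cropping step, under the assumption that frames are stored as row-major lists of pixel values and that the overlap lies on the right side of the main camera's picture and the left side of the auxiliary camera's picture (consistent with the crop-frame coordinates given in the disclosure):

```python
def crop_to_overlap(main_frame, aux_frame, overlap_width):
    """Keep only the overlapping columns of a main/auxiliary frame pair.

    The main camera keeps its rightmost `overlap_width` columns; the
    auxiliary camera keeps its leftmost `overlap_width` columns. Everything
    else is cropped out, leaving two frames of identical size.
    """
    main_cropped = [row[len(row) - overlap_width:] for row in main_frame]
    aux_cropped = [row[:overlap_width] for row in aux_frame]
    return main_cropped, aux_cropped
```

After this step, the kept portions of the two streams show the same scene region, so the interleaved frames can be played back as one continuous high-speed video.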
The determining process of the first crop frame and the second crop frame may include:
the mobile terminal determining the first crop frame according to the picture size of the main camera's imaging and the overlap data width;
and determining the second crop frame according to the picture size of the auxiliary camera's imaging and the overlap data width.
Here, the determining process of the overlap data width of the main camera may include:
the mobile terminal determining, as the overlap data width of the main camera's imaging, the sum of the distance from the imaging position of the subject on the main camera to the right edge of the main camera's imaging picture and the distance from the imaging position of the subject on the auxiliary camera to the left edge of the auxiliary camera's imaging picture;
or,
the determining process of the overlap data width of the auxiliary camera may include:
the mobile terminal determining, as the overlap data width of the auxiliary camera's imaging, the sum of the distance from the imaging position of the subject on the auxiliary camera to the left edge of the auxiliary camera's imaging picture and the distance from the imaging position of the subject on the main camera to the right edge of the main camera's imaging picture;
or,
the determining process of the overlap data width of the main camera may include: the mobile terminal determining, as the overlap data width of the main camera's imaging, the sum of the distance from the imaging position of the subject on the main camera to the right edge of the main camera's imaging picture and the distance from the imaging position of the subject on the auxiliary camera to the left edge of the auxiliary camera's imaging picture; and the determining process of the overlap data width of the auxiliary camera may include: the mobile terminal determining, as the overlap data width of the auxiliary camera's imaging, the sum of the distance from the imaging position of the subject on the auxiliary camera to the left edge of the auxiliary camera's imaging picture and the distance from the imaging position of the subject on the main camera to the right edge of the main camera's imaging picture.
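The overlap data width computation described above can be sketched as below. The coordinate convention (x measured in pixels from the left edge of each picture, equal picture widths for both cameras) is an assumption made for illustration; the disclosure defines the width only in terms of the two distances.

```python
def overlap_width(main_pos_x, aux_pos_x, picture_width):
    """Overlap data width from the subject's imaging positions.

    main_pos_x / aux_pos_x: the subject's horizontal position in the main /
    auxiliary picture, measured from the left edge. The width is the distance
    from the subject to the right edge of the main picture plus the distance
    from the subject to the left edge of the auxiliary picture.
    """
    dist_to_main_right = picture_width - main_pos_x
    dist_to_aux_left = aux_pos_x
    return dist_to_main_right + dist_to_aux_left

# Hypothetical example: 1920-pixel-wide pictures, subject at x=1500 in the
# main picture and x=300 in the auxiliary picture -> width 420 + 300 = 720.
width = overlap_width(main_pos_x=1500, aux_pos_x=300, picture_width=1920)
```

Because the same two distances appear in all three alternatives above, the main and auxiliary overlap widths computed this way are equal, which keeps the two crop frames the same size.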
After the obtaining the overlapping imaging data of the main camera and the auxiliary camera, the method may further include: encoding the overlapping imaging data; and displaying the encoded overlapping imaging data.
Here, the encoding the overlapping imaging data may include:
compressing the overlapping imaging data of the main camera and of the auxiliary camera respectively based on a high-compression-ratio parameter of video encoding; the technique used for the video encoding may include, but is not limited to, H.264.
To implement the above processing method in imaging, an embodiment of the present disclosure further provides a processing apparatus in imaging. As shown in FIG. 2, the apparatus may include a first determining module 21, a second determining module 22, a third determining module 23, and a startup module 24, where:
the first determining module 21 is configured to determine a frame rate;
the second determining module 22 is configured to determine a first exposure time and a second exposure time according to the frame rate, the first exposure time being the exposure time of the main camera and the auxiliary camera;
the third determining module 23 is configured to determine, according to the second exposure time, the interval by which the auxiliary camera delays its start relative to the main camera;
the startup module 24 is configured to start imaging by the main camera, delay for the interval, start imaging by the auxiliary camera, and store the imaging data obtained by the main camera and the auxiliary camera respectively in imaging order.
Optionally, the first determining module 21 is configured to receive a frame rate selection operation from the user and determine the frame rate used for the current imaging according to the frame rate selection operation input by the user. The user can select the frame rate according to the brightness of the current imaging environment. There may be multiple frame rate values available for the user to select, for example 240 FPS, 120 FPS, and 60 FPS.
Optionally, the second determining module 22 is configured to determine twice the reciprocal of the frame rate as the first exposure time.
Optionally, the second determining module 22 is configured to determine the reciprocal of the frame rate as the second exposure time.
Optionally, the third determining module 23 is further configured to determine the delay time of the main camera; optionally, the third determining module 23 is configured to determine the difference between the second exposure time and the delay time as the interval.
The startup module 24 may be configured to store the i-th frame of imaging data obtained by the main camera and mark it with a first marker, then store the i-th frame of imaging data obtained by the auxiliary camera and mark it with a second marker; then continue to store the (i+1)-th frame of imaging data obtained by the main camera and mark it with the first marker, then store the (i+1)-th frame of imaging data obtained by the auxiliary camera and mark it with the second marker, and so on, where i = 1, 2, ... N, and N is a positive integer.
The apparatus may further include a cropping module, configured to crop the stored imaging data to obtain the overlapping imaging data of the main camera and the auxiliary camera.
The cropping module may be configured to crop, according to the first crop frame, the stored i-th frame of imaging data obtained by the main camera, cropping out the imaging data that does not overlap the i-th frame of imaging data of the auxiliary camera;
and crop, according to the second crop frame, the stored i-th frame of imaging data obtained by the auxiliary camera, cropping out the imaging data that does not overlap the i-th frame of imaging data of the main camera;
the imaging data obtained by the main camera and the imaging data obtained by the auxiliary camera that are not cropped out constitute the overlapping imaging data;
where i = 1, 2, ... N, and N is a positive integer.
The cropping module may be further configured to: determine the first crop box according to the frame size and the overlap width of the main camera;

and determine the second crop box according to the frame size and the overlap width of the secondary camera.
The cropping module may be further configured to: determine, as the overlap width of the main camera, the sum of the distance from the imaging position of the subject on the main camera to the right edge of the main camera's frame and the distance from the imaging position of the subject on the secondary camera to the left edge of the secondary camera's frame; or,

determine, as the overlap width of the secondary camera, the sum of the distance from the imaging position of the subject on the secondary camera to the left edge of the secondary camera's frame and the distance from the imaging position of the subject on the main camera to the right edge of the main camera's frame; or,

determine both widths as above: the overlap width of the main camera as the sum of the distance from the subject's imaging position on the main camera to the right edge of the main camera's frame and the distance from the subject's imaging position on the secondary camera to the left edge of the secondary camera's frame, and the overlap width of the secondary camera as the sum of the distance from the subject's imaging position on the secondary camera to the left edge of the secondary camera's frame and the distance from the subject's imaging position on the main camera to the right edge of the main camera's frame.
The apparatus may further include a display module configured to encode the overlapping image data and display the encoded overlapping image data.
The display module may be configured to compress the overlapping image data of the main camera and of the secondary camera separately, based on a high-compression-ratio parameter of a video coding technique; the video coding technique may include, but is not limited to, H.264.
In practical applications, the first determining module 21, the second determining module 22, the third determining module 23, the startup module 24, the cropping module, and the display module may each be implemented by a central processing unit (CPU), a microprocessor (MPU), a digital signal processor (DSP), or a field-programmable gate array (FPGA) in the mobile terminal.
The implementation process and principle of the method of the embodiments of the present disclosure are described in detail below through an optional embodiment.
FIG. 3 is a schematic flowchart of improving the imaging quality of high-speed imaging according to an embodiment of the present disclosure. As shown in FIG. 3, the process may include the following steps:
Step 301: The mobile terminal enters high-speed imaging and determines whether a frame-rate selection operation from the user has been received. If a frame-rate selection operation is received, step 302 is performed; otherwise, shooting proceeds directly.

The user can select the frame rate according to the brightness of the current imaging environment. Multiple frame-rate values may be available for selection, for example 240 FPS, 120 FPS, and 60 FPS.

If the brightness of the imaging environment is insufficient, the user can select a frame rate of 240 FPS for shooting; if the brightness of the captured picture displayed in step 307 is still insufficient, the user can reselect a frame rate of 120 FPS; if the picture displayed in step 307 when shooting at 120 FPS is still insufficiently bright, the user can once again reselect a frame rate of 60 FPS.
Steps 302-303: The mobile terminal determines the frame rate used for the current imaging according to the received frame-rate selection operation input by the user.

The mobile terminal may determine twice the reciprocal of the frame rate as the first exposure time and the reciprocal of the frame rate as the second exposure time, the first exposure time being the exposure time of the main camera and the secondary camera. Step 304 is then performed.
Step 304: The mobile terminal determines the delay time of the main camera and determines the difference between the second exposure time and the delay time as the interval time. Step 305 is then performed.
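As a rough illustration of the timing arithmetic in steps 302-304, the following sketch computes the two exposure times and the start interval for the secondary camera; the function and variable names are illustrative assumptions, not part of the patent:

```python
def exposure_plan(fps: float, main_camera_delay: float):
    """Timing quantities for the dual-camera high-speed scheme:
    second exposure = one frame period (1/fps),
    first exposure  = twice the frame period (used by both cameras),
    interval        = second exposure minus the main camera's startup delay,
                      i.e. how long to wait before starting the secondary camera."""
    second_exposure = 1.0 / fps
    first_exposure = 2.0 / fps
    interval = second_exposure - main_camera_delay
    return first_exposure, second_exposure, interval

# e.g. 120 FPS with an assumed 2 ms main-camera startup delay
first, second, gap = exposure_plan(120.0, 0.002)
```

With the two cameras offset by one frame period in this way, each camera can expose for two frame periods (gaining brightness) while the pair together still delivers one frame per 1/fps.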
Step 305: The mobile terminal (in particular, the controller of the mobile terminal) starts imaging with the main camera, delays for the interval time, and then starts imaging with the secondary camera, storing the image data obtained by the main camera and the secondary camera in imaging order. Optionally, the mobile terminal may store one frame of image data obtained by the main camera and mark it with a first marker, then store one frame of image data obtained by the secondary camera and mark it with a second marker; it then continues by storing the next frame of image data obtained by the main camera, marking it with the first marker, then storing the next frame of image data obtained by the secondary camera, marking it with the second marker, and so on. After one frame of image data from each of the main camera and the secondary camera has been stored, step 306 is performed.
Step 306: The mobile terminal crops the stored i-th frame of image data obtained by the main camera according to the first crop box, cropping away the image data that does not overlap with the i-th frame of image data from the secondary camera; and crops the stored i-th frame of image data obtained by the secondary camera according to the second crop box, cropping away the image data that does not overlap with the i-th frame of image data from the main camera. The uncropped image data from the main camera and from the secondary camera form the overlapping image data, where i = 1, 2, ... N, and N is a positive integer.

The process of determining the first crop box and the second crop box may include: the mobile terminal determines the first crop box according to the frame size and the overlap width of the main camera, and determines the second crop box according to the frame size and the overlap width of the secondary camera.

The process of determining the overlap width of the main camera may include: the mobile terminal determines, as the overlap width of the main camera, the sum of the distance from the imaging position of the subject on the main camera to the right edge of the main camera's frame and the distance from the imaging position of the subject on the secondary camera to the left edge of the secondary camera's frame.

The process of determining the overlap width of the secondary camera may include: the mobile terminal determines, as the overlap width of the secondary camera, the sum of the distance from the imaging position of the subject on the secondary camera to the left edge of the secondary camera's frame and the distance from the imaging position of the subject on the main camera to the right edge of the main camera's frame.
Step 307: The mobile terminal compresses the overlapping image data of the main camera and of the secondary camera separately, based on a high-compression-ratio parameter of video coding, and displays the compressed overlapping image data.
FIG. 4 is a timing diagram of starting imaging with the main camera and the secondary camera according to an embodiment of the present disclosure. As shown in FIG. 4, the process may include the following steps:

Step 1: The controller of the mobile terminal starts the main camera.

Step 2: The controller of the mobile terminal receives the first frame of image data obtained by the main camera and previews the image formed from that data; there is a certain delay time between starting the main camera and the main camera producing image data for preview.

Step 3: The controller of the mobile terminal starts the secondary camera after delaying for the interval time, the interval time being the difference between the second exposure time and the delay time, and the second exposure time being the reciprocal of the frame rate.

Step 4: The controller of the mobile terminal receives the first frame of image data obtained by the secondary camera and previews the image formed from that data; there is a certain delay time between starting the secondary camera and the secondary camera producing image data for preview.
Step 5: The controller of the mobile terminal receives the second frame of image data obtained by the main camera.

Step 6: The controller of the mobile terminal stores the first frame of image data obtained by the main camera in a buffer and marks it with the first marker, for example 0; the controller then stores the first frame of image data obtained by the secondary camera in the buffer and marks it with the second marker, for example 1.

Step 7: The controller of the mobile terminal receives the second frame of image data obtained by the secondary camera.

Step 8: The controller of the mobile terminal stores the second frame of image data obtained by the main camera in the buffer and marks it with the first marker, for example 0; the controller then stores the second frame of image data obtained by the secondary camera in the buffer and marks it with the second marker, for example 1. Each subsequent frame of image data obtained by the main camera and the secondary camera is stored in the buffer in the same way, and so on.

It should be noted that step 6 has no strict chronological order relative to step 5 or step 7, as long as it is performed after steps 3 and 4.
FIG. 5 is a schematic diagram of the use of the buffer according to an embodiment of the present disclosure. As shown in FIG. 5, the buffer is configured to implement the writing and reading of the image data obtained by the main camera and the secondary camera. The implementation may include the following.

Storing the image data obtained by the main camera and the secondary camera into the buffer may include:

The pointer Pin initially points to the head of the buffer; each time one frame of image data is stored in the buffer, Pin is incremented by one. Image data obtained by the main camera is stored at even buffer indices, such as 0 and 2, and is marked with the first marker, for example the digit 0; image data obtained by the secondary camera is stored at odd buffer indices, such as 1 and 3, and is marked with the second marker, for example the digit 1.

Reading the image data obtained by the main camera and the secondary camera from the buffer may include:

The pointer Pout initially points to the head of the buffer; each time one frame of image data is read from the buffer, Pout is incremented by one.

Here, Pin identifies the position at which image data is written and Pout identifies the position from which image data is read, and synchronization between the two must be used.
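The Pin/Pout buffer described above can be sketched roughly as follows. The class and method names are illustrative assumptions, and a real implementation would write into preallocated frame memory rather than Python objects; the lock stands in for the synchronization the text requires between writer and reader:

```python
from collections import deque
import threading

class InterleavedFrameBuffer:
    """Frames from the two cameras are written alternately; even write
    positions hold main-camera frames (marker 0), odd positions hold
    secondary-camera frames (marker 1)."""

    def __init__(self):
        self._frames = deque()
        self._lock = threading.Lock()  # synchronizes Pin (writer) and Pout (reader)
        self.pin = 0                   # write position
        self.pout = 0                  # read position

    def write(self, frame):
        with self._lock:
            marker = 0 if self.pin % 2 == 0 else 1  # even: main camera, odd: secondary
            self._frames.append((marker, frame))
            self.pin += 1

    def read(self):
        """Return (marker, frame) in imaging order, or None if nothing unread."""
        with self._lock:
            if self.pout >= self.pin:
                return None
            marker, frame = self._frames.popleft()
            self.pout += 1
            return marker, frame
```

Reading in Pin order automatically yields the interleaved sequence main, secondary, main, secondary, ... that step 305 stores.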
FIG. 6 is a schematic diagram of determining the overlap width of the main camera and the overlap width of the secondary camera according to an embodiment of the present disclosure. As shown in FIG. 6, the determination may include:

A coordinate system is established in which the coordinate origin of the main camera is O_left and the coordinate origin of the secondary camera is O_right, where left denotes the left camera and right denotes the right camera. The normal direction is the Y axis, positive upward, and the horizontal direction is the X axis, positive to the right. The imaging position of the subject on the left (main) camera is X_left, and the imaging position of the subject on the right (secondary) camera is X_right. The distance between the optical axes of the lenses of the main camera and the secondary camera is T, the focal length of both cameras is f, and the perpendicular distance from the subject to the line connecting the coordinate origins of the main camera and the secondary camera is Z.
FIG. 7 is another schematic diagram of determining the overlap width of the main camera and the overlap width of the secondary camera according to an embodiment of the present disclosure. As shown in FIG. 7, the determination may include:

The sum of the distance from the imaging position X_left of the subject on the main camera to the right edge of the main camera's frame and the distance from the imaging position X_right of the subject on the secondary camera to the left edge of the secondary camera's frame is determined as the overlap width of the main camera, expressed by the formula: w/2 - X_left + (X_right - (-w/2)) = w - (X_left - X_right) = w - d, where w is the width of the captured frame and d is the horizontal disparity of the subject's image between the main camera and the secondary camera; here the main camera and the secondary camera are placed horizontally.

The sum of the distance from the imaging position X_right of the subject on the secondary camera to the left edge of the secondary camera's frame and the distance from the imaging position X_left of the subject on the main camera to the right edge of the main camera's frame is determined as the overlap width of the secondary camera, expressed by the formula: X_right - (-w/2) + w/2 - X_left = w - (X_left - X_right) = w - d, where w is the width of the captured frame and d is the horizontal disparity of the subject's image between the main camera and the secondary camera; here the main camera and the secondary camera are placed horizontally.
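As a numerical check of the two formulas above, here is a minimal sketch; the names are illustrative, and coordinates are measured from each frame's centre so that the edges lie at +w/2 and -w/2 as in FIG. 7:

```python
def overlap_width(w: float, x_left: float, x_right: float) -> float:
    """Overlap width of the two horizontally aligned cameras:
    w - (x_left - x_right) = w - d, where d is the horizontal
    disparity of the subject between the two frames."""
    d = x_left - x_right  # disparity of the subject's image
    return w - d

# e.g. frame width 1920 px, subject imaged at +60 px on the main
# camera and -40 px on the secondary camera
width = overlap_width(1920.0, 60.0, -40.0)
```

Both formulas reduce to the same value, which is why the main and secondary overlap widths are equal when the cameras are level and share a frame width.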
The data may be cropped according to the first crop box and the second crop box. Here, the process of determining the first crop box may include:

determining the first crop box according to the frame size and the overlap width of the main camera.

Optionally, a coordinate system is established on the frame captured by the main camera, with the coordinate origin at the lower-left corner of the frame. The first crop box has width w - d and height h, so its corner coordinates are: (d, 0), (w, 0), (d, h), (w, h).
The process of determining the second crop box may include:

determining the second crop box according to the frame size and the overlap width of the secondary camera.

Optionally, a coordinate system is established on the frame captured by the secondary camera, with the coordinate origin at the lower-left corner of the frame. The second crop box has width w - d and height h, so its corner coordinates are: (0, 0), (w - d, 0), (0, h), (w - d, h).
If the main camera and the secondary camera scale the captured image by a scaling factor ratio, the first crop box and the second crop box may be adjusted accordingly:

The coordinates of the first crop box become: (d×ratio, 0), (w×ratio, 0), (d×ratio, h×ratio), (w×ratio, h×ratio); the coordinates of the second crop box become: (0, 0), (w×ratio - d×ratio, 0), (0, h×ratio), (w×ratio - d×ratio, h×ratio).
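The crop-box coordinates above, including the optional scaling by ratio, can be sketched as follows; the function name and the corner-tuple representation are illustrative assumptions:

```python
def crop_boxes(w: float, h: float, d: float, ratio: float = 1.0):
    """Corner coordinates of the two crop boxes (origin at the
    lower-left corner of each frame), optionally scaled by `ratio`.
    The main camera keeps its right w-d columns; the secondary
    camera keeps its left w-d columns."""
    w, h, d = w * ratio, h * ratio, d * ratio
    first = [(d, 0.0), (w, 0.0), (d, h), (w, h)]                # main camera
    second = [(0.0, 0.0), (w - d, 0.0), (0.0, h), (w - d, h)]   # secondary camera
    return first, second
```

Both boxes have width w - d (after scaling), matching the overlap width derived above, so the two cropped streams cover the same scene region.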
An embodiment of the present disclosure further provides a computer-readable storage medium storing computer-executable instructions that, when executed, implement the processing method in imaging described above.
Those of ordinary skill in the art will appreciate that all or some of the steps of the methods disclosed above, and the functional modules/units of the systems and apparatuses, may be implemented as software, firmware, hardware, or suitable combinations thereof. In a hardware implementation, the division between the functional modules/units mentioned in the above description does not necessarily correspond to the division of physical components; for example, one physical component may have multiple functions, or one function or step may be performed cooperatively by several physical components. Some or all of the components may be implemented as software executed by a processor such as a digital signal processor or a microprocessor, or as hardware, or as an integrated circuit such as an application-specific integrated circuit. Such software may be distributed on computer-readable media, which may include computer storage media (or non-transitory media) and communication media (or transitory media). As is well known to those of ordinary skill in the art, the term computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storing information such as computer-readable instructions, data structures, program modules, or other data. Computer storage media include, but are not limited to, random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical disc storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and can be accessed by a computer. In addition, as is well known to those of ordinary skill in the art, communication media typically embody computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism, and may include any information delivery media.
A person of ordinary skill in the art will understand that modifications or equivalent substitutions may be made to the technical solutions of the present disclosure without departing from the spirit and scope of the technical solutions of the present disclosure, and all such modifications and substitutions shall fall within the scope of the claims of the present disclosure.
Industrial applicability
The processing method and apparatus in imaging provided by the embodiments of the present disclosure determine a frame rate; determine a first exposure time and a second exposure time according to the frame rate, the first exposure time being the exposure time of the main camera and the secondary camera; determine, according to the second exposure time, an interval time by which the start of the secondary camera is delayed relative to the main camera; and start imaging with the main camera, delay for the interval time, start imaging with the secondary camera, and store the image data obtained by the main camera and the secondary camera respectively in imaging order. In this way, the main camera and the secondary camera increase the imaging brightness through the first exposure time, and using the second exposure time as the interval between the starts of the main camera and the secondary camera enables high-speed imaging, thereby improving the imaging quality of high-speed imaging.

Claims (16)

  1. A processing method in imaging, the method comprising:
    determining a frame rate;
    determining a first exposure time and a second exposure time according to the frame rate, the first exposure time being an exposure time of a main camera and a secondary camera;
    determining, according to the second exposure time, an interval time by which the start of the secondary camera is delayed relative to the main camera; and
    starting imaging with the main camera, delaying for the interval time, starting imaging with the secondary camera, and storing the image data obtained by the main camera and the secondary camera respectively in imaging order.
  2. The method according to claim 1, wherein determining the frame rate comprises:
    receiving a frame-rate selection operation, and determining, according to the frame-rate selection operation, the frame rate used for the current imaging.
  3. The method according to claim 1, wherein determining the first exposure time and the second exposure time according to the frame rate comprises:
    determining twice the reciprocal of the frame rate as the first exposure time; and
    determining the reciprocal of the frame rate as the second exposure time.
  4. The method according to claim 1, wherein determining, according to the second exposure time, the interval time by which the start of the secondary camera is delayed relative to the main camera comprises:
    determining a delay time of the main camera, and determining the difference between the second exposure time and the delay time as the interval time.
  5. The method according to claim 1, wherein storing the image data obtained by the main camera and the secondary camera respectively in imaging order comprises:
    storing the i-th frame of image data obtained by the main camera and marking it with a first marker, then storing the i-th frame of image data obtained by the secondary camera and marking it with a second marker; then continuing by storing the (i+1)-th frame of image data obtained by the main camera and marking it with the first marker, then storing the (i+1)-th frame of image data obtained by the secondary camera and marking it with the second marker, and so on;
    wherein i = 1, 2, ... N, and N is a positive integer.
  6. The method according to claim 1, further comprising, after storing the image data obtained by the main camera and the secondary camera respectively in imaging order:
    cropping the stored image data to obtain overlapping image data of the main camera and the secondary camera.
  7. The method according to claim 6, wherein cropping the stored image data to obtain the overlapping image data of the main camera and the secondary camera comprises:
    cropping the stored i-th frame of image data obtained by the main camera according to a first crop box, cropping away the image data that does not overlap with the i-th frame of image data from the secondary camera;
    and cropping the stored i-th frame of image data obtained by the secondary camera according to a second crop box, cropping away the image data that does not overlap with the i-th frame of image data from the main camera;
    wherein the uncropped image data from the main camera and the uncropped image data from the secondary camera form the overlapping image data;
    and wherein i = 1, 2, ... N, and N is a positive integer.
  8. The method according to claim 7, wherein
    determining the first crop box and the second crop box comprises:
    determining the first crop box according to the frame size and the overlap width of the main camera; and
    determining the second crop box according to the frame size and the overlap width of the secondary camera.
  9. The method according to claim 8, wherein
    determining the overlap width of the main camera comprises:
    determining, as the overlap width of the main camera, the sum of the distance from the imaging position of the subject on the main camera to the right edge of the main camera's frame and the distance from the imaging position of the subject on the secondary camera to the left edge of the secondary camera's frame;
    or,
    determining the overlap width of the secondary camera comprises:
    determining, as the overlap width of the secondary camera, the sum of the distance from the imaging position of the subject on the secondary camera to the left edge of the secondary camera's frame and the distance from the imaging position of the subject on the main camera to the right edge of the main camera's frame;
    or,
    determining the overlap width of the main camera comprises: determining, as the overlap width of the main camera, the sum of the distance from the imaging position of the subject on the main camera to the right edge of the main camera's frame and the distance from the imaging position of the subject on the secondary camera to the left edge of the secondary camera's frame; and determining the overlap width of the secondary camera comprises: determining, as the overlap width of the secondary camera, the sum of the distance from the imaging position of the subject on the secondary camera to the left edge of the secondary camera's frame and the distance from the imaging position of the subject on the main camera to the right edge of the main camera's frame.
  10. The method according to claim 7, wherein after the obtaining of the overlap imaging data of the main camera and the auxiliary camera, the method further comprises:
    encoding the overlap imaging data;
    displaying the encoded overlap imaging data.
  11. A processing apparatus in imaging, the apparatus comprising:
    a first determining module, configured to determine a frame rate;
    a second determining module, configured to determine a first exposure time and a second exposure time according to the frame rate, wherein the first exposure time is the exposure time of the main camera and the auxiliary camera;
    a third determining module, configured to determine, according to the second exposure time, an interval by which the auxiliary camera starts later than the main camera;
    a starting module, configured to start imaging with the main camera, start imaging with the auxiliary camera after a delay of the interval, and store the imaging data obtained by the main camera and the auxiliary camera respectively in imaging order.
  12. The apparatus according to claim 11, further comprising a cropping module;
    the cropping module is configured to crop the stored imaging data to obtain overlap imaging data of the main camera and the auxiliary camera.
  13. The apparatus according to claim 12, further comprising a display module;
    the display module is configured to encode the overlap imaging data, and to display the encoded overlap imaging data.
  14. The apparatus according to claim 12, wherein the cropping module is configured to: crop the stored i-th frame of imaging data obtained by the main camera according to a first cropping frame, cropping out the imaging data that does not overlap with the i-th frame of imaging data of the auxiliary camera; and crop the stored i-th frame of imaging data obtained by the auxiliary camera according to a second cropping frame, cropping out the imaging data that does not overlap with the i-th frame of imaging data of the main camera; the uncropped imaging data obtained by the main camera and the uncropped imaging data obtained by the auxiliary camera constitute the overlap imaging data; wherein i = 1, 2, ..., N, and N is a positive integer.
  15. The apparatus according to claim 14, wherein the cropping module is further configured to: determine the first cropping frame according to the frame size of the main camera's imaging and the overlap data width; and determine the second cropping frame according to the frame size of the auxiliary camera's imaging and the overlap data width.
  16. The apparatus according to claim 15, wherein
    the cropping module is further configured to:
    determine the sum of the distance from the imaging position of the subject on the main camera to the right edge of the main camera's imaging frame and the distance from the imaging position of the subject on the auxiliary camera to the left edge of the auxiliary camera's imaging frame as the overlap data width for the main camera's imaging; or,
    determine the sum of the distance from the imaging position of the subject on the auxiliary camera to the left edge of the auxiliary camera's imaging frame and the distance from the imaging position of the subject on the main camera to the right edge of the main camera's imaging frame as the overlap data width for the auxiliary camera's imaging; or,
    determine the sum of the distance from the imaging position of the subject on the main camera to the right edge of the main camera's imaging frame and the distance from the imaging position of the subject on the auxiliary camera to the left edge of the auxiliary camera's imaging frame as the overlap data width for the main camera's imaging; and determine the sum of the distance from the imaging position of the subject on the auxiliary camera to the left edge of the auxiliary camera's imaging frame and the distance from the imaging position of the subject on the main camera to the right edge of the main camera's imaging frame as the overlap data width for the auxiliary camera's imaging.
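As an illustrative aid only (not part of the claims), the staggered-start and cropping scheme recited above can be sketched in Python. All function names here are hypothetical, and the choice of a half-frame-period offset is an assumption consistent with interleaving two equal-rate streams to double the effective frame rate; in the claims the interval is derived from the second exposure time, and the overlap data width is derived from the subject's imaging positions as recited in claims 9 and 16.

```python
def stagger_interval(frame_rate_per_camera):
    """Interval by which the auxiliary camera starts after the main one.

    Assumption: with two cameras each capturing at the same rate, offsetting
    the auxiliary camera by half a frame period places its frames midway
    between the main camera's, doubling the effective frame rate.
    """
    frame_period = 1.0 / frame_rate_per_camera
    return frame_period / 2.0


def interleave(main_frames, aux_frames):
    """Merge the two frame streams in capture order, main camera first."""
    merged = []
    for m, a in zip(main_frames, aux_frames):
        merged.append(m)
        merged.append(a)
    return merged


def crop_to_overlap(frame_width, overlap_width, camera):
    """Return the (left, right) column range kept for one camera.

    Hypothetical convention matching the claims' geometry: the main camera
    keeps its rightmost `overlap_width` columns and the auxiliary camera its
    leftmost ones, so both cropped frames show the same scene region.
    """
    if camera == "main":
        return (frame_width - overlap_width, frame_width)
    return (0, overlap_width)
```

For example, two 120 fps cameras offset by `stagger_interval(120)` (about 4.17 ms) yield an interleaved 240 fps stream, which plays back as slow motion at a normal display rate.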
PCT/CN2017/098515 2016-12-07 2017-08-22 Processing method in video recording and apparatus WO2018103371A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201611115175.2A CN108174112B (en) 2016-12-07 2016-12-07 Processing method and device in camera shooting
CN201611115175.2 2016-12-07

Publications (1)

Publication Number Publication Date
WO2018103371A1 true WO2018103371A1 (en) 2018-06-14

Family

ID=62490603

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/098515 WO2018103371A1 (en) 2016-12-07 2017-08-22 Processing method in video recording and apparatus

Country Status (2)

Country Link
CN (1) CN108174112B (en)
WO (1) WO2018103371A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113784048B (en) * 2019-08-01 2023-09-19 深圳市道通智能航空技术股份有限公司 Camera imaging method, camera system and unmanned aerial vehicle
CN111726543B (en) * 2020-06-30 2022-12-09 杭州萤石软件有限公司 Method and camera for improving dynamic range of image
CN115460355B (en) * 2022-08-31 2024-03-29 青岛海信移动通信技术有限公司 Image acquisition method and device

Citations (5)

Publication number Priority date Publication date Assignee Title
US20090278965A1 (en) * 2008-05-07 2009-11-12 Ko Kuk-Won High-speed photographing apparatus using plural cameras
US20140232904A1 (en) * 2013-02-21 2014-08-21 Samsung Electronics Co., Ltd. Combining effective images in electronic device having a plurality of cameras
CN104363374A (en) * 2014-11-28 2015-02-18 广东欧珀移动通信有限公司 High-frame-rate video generation method and device and terminal
CN106161943A (en) * 2016-07-29 2016-11-23 维沃移动通信有限公司 A kind of kinescope method and mobile terminal
CN106210584A (en) * 2016-08-02 2016-12-07 乐视控股(北京)有限公司 A kind of video recording method and device

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
CN100574376C (en) * 2006-03-27 2009-12-23 精工爱普生株式会社 Camera head, camera system and image capture method


Also Published As

Publication number Publication date
CN108174112B (en) 2020-11-13
CN108174112A (en) 2018-06-15

Similar Documents

Publication Publication Date Title
US9813607B2 (en) Method and apparatus for image capture targeting
US10367997B2 (en) Enriched digital photographs
JP6388673B2 (en) Mobile terminal and imaging method thereof
US20120057786A1 (en) Image processing apparatus, image processing method, image pickup apparatus, and storage medium storing image processing program
US9426369B2 (en) Imaging device
US20110007187A1 (en) Imaging Device And Image Playback Device
JP2016535552A (en) Method and apparatus for obtaining a photograph
KR20160128366A (en) Mobile terminal photographing method and mobile terminal
TW201617717A (en) Method and system for stabilization and reframing
KR20130112574A (en) Apparatus and method for improving quality of enlarged image
KR102004990B1 (en) Device and method of processing images
US20160295111A1 (en) Image processing apparatus that combines images
EP3316568B1 (en) Digital photographing device and operation method therefor
WO2018103371A1 (en) Processing method in video recording and apparatus
CN105635568B (en) Image processing method and mobile terminal in a kind of mobile terminal
CN110072058A (en) Image capturing device, method and terminal
JP2006303961A (en) Imaging apparatus
CN110072059A (en) Image capturing device, method and terminal
CN107071277B (en) Optical drawing shooting device and method and mobile terminal
JP2014050022A (en) Image processing device, imaging device, and program
US11190670B2 (en) Method and a system for processing images based a tracked subject
JP2018137797A (en) Imaging apparatus, imaging method and program
CN105530426A (en) Image capturing apparatus, control method thereof, and storage medium
CN108965686A (en) The method and device taken pictures
US9955135B2 (en) Image processing apparatus, image processing method, and program wherein a RAW image to be subjected to special processing is preferentially subjected to development

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17878966

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17878966

Country of ref document: EP

Kind code of ref document: A1