WO2020134123A1 - Panoramic shooting method and device, camera, and mobile terminal - Google Patents
Panoramic shooting method and device, camera, and mobile terminal
- Publication number
- WO2020134123A1 WO2020134123A1 PCT/CN2019/101489 CN2019101489W WO2020134123A1 WO 2020134123 A1 WO2020134123 A1 WO 2020134123A1 CN 2019101489 W CN2019101489 W CN 2019101489W WO 2020134123 A1 WO2020134123 A1 WO 2020134123A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- offset
- subject
- image
- panoramic
- panoramic image
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims abstract description 47
- 238000003384 imaging method Methods 0.000 claims abstract description 32
- 238000013507 mapping Methods 0.000 claims abstract description 32
- 230000015654 memory Effects 0.000 claims description 18
- 238000012545 processing Methods 0.000 claims description 15
- 238000012937 correction Methods 0.000 claims description 13
- 238000004590 computer program Methods 0.000 claims description 11
- 230000003796 beauty Effects 0.000 abstract description 4
- 230000008569 process Effects 0.000 description 12
- 238000010586 diagram Methods 0.000 description 8
- 238000004364 calculation method Methods 0.000 description 7
- 230000006870 function Effects 0.000 description 7
- 239000011159 matrix material Substances 0.000 description 7
- 230000005540 biological transmission Effects 0.000 description 6
- 230000009466 transformation Effects 0.000 description 6
- 238000004422 calculation algorithm Methods 0.000 description 5
- 230000000694 effects Effects 0.000 description 5
- 238000004891 communication Methods 0.000 description 2
- 238000005516 engineering process Methods 0.000 description 2
- 238000012986 modification Methods 0.000 description 2
- 230000004048 modification Effects 0.000 description 2
- 230000015572 biosynthetic process Effects 0.000 description 1
- 238000006243 chemical reaction Methods 0.000 description 1
- 238000009499 grossing Methods 0.000 description 1
- 230000006872 improvement Effects 0.000 description 1
- 238000010295 mobile communication Methods 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 238000003786 synthesis reaction Methods 0.000 description 1
- 238000012360 testing method Methods 0.000 description 1
- 238000000844 transformation Methods 0.000 description 1
- 238000013519 translation Methods 0.000 description 1
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/81—Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4038—Image mosaicing, e.g. composing plane images from plane sub-images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/80—Geometric correction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/60—Noise processing, e.g. detecting, correcting, reducing or removing noise
- H04N25/61—Noise processing, e.g. detecting, correcting, reducing or removing noise the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4"
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20172—Image enhancement details
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
- G06T2207/30208—Marker matrix
Definitions
- This application relates to, but is not limited to, the field of photography, and in particular, to a panoramic shooting method and device, camera, and mobile terminal.
- The human eye can observe scenes with a large field of view, whereas current terminal cameras have a relatively small field of view because of lens specifications, so a single shot cannot cover the field of view of the human eyes.
- Panoramic photography was introduced to solve this problem.
- Typically, the photographer stands at the center and rotates a handheld device, capturing photographs from multiple angles that are finally stitched into a panoramic photograph.
- This allows a camera with a small field of view to produce a large-field-of-view picture, but the deformation caused by stitching panoramic photos currently has no solution.
- In the related art, the focus is either on controlling panoramic exposure so that the images to be stitched have consistent exposure, or on using two or more cameras to ensure that the images can be stitched reasonably, but a problem remains.
- Because the viewing angle of a panorama is very large, the image is affected by lens distortion on the one hand, and on the other hand the stitching points mostly lie at the image edges, so subjects at different distances are distorted and deformed to different degrees in the plane image, which degrades the user experience; the related art offers no effective solution to this problem.
- The embodiments of the present application provide a panoramic shooting method and device, a camera, and a mobile terminal, so as to at least solve the related-art problem of severe distortion of objects in panoramic images.
- According to one embodiment, a panoramic shooting method is provided, including: acquiring depth information of a subject in panoramic shooting; acquiring the offset corresponding to the subject in the panoramic image according to a preset mapping relationship between depth and offset; and correcting the panoramic image of the subject according to the offset.
- According to another embodiment, a panoramic shooting device is provided, including: a first acquisition module configured to acquire depth information of a subject in panoramic shooting; a second acquisition module configured to acquire the offset corresponding to the subject in the panoramic image according to a preset mapping relationship between depth and offset; and a first correction module configured to correct the panoramic image of the subject according to the offset.
- According to another embodiment, a camera is provided, including: a distance measuring device configured to acquire depth information of a subject in panoramic shooting; and a processor configured to acquire the offset corresponding to the subject in the panoramic image according to a preset mapping relationship between depth and offset, and to correct the panoramic image of the subject according to the offset.
- According to another embodiment, a mobile terminal is provided, including: a distance measuring device configured to acquire depth information of a subject in panoramic shooting; and a processor configured to acquire the offset corresponding to the subject in the panoramic image according to a preset mapping relationship between depth and offset, and to correct the panoramic image of the subject according to the offset.
- According to a further embodiment, a storage medium is provided in which a computer program is stored, wherein the computer program is configured to, when run, execute the steps in any one of the above method embodiments.
- According to a further embodiment, an electronic device is provided, including a memory and a processor, wherein the memory stores a computer program and the processor is configured to run the computer program to perform the steps in any one of the above method embodiments.
- FIG. 1 is a block diagram of a hardware structure of a mobile terminal of a panoramic shooting method according to an embodiment of the present application
- FIG. 2 is a flowchart of a panoramic shooting method according to an embodiment of the present application.
- FIG. 3 is a schematic structural diagram of a terminal according to another example of the present application.
- FIG. 4 is a flowchart of correcting a panoramic shot image according to another embodiment of the present application.
- FIG. 5 is a schematic diagram of a coordinate system according to another example of the present application.
- FIG. 6 is a schematic diagram of an imaging plane of a camera according to another embodiment of the present application.
- The embodiments in this application can be used in panoramic shooting scenarios, on related-art mobile phones, tablet computers, and cameras that support panoramic shooting.
- FIG. 1 is a hardware block diagram of a mobile terminal of a panoramic shooting method according to an embodiment of the present application.
- As shown in FIG. 1, the mobile terminal may include one or more processors 102 (only one is shown in FIG. 1; the processor 102 may include, but is not limited to, a processing device such as a microprocessor (MCU) or a programmable logic device (FPGA)) and a memory 104 configured to store data.
- Optionally, the mobile terminal may further include a transmission device 106 configured for communication and an input/output device 108.
- FIG. 1 is merely an illustration, which does not limit the structure of the mobile terminal described above.
- The mobile terminal may also include more or fewer components than those shown in FIG. 1, or have a different configuration from that shown in FIG. 1.
- The memory 104 may be configured to store software programs and modules of application software, such as program instructions/modules corresponding to the panoramic shooting method in the embodiments of the present application. By running the software programs and modules stored in the memory 104, the processor 102 executes various functional applications and data processing, i.e., implements the above method.
- the memory 104 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory.
- the memory 104 may further include memories remotely provided with respect to the processor 102, and these remote memories may be connected to the mobile terminal through a network. Examples of the aforementioned network include, but are not limited to, the Internet, intranet, local area network, mobile communication network, and combinations thereof.
- the transmission device 106 is configured to receive or transmit data via a network.
- the above-mentioned specific example of the network may include a wireless network provided by a communication provider of a mobile terminal.
- the transmission device 106 includes a network adapter (Network Interface Controller, NIC), which can be connected to other network devices through the base station to communicate with the Internet.
- In one example, the transmission device 106 may be a radio frequency (RF) module, which is configured to communicate with the Internet wirelessly.
- FIG. 2 is a flowchart of the panoramic shooting method according to an embodiment of the present application. As shown in FIG. 2, the process includes the following steps:
- Step S202 Obtain depth information of the subject in panoramic shooting
- The depth information here is a term from the photography field; it can be understood as the distance between the current subject and the imaging plane.
- Step S202 may be performed by a mobile terminal with a ranging function or by a third-party device attached to the mobile terminal; both fall within the scope of protection of this application.
- Step S204 Obtain the offset corresponding to the subject in the panoramic image according to the preset mapping relationship between the depth and the offset;
- The offset here refers to the distortion calibration amount, i.e., the displacement of the corresponding point after calibration.
- The mapping relationship can be pre-stored in the form of a table, and the corresponding offset can later be obtained by a direct table lookup.
- Using a table has the following advantage: when a complex fitted functional relationship would be too expensive to evaluate, or could not be evaluated in real time, the fitting result is turned into a table indexed by distance from which the calibration amount is looked up, so that during correction a table lookup plus interpolation is used, improving time efficiency.
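- As a minimal illustration of the table-lookup-plus-interpolation idea, the sketch below uses invented table values and assumes a single (dx, dy) offset per depth; the real preset table would come from the calibration procedure described later.

```python
import bisect

# Invented preset table: calibrated depths (meters, ascending) -> pixel offsets (dx, dy).
DEPTH_TABLE = [0.5, 1.0, 2.0, 4.0, 8.0]
OFFSET_TABLE = [(12.0, 3.0), (7.5, 1.8), (4.0, 0.9), (2.0, 0.4), (0.8, 0.1)]

def lookup_offset(depth: float) -> tuple[float, float]:
    """Return the offset for `depth` by table lookup plus linear interpolation."""
    if depth <= DEPTH_TABLE[0]:
        return OFFSET_TABLE[0]
    if depth >= DEPTH_TABLE[-1]:
        return OFFSET_TABLE[-1]
    i = bisect.bisect_right(DEPTH_TABLE, depth)      # first calibrated depth greater than depth
    z0, z1 = DEPTH_TABLE[i - 1], DEPTH_TABLE[i]
    t = (depth - z0) / (z1 - z0)                     # interpolation weight between the two rows
    (dx0, dy0), (dx1, dy1) = OFFSET_TABLE[i - 1], OFFSET_TABLE[i]
    return (dx0 + t * (dx1 - dx0), dy0 + t * (dy1 - dy0))

print(lookup_offset(1.5))  # offset for a subject measured at 1.5 m -> (5.75, 1.35)
```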
- Step S206 correct the panoramic image of the subject according to the offset.
- Correction may shift the original imaging position of the subject in the panoramic image according to the offset, or, more specifically, shift the pixels in which the subject is imaged.
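- A minimal sketch of such a shift, assuming the subject's pixels are given by a mask and a single integer offset; real correction would use per-pixel, sub-pixel offsets followed by the interpolation step described below.

```python
import numpy as np

def shift_subject_pixels(image: np.ndarray, mask: np.ndarray, dx: int, dy: int) -> np.ndarray:
    """Move the pixels belonging to one subject (mask == True) by (dx, dy)."""
    h, w = mask.shape
    out = image.copy()                       # background (and any uncovered holes) keep old values
    ys, xs = np.nonzero(mask)                # positions of the subject's pixels
    new_xs = np.clip(xs + dx, 0, w - 1)
    new_ys = np.clip(ys + dy, 0, h - 1)
    out[new_ys, new_xs] = image[ys, xs]      # place the subject at its shifted position
    return out
```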
- The correspondence between depth and calibration amount is established by first obtaining "object-image" data from calibration charts with known distance information, then using the calculation method described in another embodiment of the present application to obtain the corresponding functional relationship or table, which serves as the preset depth-to-calibration information.
- The role of this preset information is that, when a picture is taken on the terminal, the terminal's ranging function first obtains the distance to the subject, and the calibration amount is obtained by combining this distance with the preset information above.
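- A hedged sketch of how such preset information could be produced from "object-image" calibration data; the sample measurements and the choice of a fit in 1/depth are illustrative assumptions, not values from the application.

```python
import numpy as np

# Hypothetical calibration measurements: for each chart distance (m), the observed
# shift between the actual and ideal image positions of a calibration point (px).
depths = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
measured_offsets = np.array([12.1, 7.4, 4.1, 2.0, 0.9])

# Fit offset as a function of depth by least squares (a quadratic in 1/depth is an
# illustrative choice); the fit is then sampled into a table for fast lookup.
A = np.vstack([np.ones_like(depths), 1.0 / depths, 1.0 / depths**2]).T
coeffs, *_ = np.linalg.lstsq(A, measured_offsets, rcond=None)

table_depths = np.linspace(0.5, 8.0, 16)                    # preset depth grid
table_offsets = (np.vstack([np.ones_like(table_depths),
                            1.0 / table_depths,
                            1.0 / table_depths**2]).T @ coeffs)
print(dict(zip(np.round(table_depths, 2), np.round(table_offsets, 2))))
```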
- The mapping relationship is established as follows: perform panoramic shooting of a known scene; according to the positional relationships between different objects in the known scene, determine the ideal positions, in the captured picture, of objects at different depths; determine each object's offset according to the difference between its ideal position and its actual position in the captured picture; and establish the mapping relationship from each object's offset and its depth information.
- By photographing different known scenes multiple times, such as regular calibration charts, correspondences between different depths and offsets are collected, so that the established mapping relationship contains as many entries as possible.
- The offset of each object can, more specifically, be understood as the offset of each pixel in the object's image region.
- Determining the ideal positions in the captured picture of objects at different depths includes: obtaining the distance ratios between different objects in the known scene, and obtaining each object's ideal position in the captured picture according to those ratios. The distance ratios between objects in the known scene are mapped onto the captured picture; the ratios between objects in the picture should be consistent with those in the known scene. The ideal position of each object determined in this way is then compared with its original imaging position to obtain the object's offset.
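- A toy example of the ratio argument, using hypothetical one-dimensional positions for three objects A, B, and C on a line in the known scene:

```python
# Known scene layout (meters) and measured pixel positions in the panorama (hypothetical).
scene_x  = {"A": 0.0,   "B": 2.0,   "C": 5.0}
actual_x = {"A": 100.0, "B": 260.0, "C": 520.0}

# Anchor the ideal layout on A and C, and place every object so that the
# distance ratios in the picture match the ratios in the known scene.
span_scene = scene_x["C"] - scene_x["A"]
span_image = actual_x["C"] - actual_x["A"]
ideal_x = {k: actual_x["A"] + (v - scene_x["A"]) / span_scene * span_image
           for k, v in scene_x.items()}

offsets = {k: ideal_x[k] - actual_x[k] for k in scene_x}
print(offsets)   # {'A': 0.0, 'B': 8.0, 'C': 0.0} -> B's imaged position breaks the scene ratio
```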
- Before correcting the panoramic image of the subject according to the offset, the distortion calibration parameters of the camera module lens may be obtained, and the panoramic image calibrated according to those distortion calibration parameters.
- The camera is the one currently capturing the panoramic image; it may be integrated into a mobile terminal such as a mobile phone, or may be a standalone camera.
- With this scheme, before the process in FIG. 2 is performed, the distortion of the camera module can be calibrated so that the ideal position of the object is recovered more accurately.
- Correcting the panoramic image of the subject according to the offset includes: acquiring the original imaging position of the subject in the panoramic image, and moving the original imaging position by the offset.
- After correction, the corrected image is taken as the first image, the region to be interpolated in the first image is detected according to a preset rule, and that region is interpolated using pixels within a preset range around it.
- The image data are processed, and interpolation is applied wherever imaging points require it.
- The algorithm uses the correlation between a pixel to be processed and its surrounding pixels to compute the pixel's value, which addresses the limited detail and smoothness of conventional interpolation algorithms.
- Detecting the region to be interpolated in the first image according to a preset rule includes at least one of the following: detecting seams between frames in the first image and treating the seams as the region to be interpolated; detecting pixels in the first image whose applied offset exceeds a threshold and treating those pixels as the region to be interpolated.
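- A sketch of such a detection rule; the per-pixel offset map and the representation of seams as column indices are assumptions made for illustration.

```python
import numpy as np

def regions_to_interpolate(offset_map: np.ndarray, seam_columns: list[int],
                           threshold: float = 1.0, seam_halfwidth: int = 2) -> np.ndarray:
    """Build a boolean mask of pixels to re-interpolate: pixels whose applied offset
    magnitude exceeds `threshold`, plus narrow bands around frame-to-frame seams
    (given here as column indices). offset_map has shape (h, w, 2)."""
    h, w = offset_map.shape[:2]
    mask = np.linalg.norm(offset_map, axis=-1) > threshold   # large-offset pixels
    for c in seam_columns:                                   # band around each stitching seam
        lo, hi = max(0, c - seam_halfwidth), min(w, c + seam_halfwidth + 1)
        mask[:, lo:hi] = True
    return mask
```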
- The solution of another embodiment of the present application may be applied to panoramic shooting scenarios on devices with a ranging function.
- The main purpose is to improve the quality of panoramic shooting and enhance the user experience.
- Another example of this application includes the following two parts:
- The first part uses depth information for panoramic correction. Charts or real scenes at known distances are used in advance to obtain, for each subject and shooting distance, the actual and ideal image-point coordinates in the image plane; data fitting, least squares, or similar methods then determine the mapping between the subject and its ideal image-point coordinates, establishing the actual pixel position of the subject and the corresponding offset at different distances.
- When the terminal takes a panoramic photo, it first uses its ranging module to obtain the subject's depth/distance information, and then uses the known mapping relationship to obtain the offset by which the corresponding image point must be calibrated and to process the image data.
- The second part uses image correlation to process the image data to be calibrated: the image data are processed, and interpolation is performed wherever imaging points require it.
- The algorithm uses the correlation between a pixel to be processed and its surrounding pixels to compute the pixel's value, which addresses the limited detail and smoothness of conventional interpolation algorithms.
- FIG. 3 is a schematic structural diagram of a terminal according to another example of the present application.
- As shown in FIG. 3, the terminal includes an EEPROM for storage, a camera, a ranging module, a data processing module, and a display module.
- The terminal photographs the scene through the camera device, acquires the image, and processes the image data.
- The data processing here can be done by a data processing chip or implemented in software. Because a panoramic photo is the result of stitching and synthesizing individual frames, both the early calibration and the later correction of a panoramic photo can be decomposed into single-frame processing.
- FIG. 4 is a flowchart of correcting a panoramic shot image according to another embodiment of the present application. As shown in FIG. 4, it includes the following steps:
- Step 1: Calibrate the distortion of the camera module lens.
- First, a series of charts or real scenes containing position information is used to determine the calibration relationship.
- The charts are not limited to checkerboards, concentric circles, dot patterns, and similar subjects. Assuming that the pixel coordinates of the actual image are (x'', y''), the calibrated pixel coordinates (x', y') are expressed by a distortion model whose parameters are as follows:
- k1, k2, and k3 are radial distortion calibration parameters;
- p1 and p2 are tangential distortion calibration parameters.
- The corresponding functional relationship can be fitted to the pixel data using the least-squares method.
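- The explicit formula is not reproduced above; as a reference point, the standard radial-tangential (Brown-Conrady) model that parameters k1, k2, k3, p1, and p2 conventionally describe is given below, with r^2 = x''^2 + y''^2 in normalized image coordinates. This is the conventional form and is stated here as an assumption, not necessarily the application's exact expression.

```latex
\begin{aligned}
x' &= x''\,(1 + k_1 r^2 + k_2 r^4 + k_3 r^6) + 2p_1 x'' y'' + p_2\,(r^2 + 2x''^2),\\
y' &= y''\,(1 + k_1 r^2 + k_2 r^4 + k_3 r^6) + p_1\,(r^2 + 2y''^2) + 2p_2 x'' y''.
\end{aligned}
```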
- Step 2: Determine the functional relationship between object points of the subject and their ideal image points.
- Taking the imaging plane as the reference plane, the distances from the subject to this plane are denoted (Z_1, Z_2, ..., Z_n), where n is the number of distance groups at which charts or real scenes must be calibrated.
- The value of n can be adjusted according to the lens parameters, the computational load, and the imaging effect. In application, when the actual distance is not one of the values in the sequence, interpolation from adjacent distance values is used.
- FIG. 5 is a schematic diagram of a coordinate system according to another example of the present application
- FIG. 6 is a schematic diagram of a camera imaging plane according to another embodiment of the present application, as shown in FIGS. 5 and 6, assuming that at a distance Z, a certain subject
- the position information of one calibration point is represented by the coordinate P in the world coordinate system: P(X,Y,Z), converted to the homogeneous coordinate system is represented as: P(X,Y,Z,1), the world coordinate system is converted to the camera
- the transformation of the coordinate system is expressed in matrix form as follows:
- Other parameters can be obtained according to the least square method of the information corresponding to the calibration point.
- M P can determine the parameters in the matrix according to the data of the calibration point and the projection model, and the projection model is adjusted according to the data of the panoramic calibration point.
- The camera-plane coordinate system is expressed in meters, but when capturing images with the camera we read pixel coordinates, so a further coordinate conversion through a matrix M_s is required.
- M_s is also called the camera's intrinsic matrix; after the homogeneous transformation, (x'', y'') carries the distance information and represents the actual pixel coordinates at distance Z.
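- A compact sketch of the world-to-pixel chain just described, using a row-vector convention and illustrative matrices; the real M_w, M_p, and M_s would be fitted from calibration-point data rather than assumed.

```python
import numpy as np

# Illustrative matrices only; in the method they are fitted from calibration-point data.
M_w = np.eye(4)                                   # world -> camera (rigid transform, row-vector form)
f = 0.004                                         # assumed focal length in meters
M_p = np.diag([f, f, 1.0, 1.0])                   # camera -> image-plane projection
pixel_size, cx, cy = 2e-6, 2000.0, 1500.0         # assumed pixel pitch and principal point
M_s = np.array([[1 / pixel_size, 0.0, 0.0],       # meters -> pixels (intrinsic matrix)
                [0.0, 1 / pixel_size, 0.0],
                [cx,  cy,             1.0]])

def world_to_pixel(P_world):
    """Map P(X, Y, Z) in world coordinates to actual pixel coordinates (x'', y'')."""
    P = np.append(np.asarray(P_world, dtype=float), 1.0)   # homogeneous P(X, Y, Z, 1)
    P_c = P @ M_w                                           # camera coordinates
    P_p = P_c @ M_p                                         # projected onto the image plane
    x_m, y_m = P_p[0] / P_p[2], P_p[1] / P_p[2]             # perspective divide (meters)
    u, v, _ = np.array([x_m, y_m, 1.0]) @ M_s               # convert to pixel coordinates
    return u, v

print(world_to_pixel((0.1, 0.05, 2.0)))                     # -> (2100.0, 1550.0)
```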
- Step 3: While taking the panoramic photo, use the terminal's ranging module to measure the depth of the subject and obtain the coordinates (X, Y, Z), load the data stored in the EEPROM, and compute the calibrated coordinates (x, y). If the measured distance is not within the previously calibrated distance sequence (Z_1 ... Z_n), first determine which two calibrated distances the current distance lies between, load the calibration parameters at those two distances, compute the corresponding calibration matrix by interpolation, and finally obtain the corresponding pixel coordinates.
- To reduce computation, the processing can be simplified appropriately: collect dense calibration data for nearby objects, increase the sampling interval for more distant objects, or skip calibration for distant regions where this does not affect the subjective result.
- To reduce computation time, (Z_1 ... Z_n), (X, Y, Z), and the corresponding calibrated pixel coordinates at different distances can be stored directly as a table, and subsequent calibration can be carried out by table lookup.
- Step 4: Use image correlation to interpolate the final image.
- On the one hand, because the evenly or regularly distributed nodes of the calibration chart are used as the calibration data, pixels lying between the calibration points receive no processing, so interpolation is required.
- On the other hand, interpolation is also required when the lens distortion is large.
- At distance Z, let the actual imaging point p have image coordinates (x, y) and the calibrated imaging point p' have image coordinates (x+Δx, y+Δy). When the calibrated coordinates (x+Δx, y+Δy) are not integers, interpolation is needed to obtain the pixel value at the corresponding integer position.
- When the offset (Δx, Δy) exceeds 1, pixels must be filled in between the actual coordinates (x, y) and the calibrated coordinates (x+Δx, y+Δy).
- Unlike conventional interpolation, the correlation between the pixel to be filled in and its surrounding pixels is used here to obtain the pixel value at the position to be processed.
- Let p be the actual pixel and p_i the pixel at the position to be solved;
- p_j denotes the pixel values within the neighborhood of the actual pixel p;
- the neighborhood is defined as v = k1 * max(Δx, Δy), with k1 ≥ 1;
- w(i, j) is the correlation weight between p_i and p_j;
- Dis(i, j) = ||p_i - p_j|| is the Euclidean distance between p_i and p_j;
- k2 and k3 are adjustable parameters.
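- A sketch of this correlation-weighted fill-in. The neighborhood radius follows v = k1 * max(Δx, Δy) from the text, but the explicit weight formula is not reproduced above, so the inverse-distance weight used below is an assumption for illustration only.

```python
import numpy as np

def fill_pixel(image: np.ndarray, yi: int, xi: int,
               dx: float, dy: float, k1: float = 1.0,
               k2: float = 1.0, k3: float = 2.0) -> float:
    """Estimate the value of pixel p_i at (yi, xi) from the pixels p_j in a
    neighborhood of radius v = k1 * max(|dx|, |dy|), weighting each neighbor
    by a correlation weight that decays with the distance Dis(i, j).
    The weight form 1 / (k2 + k3 * Dis) is an assumption for this sketch."""
    h, w = image.shape
    v = max(1, int(np.ceil(k1 * max(abs(dx), abs(dy)))))      # neighborhood radius
    ys = np.arange(max(0, yi - v), min(h, yi + v + 1))
    xs = np.arange(max(0, xi - v), min(w, xi + v + 1))
    yy, xx = np.meshgrid(ys, xs, indexing="ij")
    dis = np.hypot(yy - yi, xx - xi)                          # Dis(i, j)
    weights = 1.0 / (k2 + k3 * dis)                           # assumed decay with distance
    weights[(yy == yi) & (xx == xi)] = 0.0                    # exclude the unknown pixel itself
    return float(np.sum(weights * image[yy, xx]) / np.sum(weights))
```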
- Step 5: Calibration is complete, and the corrected panoramic image is saved.
- The algorithm of this further embodiment belongs to a built-in system; it has no independent user interface and is implemented mainly through the terminal's camera.
- When the above function is in use, the camera application interface may display a prompt such as "Panorama shooting in progress".
- The above solution can be used to improve the image quality of panoramic shooting and can also be applied to 360° scene video surveillance.
- The method according to the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and of course also by hardware, but in many cases the former is the better implementation.
- The technical solution of the present application can essentially be embodied in the form of a software product stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc).
- The software product includes several instructions to enable a terminal device (which may be a mobile phone, computer, server, or network device, etc.) to execute the methods described in the embodiments of the present application.
- a panoramic shooting device is also provided.
- the device is configured to implement the above-mentioned embodiments and preferred implementation modes, and those that have already been described will not be repeated.
- As used below, the term "module" may refer to a combination of software and/or hardware that implements predetermined functions.
- Although the devices described in the following embodiments are preferably implemented in software, implementation in hardware, or in a combination of software and hardware, is also possible and contemplated.
- a panoramic shooting device including:
- the first obtaining module is configured to obtain the depth information of the subject in the panoramic shooting
- the second acquiring module is configured to acquire the offset corresponding to the subject in the panoramic image according to the preset mapping relationship between the depth and the offset;
- the first correction module is configured to correct the panoramic image of the subject according to the offset.
- The device may further include an establishing module configured to establish the mapping relationship, before the second acquiring module acquires the offset corresponding to the subject in the panoramic image according to the preset mapping relationship between depth and offset, by means of the following units:
- the first determining unit is configured to perform panoramic shooting of a known scene and, according to the positional relationships between different objects in the known scene, determine the ideal positions, in the captured picture, of objects at different depths;
- the second determining unit is set to determine the offset of each object according to the difference between the ideal position of each object and the actual position in the shooting screen;
- the establishing unit is configured to establish the mapping relationship according to the offset of each object and the depth information of each object.
- the first determining unit is further configured to obtain a distance ratio between different objects in the known scene; and to obtain an ideal position of each object in the shooting screen according to the distance ratio.
- Before correcting the panoramic image of the subject according to the offset, the first correction module is further configured to acquire the distortion calibration parameters of the camera module lens of the camera, and to calibrate the panoramic image according to those distortion calibration parameters.
- the first correction module is configured to acquire the original imaging position of the subject in the panoramic image; and to move the original imaging position according to the offset.
- After correcting the panoramic image of the subject according to the offset, the first correction module is further configured to take the corrected image as the first image, to detect the region to be interpolated in the first image according to a preset rule, and to interpolate that region using pixels within a preset range around it;
- the first correction module is further configured to detect a joint between frame data in the first image, and use the joint as the region to be interpolated;
- and/or it is further configured to detect pixels in the first image whose applied offset is greater than a threshold, and use those pixels as the region to be interpolated.
- The above modules can be implemented by software or hardware; for the latter, this can be done in the following ways, though not limited to them: the above modules are all located in the same processor, or the above modules are distributed across different processors in any combination.
- a camera including:
- the distance measuring device is set to obtain the depth information of the subject in the panoramic shooting
- a processor configured to obtain the offset corresponding to the subject in the panoramic image according to a preset mapping relationship between depth and offset, and configured to correct the panoramic image of the subject according to the offset.
- a mobile terminal including:
- the distance measuring device is set to obtain the depth information of the subject in the panoramic shooting
- a processor configured to obtain the offset corresponding to the subject in the panoramic image according to a preset mapping relationship between depth and offset, and configured to correct the panoramic image of the subject according to the offset.
- the camera and the mobile terminal in the third embodiment may be devices that execute the process in FIG. 2.
- the embodiments of the present application also provide a storage medium.
- The above storage medium may be configured to store program code for performing the following steps: S1, acquiring depth information of a subject in panoramic shooting; S2, acquiring the offset corresponding to the subject in the panoramic image according to the preset mapping relationship between depth and offset; S3, correcting the panoramic image of the subject according to the offset.
- The above storage medium may include, but is not limited to, a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, an optical disc, and other media that can store program code.
- An embodiment of the present application further provides an electronic device, including a memory and a processor, where the computer program is stored in the memory, and the processor is configured to run the computer program to perform the steps in any one of the foregoing method embodiments.
- the electronic device may further include a transmission device and an input-output device, where the transmission device is connected to the processor, and the input-output device is connected to the processor.
- The above processor may be configured to perform the following steps through a computer program: S1, acquiring depth information of a subject in panoramic shooting; S2, acquiring the offset corresponding to the subject in the panoramic image according to the preset mapping relationship between depth and offset; S3, correcting the panoramic image of the subject according to the offset.
- The modules or steps of the present application described above can be implemented by a general-purpose computing device; they can be concentrated on a single computing device or distributed over a network composed of multiple computing devices. Optionally, they can be implemented with program code executable by a computing device, so that they can be stored in a storage device and executed by the computing device; in some cases, the steps shown or described can be performed in a different order than here, or they can be made into individual integrated-circuit modules, or multiple modules or steps among them can be made into a single integrated-circuit module. In this way, the present application is not limited to any specific combination of hardware and software.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Studio Devices (AREA)
- Image Processing (AREA)
Abstract
The present application provides a panoramic shooting method and device, a camera, and a mobile terminal. The method includes: during panoramic shooting, first acquiring the depth information of the subject, i.e., the distance between the subject and the photographing device, through a ranging device; then obtaining the offset corresponding to the current distance by querying a preset mapping relationship; and subsequently shifting the original image of the subject in the panoramic image accordingly to form a corrected panoramic image. This solution solves the related-art problem of severe distortion of objects in panoramic images, corrects the distortion of objects in the panoramic image accurately and efficiently, and preserves the aesthetic quality of the panoramic image.
Description
The present application relates to, but is not limited to, the field of photography, and in particular to a panoramic shooting method and device, a camera, and a mobile terminal.
In the related art, the human eye can observe scenes with a large field of view, whereas current terminal cameras have relatively small fields of view because of lens specifications, so a single exposure cannot produce a picture matching the field of view of the human eyes. Panoramic photography was introduced to solve this problem: typically the photographer stands at the center and rotates a handheld device, shooting from multiple angles and finally stitching the shots into a panoramic photograph. This lets a camera with a small field of view capture large-field-of-view scenery, but there is currently no way to solve the deformation caused by stitching panoramic photos.
The related art focuses either on controlling panoramic exposure during panoramic shooting so that the images to be stitched have consistent exposure, or on using two or more cameras to ensure that the images can be stitched reasonably. A problem remains: because the viewing angle of a panorama is very large, the image is affected by lens distortion on the one hand, and on the other hand the stitching points mostly lie at the image edges, so subjects at different distances are distorted and deformed to different degrees in the plane image, degrading the user experience. The related art offers no effective solution to this problem.
There is currently no effective solution to the related-art problem of severe distortion of objects in panoramic images.
Summary of the Invention
The embodiments of the present application provide a panoramic shooting method and device, a camera, and a mobile terminal, so as to at least solve the related-art problem of severe distortion of objects in panoramic images.
According to one embodiment of the present application, a panoramic shooting method is provided, including: acquiring depth information of a subject in panoramic shooting; acquiring the offset corresponding to the subject in the panoramic image according to a preset mapping relationship between depth and offset; and correcting the panoramic image of the subject according to the offset.
According to another embodiment of the present application, a panoramic shooting device is provided, including: a first acquiring module configured to acquire depth information of a subject in panoramic shooting; a second acquiring module configured to acquire the offset corresponding to the subject in the panoramic image according to a preset mapping relationship between depth and offset; and a first correction module configured to correct the panoramic image of the subject according to the offset.
According to another embodiment of the present application, a camera is provided, including: a ranging device configured to acquire depth information of a subject in panoramic shooting; and a processor configured to acquire the offset corresponding to the subject in the panoramic image according to a preset mapping relationship between depth and offset, and to correct the panoramic image of the subject according to the offset.
According to another embodiment of the present application, a mobile terminal is further provided, including: a ranging device configured to acquire depth information of a subject in panoramic shooting; and a processor configured to acquire the offset corresponding to the subject in the panoramic image according to a preset mapping relationship between depth and offset, and to correct the panoramic image of the subject according to the offset.
According to yet another embodiment of the present application, a storage medium is further provided, in which a computer program is stored, wherein the computer program is configured to, when run, execute the steps in any one of the above method embodiments.
According to yet another embodiment of the present application, an electronic device is further provided, including a memory and a processor, wherein the memory stores a computer program and the processor is configured to run the computer program to execute the steps in any one of the above method embodiments.
Through the above embodiments of the present application, during panoramic shooting the depth information of the subject, i.e., the distance between the subject and the photographing device, is first acquired by a ranging device; the offset corresponding to the current distance is then obtained by querying the preset mapping relationship; and the original image of the subject in the panoramic image is subsequently shifted accordingly to form a corrected panoramic image. This solution solves the related-art problem of severe distortion of objects in panoramic images, corrects the distortion of objects in the panoramic image accurately and efficiently, and preserves the aesthetic quality of the panoramic image.
The drawings described here are provided for a further understanding of the present application and form a part of it; the exemplary embodiments of the present application and their description are used to explain the present application and do not unduly limit it. In the drawings:
FIG. 1 is a block diagram of the hardware structure of a mobile terminal for a panoramic shooting method according to an embodiment of the present application;
FIG. 2 is a flowchart of a panoramic shooting method according to an embodiment of the present application;
FIG. 3 is a schematic structural diagram of a terminal according to another example of the present application;
FIG. 4 is a flowchart of correcting a panoramic image according to another embodiment of the present application;
FIG. 5 is a schematic diagram of a coordinate system according to another example of the present application;
FIG. 6 is a schematic diagram of a camera imaging plane according to another embodiment of the present application.
The present application is described in detail below with reference to the drawings and in combination with the embodiments. It should be noted that, where there is no conflict, the embodiments of the present application and the features in the embodiments can be combined with each other.
It should be noted that the terms "first", "second", and the like in the specification, claims, and drawings of the present application are used to distinguish similar objects and are not necessarily used to describe a particular order or sequence.
The embodiments in this application can be used in panoramic shooting scenarios, on related-art mobile phones, tablet computers, and cameras that support panoramic shooting.
Embodiment 1
The method embodiment provided in Embodiment 1 of the present application can be executed on a mobile terminal, a computer terminal, or a similar computing device. Taking execution on a mobile terminal as an example, FIG. 1 is a block diagram of the hardware structure of a mobile terminal for a panoramic shooting method according to an embodiment of the present application. As shown in FIG. 1, the mobile terminal may include one or more processors 102 (only one is shown in FIG. 1; the processor 102 may include, but is not limited to, a processing device such as a microprocessor (MCU) or a programmable logic device (FPGA)) and a memory 104 configured to store data. Optionally, the mobile terminal may further include a transmission device 106 configured for communication and an input/output device 108. Those of ordinary skill in the art will understand that the structure shown in FIG. 1 is merely illustrative and does not limit the structure of the mobile terminal; for example, the mobile terminal may include more or fewer components than shown in FIG. 1, or have a different configuration from that shown in FIG. 1.
The memory 104 may be configured to store software programs and modules of application software, such as program instructions/modules corresponding to the panoramic shooting method in the embodiments of the present application. By running the software programs and modules stored in the memory 104, the processor 102 executes various functional applications and data processing, i.e., implements the above method. The memory 104 may include high-speed random access memory, and may further include non-volatile memory such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory remotely located relative to the processor 102, and such remote memory may be connected to the mobile terminal through a network. Examples of the network include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
The transmission device 106 is configured to receive or send data via a network. Specific examples of the network may include a wireless network provided by the communication provider of the mobile terminal. In one example, the transmission device 106 includes a network interface controller (NIC), which can be connected to other network devices through a base station so as to communicate with the Internet. In one example, the transmission device 106 may be a radio frequency (RF) module, which is configured to communicate with the Internet wirelessly.
This embodiment provides a panoramic shooting method running on the above mobile terminal. FIG. 2 is a flowchart of the panoramic shooting method according to an embodiment of the present application. As shown in FIG. 2, the flow includes the following steps:
Step S202: acquire depth information of a subject in panoramic shooting;
Depth information here is a term from the photography field; it can be understood as the distance between the current subject and the imaging plane. Step S202 may be performed by a mobile terminal with a ranging function, or by a third-party device attached to the mobile terminal; both fall within the scope of protection of the present application.
Step S204: acquire the offset corresponding to the subject in the panoramic image according to a preset mapping relationship between depth and offset;
The offset here refers to the distortion calibration amount, i.e., the displacement of the corresponding point after calibration.
The mapping relationship can be pre-stored in the form of a table, and the corresponding offset can later be obtained by a direct table lookup. Using a table has the following advantage: when a complex fitted functional relationship would be too expensive to evaluate, or could not be evaluated in real time, the fitting result is turned into a table indexed by distance from which the calibration amount is looked up, so that during correction a table lookup plus interpolation is used, improving time efficiency.
Step S206: correct the panoramic image of the subject according to the offset.
Correction may shift the original imaging position of the subject in the panoramic image according to the offset, or, more specifically, shift the pixels in which the subject is imaged.
In the actual correction process, important subjects such as portraits can be corrected while secondary subjects such as distant sky are left uncorrected, which saves a large amount of computation.
Through the above steps, during panoramic shooting the depth information of the subject, i.e., the distance between the subject and the photographing device, is first acquired by a ranging device; the offset corresponding to the current distance is then obtained by querying the preset mapping relationship; and the original image of the subject in the panoramic image is subsequently shifted accordingly to form a corrected panoramic image. This solution solves the related-art problem of severe distortion of objects in panoramic images, corrects the distortion of objects in the panoramic image accurately and efficiently, and preserves the aesthetic quality of the panoramic image.
Optionally, the correspondence between depth and calibration amount is established by first obtaining "object-image" data from calibration charts with known distance information, then using the calculation method described in another embodiment of the present application to obtain the corresponding functional relationship or table, which serves as the preset depth-to-calibration information. The role of this preset information is that, when a picture is taken on the terminal, the terminal's ranging function first obtains the distance to the subject, and the calibration amount is obtained by combining this distance with the preset information above.
Optionally, before the offset corresponding to the subject in the panoramic image is acquired according to the preset mapping relationship between depth and offset, the mapping relationship is established as follows: perform panoramic shooting of a known scene; according to the positional relationships between different objects in the known scene, determine the ideal positions, in the captured picture, of objects at different depths; determine each object's offset according to the difference between its ideal position and its actual position in the captured picture; and establish the mapping relationship from each object's offset and its depth information. With this scheme, by photographing different known scenes multiple times, such as regular calibration charts, correspondences between different depths and offsets are collected, so that the established mapping relationship contains as many entries as possible. The offset of each object can, more specifically, be understood as the offset of each pixel in the object's image region.
Optionally, determining the ideal positions in the captured picture of objects at different depths according to the positional relationships between different objects in the known scene includes: obtaining the distance ratios between different objects in the known scene, and obtaining each object's ideal position in the captured picture according to those ratios. The distance ratios between objects in the known scene are mapped onto the captured picture; the ratios between objects in the picture should be consistent with those in the known scene. The ideal position of each object determined in this way is then compared with its original imaging position to obtain the object's offset.
Optionally, before the panoramic image of the subject is corrected according to the offset, the distortion calibration parameters of the camera module lens of the camera are acquired, and the panoramic image is calibrated according to those distortion calibration parameters. The camera is the one currently capturing the panoramic image; it may be integrated into a mobile terminal such as a mobile phone, or may be a standalone camera. With this scheme, before the process in FIG. 2 is performed, the distortion of the camera module can be calibrated so that the ideal position of the subject is recovered more accurately.
Optionally, correcting the panoramic image of the subject according to the offset includes: acquiring the original imaging position of the subject in the panoramic image, and moving the original imaging position by the offset.
Optionally, after the panoramic image of the subject is corrected according to the offset, the corrected image is taken as the first image, the region to be interpolated in the first image is detected according to a preset rule, and that region is interpolated using pixels within a preset range around it. With this scheme, the image data are processed and interpolation is applied wherever imaging points require it. The algorithm uses the correlation between a pixel to be processed and its surrounding pixels to compute the pixel's value, which addresses the limited detail and smoothness of conventional interpolation algorithms.
Optionally, detecting the region to be interpolated in the first image according to a preset rule includes at least one of the following: detecting seams between frames in the first image and treating the seams as the region to be interpolated; detecting pixels in the first image whose applied offset exceeds a threshold and treating those pixels as the region to be interpolated. With this scheme, smoothing can be concentrated where image deformation tends to occur, for example at seams between frames and at positions with large offsets after the processing of FIG. 2.
The following description is given with reference to another embodiment of the present application.
The solution of this further embodiment can be applied to panoramic shooting scenarios on devices with a ranging function. Its main purpose is to improve the quality of panoramic shooting and enhance the user experience. This further example of the present application includes the following two parts:
In the first part, depth information is used for panoramic correction. Charts or real scenes at known distances are used in advance to obtain, for each subject and shooting distance, the actual and ideal image-point coordinates in the image plane; data fitting, least squares, or similar methods then determine the mapping between the subject and its ideal image-point coordinates, thereby establishing the actual pixel position of the subject and the corresponding offset at different distances. When the terminal takes a panoramic photo, it first uses its ranging module to obtain the subject's depth/distance information, and then uses the known mapping relationship to obtain the offset by which the corresponding image point must be calibrated and to process the image data.
In the second part, image correlation is used to process the image data to be calibrated: the image data are processed, and interpolation is performed wherever imaging points require it. The algorithm uses the correlation between a pixel to be processed and its surrounding pixels to compute the pixel's value, which addresses the limited detail and smoothness of conventional interpolation algorithms.
FIG. 3 is a schematic structural diagram of a terminal according to another example of the present application. As shown in FIG. 3, the terminal includes an EEPROM for storage, a camera, a ranging module, a data processing module, and a display module. The terminal photographs the scene through the camera device, acquires the image, and processes the image data. The data processing here can be done by a data processing chip or implemented in software. Because a panoramic photo is the result of stitching and synthesizing individual frames, both the early calibration and the later correction of a panoramic photo can be decomposed into single-frame processing.
FIG. 4 is a flowchart of correcting a panoramic image according to another embodiment of the present application. As shown in FIG. 4, it includes the following steps:
S401: enter panoramic shooting;
S402: initialize the camera;
S403: shoot the panorama and compute depth information;
S404: load the depth calibration data;
S405: correct the required image data according to the current depth data;
S406: save the picture.
The following is the flow of steps for processing the panoramic image, including the following steps:
Step 1: calibrate the distortion of the camera module lens.
First, a series of charts or real scenes containing position information is used to determine the calibration relationship; the charts are not limited to checkerboards, concentric circles, dot patterns, and similar subjects. Assuming that the pixel coordinates of the actual image are (x'', y''), the calibrated pixel coordinates (x', y') are expressed by a distortion model in which k1, k2, and k3 are radial distortion calibration parameters and p1 and p2 are tangential distortion calibration parameters. The corresponding functional relationship can be fitted to the pixel data using the least-squares method or similar methods.
Step 2: determine the functional relationship between object points of the subject and their ideal image points.
Taking the imaging plane as the reference plane, the distances from the subject to this plane are denoted (Z_1, Z_2, ..., Z_n), where n is the number of distance groups at which charts or real scenes must be calibrated. The value of n can be adjusted according to the lens parameters, the computational load, and the imaging effect. In application, when the actual distance is not one of the values in the sequence, interpolation from adjacent distance values is used.
FIG. 5 is a schematic diagram of a coordinate system according to another example of the present application, and FIG. 6 is a schematic diagram of a camera imaging plane according to another embodiment of the present application. As shown in FIGS. 5 and 6, assume that at distance Z the position of a calibration point on the subject is represented in the world coordinate system by the coordinate P(X, Y, Z), or P(X, Y, Z, 1) in homogeneous coordinates. The transformation from the world coordinate system to the camera coordinate system is expressed in matrix form as:
P_c(X_c, Y_c, Z_c, 1) = P(X, Y, Z, 1) M_w
where M_w realizes combined transformations of the coordinates such as translation, rotation, and scaling; generally a_14 = b_14 = c_14 = 0 and d_14 = 1. The other parameters can be obtained by least-squares fitting of the data at the corresponding calibration points.
Because camera imaging is the process of projecting three-dimensional information onto a two-dimensional imaging plane, this step can be realized by a projective transformation, expressed as:
P_p(X_p, Y_p, Z_p, 1) = P_c(X_c, Y_c, Z_c, 1) M_p
The parameters of M_p are determined from the calibration-point data and the projection model, and the projection model is adjusted according to the panoramic calibration-point data.
The camera-plane coordinate system is expressed in meters, but when reading images captured by the camera we read pixel coordinates, so a further coordinate conversion through a matrix M_s is required. M_s is also called the camera's intrinsic matrix; after the homogeneous transformation, (x'', y'') carries the distance information and represents the actual pixel coordinates at distance Z. Merging the above matrix multiplications gives the combined transformation matrix M.
Passing (x'', y'') through the lens-distortion calibration function from Step 1 gives the calibrated image pixel coordinates (x', y'), and (x', y') is further adjusted according to the actual imaging result to obtain the final ideal pixel coordinates (x, y).
From the subject coordinates (X, Y, Z) calibrated at different distances (Z_1 ... Z_n) and the corresponding ideal pixel coordinates (x, y), a series of combined transformation matrices (M_1 ... M_n) at different distances is obtained; the corresponding transformation matrices and distortion calibration parameters are stored in the EEPROM to be loaded when a panoramic photo is taken.
Step 3: while the panoramic photo is being taken, use the terminal's ranging module to measure the depth of the subject and obtain the coordinates (X, Y, Z), load the data in the EEPROM, and compute the calibrated coordinates (x, y). If the measured distance is not within the previously calibrated distance sequence (Z_1 ... Z_n), first determine which two calibrated distances the current distance lies between, then load the calibration parameters at those two distances, compute the corresponding calibration matrix by interpolation, and finally obtain the corresponding pixel coordinates.
To reduce computation, the processing can be simplified appropriately: collect dense calibration data for nearby objects, increase the sampling interval for more distant objects, or skip calibration for distant regions where this does not affect the subjective result.
To reduce computation time, (Z_1 ... Z_n), (X, Y, Z), and the corresponding calibrated pixel coordinates at different distances can be stored directly as a table, and subsequent calibration can be carried out by table lookup.
Step 4: use image correlation to interpolate the final image. On the one hand, because the evenly or regularly distributed nodes of the calibration chart are used as the calibration data, pixels lying between the calibration points receive no processing, so interpolation is required. On the other hand, interpolation is also required when the lens distortion is large.
At distance Z, let the actual imaging point p have image coordinates (x, y) and the calibrated imaging point p' have image coordinates (x+Δx, y+Δy). When the calibrated coordinates (x+Δx, y+Δy) are not integers, interpolation is needed to obtain the pixel value at the corresponding integer position. When the offset (Δx, Δy) exceeds 1, pixels must be filled in between the actual coordinates (x, y) and the calibrated coordinates (x+Δx, y+Δy). Unlike conventional interpolation, the correlation between the pixel to be filled in and its surrounding pixels is used here to obtain the pixel value at the position to be processed.
Let p be the actual pixel, p_i the pixel at the position to be solved, and p_j the pixel values within the neighborhood of the actual pixel p, where the neighborhood is defined as v = k1 * max(Δx, Δy) with k1 ≥ 1. Then w(i, j) is the correlation weight between p_i and p_j, where Dis(i, j) = ||p_i - p_j|| is the Euclidean distance between p_i and p_j, and k2 and k3 are adjustable parameters.
Step 5: correction is complete, and the corrected panoramic image is saved.
The algorithm implemented by this further embodiment belongs to a built-in system; it has no independent user interface and is implemented mainly through the terminal's camera. When the above function is in use, the camera application interface may display a prompt such as "Panorama shooting in progress".
With the above solution, distortion caused by lens aberration or by an overly large shooting field of view can be reduced during panoramic shooting on the terminal, improving the panoramic result and enhancing the user experience.
The above solution can be used to improve the image quality of panoramic shooting and can also be applied to 360° scene video surveillance.
Through the description of the above implementations, those skilled in the art can clearly understand that the method according to the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and of course also by hardware, but in many cases the former is the better implementation. Based on this understanding, the technical solution of the present application, in essence or in the part contributing over the prior art, can be embodied in the form of a software product stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc) that includes several instructions enabling a terminal device (which may be a mobile phone, a computer, a server, a network device, or the like) to execute the methods described in the embodiments of the present application.
Embodiment 2
This embodiment further provides a panoramic shooting device, which is configured to implement the above embodiments and preferred implementations; what has already been described is not repeated. As used below, the term "module" may refer to a combination of software and/or hardware that implements predetermined functions. Although the devices described in the following embodiments are preferably implemented in software, implementation in hardware, or in a combination of software and hardware, is also possible and contemplated.
According to another example embodiment of the present application, a panoramic shooting device is provided, including:
a first acquiring module configured to acquire depth information of a subject in panoramic shooting;
a second acquiring module configured to acquire the offset corresponding to the subject in the panoramic image according to a preset mapping relationship between depth and offset;
a first correction module configured to correct the panoramic image of the subject according to the offset.
During panoramic shooting, the depth information of the subject, i.e., the distance between the subject and the photographing device, is first acquired by a ranging device; the offset corresponding to the current distance is then obtained by querying the preset mapping relationship; and the original image of the subject in the panoramic image is subsequently shifted accordingly to form a corrected panoramic image. This solution solves the related-art problem of severe distortion of objects in panoramic images, corrects the distortion of objects in the panoramic image accurately and efficiently, and preserves the aesthetic quality of the panoramic image.
Optionally, the device further includes an establishing module configured to establish the mapping relationship, before the second acquiring module acquires the offset corresponding to the subject in the panoramic image according to the preset mapping relationship between depth and offset, by means of:
a first determining unit configured to perform panoramic shooting of a known scene and, according to the positional relationships between different objects in the known scene, determine the ideal positions, in the captured picture, of objects at different depths;
a second determining unit configured to determine each object's offset according to the difference between its ideal position and its actual position in the captured picture;
an establishing unit configured to establish the mapping relationship from each object's offset and its depth information.
Optionally, the first determining unit is further configured to obtain the distance ratios between different objects in the known scene, and to obtain each object's ideal position in the captured picture according to those ratios.
Optionally, before correcting the panoramic image of the subject according to the offset, the first correction module is further configured to acquire the distortion calibration parameters of the camera module lens of the camera, and to calibrate the panoramic image according to those distortion calibration parameters.
Optionally, the first correction module is configured to acquire the original imaging position of the subject in the panoramic image, and to move the original imaging position by the offset.
Optionally, after correcting the panoramic image of the subject according to the offset, the first correction module is further configured to take the corrected image as the first image and detect the region to be interpolated in the first image according to a preset rule;
and is configured to interpolate the region to be interpolated using pixels within a preset range around it.
Optionally, the first correction module is further configured to detect seams between frames in the first image and treat the seams as the region to be interpolated;
and/or is further configured to detect pixels in the first image whose applied offset exceeds a threshold and treat those pixels as the region to be interpolated.
It should be noted that each of the above modules can be implemented by software or hardware; for the latter, this can be done in the following ways, though not limited to them: the above modules are all located in the same processor, or the above modules are distributed across different processors in any combination.
Embodiment 3
According to another embodiment of the present application, a camera is further provided, including:
a ranging device configured to acquire depth information of a subject in panoramic shooting;
a processor configured to acquire the offset corresponding to the subject in the panoramic image according to a preset mapping relationship between depth and offset, and configured to correct the panoramic image of the subject according to the offset.
According to another embodiment of the present application, a mobile terminal is further provided, including:
a ranging device configured to acquire depth information of a subject in panoramic shooting;
a processor configured to acquire the offset corresponding to the subject in the panoramic image according to a preset mapping relationship between depth and offset, and configured to correct the panoramic image of the subject according to the offset.
The camera and the mobile terminal in Embodiment 3 may be devices that execute the process in FIG. 2.
Embodiment 4
The embodiments of the present application further provide a storage medium. Optionally, in this embodiment, the storage medium may be configured to store program code for performing the following steps:
S1: acquire depth information of a subject in panoramic shooting;
S2: acquire the offset corresponding to the subject in the panoramic image according to a preset mapping relationship between depth and offset;
S3: correct the panoramic image of the subject according to the offset.
Optionally, in this embodiment, the storage medium may include, but is not limited to, a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, an optical disc, and other media that can store program code.
The embodiments of the present application further provide an electronic device, including a memory and a processor, wherein the memory stores a computer program and the processor is configured to run the computer program to execute the steps in any one of the above method embodiments.
Optionally, the electronic device may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor and the input/output device is connected to the processor.
Optionally, in this embodiment, the processor may be configured to perform the following steps through a computer program:
S1: acquire depth information of a subject in panoramic shooting;
S2: acquire the offset corresponding to the subject in the panoramic image according to a preset mapping relationship between depth and offset;
S3: correct the panoramic image of the subject according to the offset.
Optionally, for specific examples in this embodiment, reference may be made to the examples described in the above embodiments and optional implementations, which are not repeated here.
Obviously, those skilled in the art should understand that the modules or steps of the present application can be implemented by a general-purpose computing device; they can be concentrated on a single computing device or distributed over a network composed of multiple computing devices. Optionally, they can be implemented with program code executable by a computing device, so that they can be stored in a storage device and executed by the computing device; in some cases, the steps shown or described can be performed in a different order than here, or they can be made into individual integrated-circuit modules, or multiple modules or steps among them can be made into a single integrated-circuit module. In this way, the present application is not limited to any specific combination of hardware and software.
The above are only preferred embodiments of the present application and are not intended to limit it; for those skilled in the art, the present application may have various modifications and variations. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present application shall be included within its scope of protection.
Claims (18)
- A panoramic shooting method, comprising: acquiring depth information of a subject in panoramic shooting; acquiring the offset corresponding to the subject in the panoramic image according to a preset mapping relationship between depth and offset; and correcting the panoramic image of the subject according to the offset.
- The method according to claim 1, wherein, before the offset corresponding to the subject in the panoramic image is acquired according to the preset mapping relationship between depth and offset, the mapping relationship is established by: performing panoramic shooting of a known scene, and determining, according to the positional relationships between different objects in the known scene, the ideal positions in the captured picture of objects at different depths; determining each object's offset according to the difference between its ideal position and its actual position in the captured picture; and establishing the mapping relationship from each object's offset and its depth information.
- The method according to claim 2, wherein determining, according to the positional relationships between different objects in the known scene, the ideal positions in the captured picture of objects at different depths comprises: obtaining the distance ratios between different objects in the known scene; and obtaining each object's ideal position in the captured picture according to the distance ratios.
- The method according to claim 1, wherein, before the panoramic image of the subject is corrected according to the offset, the method further comprises: acquiring distortion calibration parameters of a camera module lens of a camera; and calibrating the panoramic image according to the distortion calibration parameters.
- The method according to claim 1, wherein correcting the panoramic image of the subject according to the offset comprises: acquiring the original imaging position of the subject in the panoramic image; and moving the original imaging position by the offset.
- The method according to claim 1, wherein, after the panoramic image of the subject is corrected according to the offset, the method further comprises: taking the corrected image as a first image, and detecting a region to be interpolated in the first image according to a preset rule; and interpolating the region to be interpolated using pixels within a preset range around it.
- The method according to claim 6, wherein detecting the region to be interpolated in the first image according to a preset rule comprises at least one of the following: detecting a seam between frames in the first image and treating the seam as the region to be interpolated; and detecting pixels in the first image whose applied offset exceeds a threshold and treating those pixels as the region to be interpolated.
- A panoramic shooting device, comprising: a first acquiring module configured to acquire depth information of a subject in panoramic shooting; a second acquiring module configured to acquire the offset corresponding to the subject in the panoramic image according to a preset mapping relationship between depth and offset; and a first correction module configured to correct the panoramic image of the subject according to the offset.
- The device according to claim 8, further comprising an establishing module configured to establish the mapping relationship, before the second acquiring module acquires the offset corresponding to the subject in the panoramic image according to the preset mapping relationship between depth and offset, by means of: a first determining unit configured to perform panoramic shooting of a known scene and determine, according to the positional relationships between different objects in the known scene, the ideal positions in the captured picture of objects at different depths; a second determining unit configured to determine each object's offset according to the difference between its ideal position and its actual position in the captured picture; and an establishing unit configured to establish the mapping relationship from each object's offset and its depth information.
- The device according to claim 9, wherein the first determining unit is further configured to obtain the distance ratios between different objects in the known scene, and to obtain each object's ideal position in the captured picture according to the distance ratios.
- The device according to claim 8, wherein, before correcting the panoramic image of the subject according to the offset, the first correction module is further configured to acquire distortion calibration parameters of a camera module lens of a camera, and to calibrate the panoramic image according to the distortion calibration parameters.
- The device according to claim 8, wherein the first correction module is configured to acquire the original imaging position of the subject in the panoramic image, and to move the original imaging position by the offset.
- The device according to claim 8, wherein, after correcting the panoramic image of the subject according to the offset, the first correction module is further configured to take the corrected image as a first image, to detect a region to be interpolated in the first image according to a preset rule, and to interpolate the region to be interpolated using pixels within a preset range around it.
- The device according to claim 13, wherein the first correction module is further configured to detect a seam between frames in the first image and treat the seam as the region to be interpolated; and/or is further configured to detect pixels in the first image whose applied offset exceeds a threshold and treat those pixels as the region to be interpolated.
- A camera, comprising: a ranging device configured to acquire depth information of a subject in panoramic shooting; and a processor configured to acquire the offset corresponding to the subject in the panoramic image according to a preset mapping relationship between depth and offset, and to correct the panoramic image of the subject according to the offset.
- A mobile terminal, comprising: a ranging device configured to acquire depth information of a subject in panoramic shooting; and a processor configured to acquire the offset corresponding to the subject in the panoramic image according to a preset mapping relationship between depth and offset, and to correct the panoramic image of the subject according to the offset.
- A storage medium in which a computer program is stored, wherein the computer program is configured to, when run, execute the method of any one of claims 1 to 7.
- An electronic device, comprising a memory and a processor, wherein the memory stores a computer program and the processor is configured to run the computer program to execute the method of any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/418,536 US11523056B2 (en) | 2018-12-28 | 2019-08-20 | Panoramic photographing method and device, camera and mobile terminal |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811629211.6 | 2018-12-28 | ||
CN201811629211.6A CN111385461B (zh) | 2018-12-28 | 2018-12-28 | 全景拍摄方法及装置、相机、移动终端 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020134123A1 true WO2020134123A1 (zh) | 2020-07-02 |
Family
ID=71128293
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2019/101489 WO2020134123A1 (zh) | 2018-12-28 | 2019-08-20 | 全景拍摄方法及装置、相机、移动终端 |
Country Status (3)
Country | Link |
---|---|
US (1) | US11523056B2 (zh) |
CN (1) | CN111385461B (zh) |
WO (1) | WO2020134123A1 (zh) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114913236A (zh) * | 2021-02-09 | 2022-08-16 | 深圳市汇顶科技股份有限公司 | 相机标定方法、装置及电子设备 |
CN113256512B (zh) * | 2021-04-30 | 2024-06-21 | 北京京东乾石科技有限公司 | 深度图像的补全方法、装置及巡检机器人 |
CN113920011A (zh) * | 2021-09-26 | 2022-01-11 | 安徽光阵光电科技有限公司 | 一种多目全景视频拼接方法及系统 |
CN116452481A (zh) * | 2023-04-19 | 2023-07-18 | 北京拙河科技有限公司 | 一种多角度组合拍摄方法及装置 |
CN117593225B (zh) * | 2023-11-14 | 2024-05-28 | 自行科技(武汉)有限公司 | 电子后视镜下光心偏移处理方法、系统、设备及存储介质 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101593350A (zh) * | 2008-05-30 | 2009-12-02 | 日电(中国)有限公司 | 深度自适应视频拼接的方法、装置和系统 |
US20120133639A1 (en) * | 2010-11-30 | 2012-05-31 | Microsoft Corporation | Strip panorama |
CN103035005A (zh) * | 2012-12-13 | 2013-04-10 | 广州致远电子股份有限公司 | 一种全景泊车的标定方法,及装置,一种自动标定方法 |
CN105243637A (zh) * | 2015-09-21 | 2016-01-13 | 武汉海达数云技术有限公司 | 一种基于三维激光点云进行全景影像拼接方法 |
CN105262958A (zh) * | 2015-10-15 | 2016-01-20 | 电子科技大学 | 一种虚拟视点的全景特写拼接系统及其方法 |
CN106651755A (zh) * | 2016-11-17 | 2017-05-10 | 宇龙计算机通信科技(深圳)有限公司 | 一种终端全景图像处理方法、装置及终端 |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2477793A (en) * | 2010-02-15 | 2011-08-17 | Sony Corp | A method of creating a stereoscopic image in a client device |
CN106447735A (zh) * | 2016-10-14 | 2017-02-22 | 安徽协创物联网技术有限公司 | 一种全景相机几何标定处理方法 |
US10248890B2 (en) * | 2017-04-13 | 2019-04-02 | Facebook, Inc. | Panoramic camera systems |
CN107071281A (zh) * | 2017-04-19 | 2017-08-18 | 珠海市魅族科技有限公司 | 全景拍摄方法及装置 |
CN108805801A (zh) * | 2018-05-24 | 2018-11-13 | 北京华捷艾米科技有限公司 | 一种全景图像校正方法及系统 |
- 2018-12-28: CN CN201811629211.6A patent/CN111385461B/zh active Active
- 2019-08-20: WO PCT/CN2019/101489 patent/WO2020134123A1/zh active Application Filing
- 2019-08-20: US US17/418,536 patent/US11523056B2/en active Active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101593350A (zh) * | 2008-05-30 | 2009-12-02 | 日电(中国)有限公司 | 深度自适应视频拼接的方法、装置和系统 |
US20120133639A1 (en) * | 2010-11-30 | 2012-05-31 | Microsoft Corporation | Strip panorama |
CN103035005A (zh) * | 2012-12-13 | 2013-04-10 | 广州致远电子股份有限公司 | 一种全景泊车的标定方法,及装置,一种自动标定方法 |
CN105243637A (zh) * | 2015-09-21 | 2016-01-13 | 武汉海达数云技术有限公司 | 一种基于三维激光点云进行全景影像拼接方法 |
CN105262958A (zh) * | 2015-10-15 | 2016-01-20 | 电子科技大学 | 一种虚拟视点的全景特写拼接系统及其方法 |
CN106651755A (zh) * | 2016-11-17 | 2017-05-10 | 宇龙计算机通信科技(深圳)有限公司 | 一种终端全景图像处理方法、装置及终端 |
Also Published As
Publication number | Publication date |
---|---|
US11523056B2 (en) | 2022-12-06 |
CN111385461B (zh) | 2022-08-02 |
CN111385461A (zh) | 2020-07-07 |
US20220124247A1 (en) | 2022-04-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020134123A1 (zh) | 全景拍摄方法及装置、相机、移动终端 | |
CN108600576B (zh) | 图像处理装置、方法和系统以及计算机可读记录介质 | |
JP6263623B2 (ja) | 画像生成方法及びデュアルレンズ装置 | |
US9558543B2 (en) | Image fusion method and image processing apparatus | |
CN109474780B (zh) | 一种用于图像处理的方法和装置 | |
WO2017016050A1 (zh) | 一种图像的预览方法、装置及终端 | |
WO2019232793A1 (zh) | 双摄像头标定方法、电子设备、计算机可读存储介质 | |
CN110717942A (zh) | 图像处理方法和装置、电子设备、计算机可读存储介质 | |
CN110689581A (zh) | 结构光模组标定方法、电子设备、计算机可读存储介质 | |
WO2016155110A1 (zh) | 图像透视畸变校正的方法及系统 | |
JP5846172B2 (ja) | 画像処理装置、画像処理方法、プログラムおよび撮像システム | |
TW201024908A (en) | Panoramic image auto photographing method of digital photography device | |
WO2020024576A1 (zh) | 摄像头校准方法和装置、电子设备、计算机可读存储介质 | |
CN109785390B (zh) | 一种用于图像矫正的方法和装置 | |
WO2016029465A1 (zh) | 一种图像处理方法、装置及电子设备 | |
CN111654624B (zh) | 拍摄提示方法、装置及电子设备 | |
CN109785225B (zh) | 一种用于图像矫正的方法和装置 | |
CN114640833A (zh) | 投影画面调整方法、装置、电子设备和存储介质 | |
JP6222205B2 (ja) | 画像処理装置 | |
TW201824178A (zh) | 全景即時影像處理方法 | |
JP2019053758A (ja) | 画像処理装置 | |
TW201342303A (zh) | 三維空間圖像的獲取系統及方法 | |
CN109191396B (zh) | 人像处理方法和装置、电子设备、计算机可读存储介质 | |
CN109379521B (zh) | 摄像头标定方法、装置、计算机设备和存储介质 | |
CN109584313B (zh) | 摄像头标定方法、装置、计算机设备和存储介质 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19904428; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 15.11.2021) |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 19904428; Country of ref document: EP; Kind code of ref document: A1 |