CN115147268A - Live view method, panoramic camera, and computer-readable storage medium - Google Patents

Info

Publication number
CN115147268A
CN115147268A
Authority
CN
China
Prior art keywords
longitude
latitude
panoramic camera
lens
dimensional coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110350854.2A
Other languages
Chinese (zh)
Inventor
何红烨
郭奕滨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Insta360 Innovation Technology Co Ltd
Original Assignee
Insta360 Innovation Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Insta360 Innovation Technology Co Ltd filed Critical Insta360 Innovation Technology Co Ltd
Priority to CN202110350854.2A priority Critical patent/CN115147268A/en
Priority to PCT/CN2022/083566 priority patent/WO2022206728A1/en
Publication of CN115147268A publication Critical patent/CN115147268A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038Image mosaicing, e.g. composing plane images from plane sub-images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1407General aspects irrespective of display type, e.g. determination of decimal point position, display with fixed or driving decimal point, suppression of non-significant zeros
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses a live view method for a panoramic camera, comprising the following steps: generating a longitude and latitude map mapping table according to the calibration parameters of each lens of the panoramic camera; calculating a first three-dimensional coordinate of the image displayed on the screen of the panoramic camera's touch display; calculating a second three-dimensional coordinate from the detected touch direction and angle on the touch display and the first three-dimensional coordinate; mapping the second three-dimensional coordinate to a longitude and latitude; and generating a real-time preview picture on the touch display according to the longitude and latitude and the longitude and latitude map mapping table. The longitude and latitude map mapping table comprises, for any point on a sphere centered on the panoramic camera, the abscissa, ordinate, and weight value of each corresponding lens. Compared with the prior art, the method does not need to stitch the images of the multiple lenses globally; only the image to be shown on the display screen needs to be stitched, giving good real-time performance, convenient operation, and low implementation cost.

Description

Live view method, panoramic camera, and computer-readable storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular to a live view method for a panoramic camera, a panoramic camera, and a computer-readable storage medium.
Background
The screen viewfinder of an existing panoramic camera can only frame the image directly in front of the lens. To achieve 360-degree framing, a video preview stream must be transmitted to a client such as a mobile phone, stitched into a panoramic image on the client, and only then can an arbitrary viewing angle be framed on the phone's screen.
The above-described framing approach suffers from the following two major drawbacks.
1. Real-time preview cannot be achieved.
Latency is unavoidable, because transmitting the video preview stream to the client takes a certain amount of time, and stitching the images takes considerably more.
2. The operation is inconvenient and the implementation cost is high.
In addition, the function can only be realized with an additional client, so operation is inevitably inconvenient and the implementation cost is high.
Therefore, there is a need for an improvement of the existing live view method of the panoramic camera.
Disclosure of Invention
The invention aims to provide a live view method for a panoramic camera, a panoramic camera, and a computer-readable storage medium that overcome the above drawbacks.
In a first aspect, the present invention provides a method for live view of a panoramic camera, the method including:
generating a longitude and latitude map mapping table according to the calibration parameters of each lens of the panoramic camera; calculating a first three-dimensional coordinate of an image displayed on a screen of a touch display screen of the panoramic camera; calculating a second three-dimensional coordinate according to the detected touch direction and angle of the touch display screen and the first three-dimensional coordinate; mapping the second three-dimensional coordinate into longitude and latitude; generating a real-time preview picture on the touch display screen according to the longitude and latitude, the longitude and latitude map mapping table and the acquired real-time image data of each lens; the longitude and latitude map mapping table comprises the abscissa, the ordinate and the weight value of each lens corresponding to any point on a spherical surface with the panoramic camera as the center.
In a second aspect, the invention provides a panoramic camera comprising a camera body, at least two lenses, and a touch display screen arranged on the camera body, and further comprising: a longitude and latitude map mapping table module for generating a longitude and latitude map mapping table according to the calibration parameters of each lens of the panoramic camera; a first calculation module for calculating a first three-dimensional coordinate of the image displayed on the screen of the touch display; a second calculation module for calculating a second three-dimensional coordinate from the detected touch direction and angle on the touch display and the first three-dimensional coordinate; a mapping module for mapping the second three-dimensional coordinate to a longitude and latitude; and a preview picture module for generating a real-time preview picture on the touch display according to the longitude and latitude, the longitude and latitude map mapping table, and the acquired real-time image data of each lens. The longitude and latitude map mapping table comprises, for any point on a sphere centered on the panoramic camera, the abscissa, ordinate, and weight value of each corresponding lens.
In a third aspect, the present invention provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a live view method of the above-described panoramic camera.
Compared with the prior art, the method and the device have the advantages that the images of the multiple lenses do not need to be spliced globally, only the images needing to be displayed on the display screen need to be spliced, real-time view finding at any angle on the screen of the touch display screen of the panoramic camera is realized, and the direction of screen preview display is realized through the sliding of fingers, so that the shooting or video recording of a user is better assisted, the use experience of the user is improved, and the method and the device have the advantages of good real-time performance, convenience in operation and low implementation cost.
Drawings
Fig. 1 is a flowchart of a live view method of a panoramic camera in embodiment 1 of the present invention.
Fig. 2 is a block diagram showing a configuration of a panoramic camera in embodiment 2 of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more clearly understood, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and do not limit the invention.
In order to illustrate the technical means of the present invention, the following description is given by way of specific examples.
Example 1
As shown in fig. 1, a preferred embodiment of the live view method of the panoramic camera in the present embodiment includes the following steps.
S1, generating a longitude and latitude map mapping table according to the calibration parameters of each lens of the panoramic camera.
The panoramic camera in this embodiment includes at least two lenses (e.g., fisheye or wide-angle lenses), and the fields of view of adjacent lenses partially overlap to form a 360° panoramic field of view. The longitude and latitude map mapping table is generated as follows. First, parameter calibration is performed on the real-time image data (video) captured by the panoramic camera to obtain calibration parameters, which include the horizontal angle, pitch angle, and spin (roll) angle of each lens. Then a lens longitude and latitude table is established for each lens, and the longitude and latitude map mapping table is synthesized by combining these tables with the calibration parameters, so that every point in space can be traced to the coordinates and weight of the real-time image data of at least one lens. The longitude and latitude map mapping table comprises an abscissa, an ordinate, and a weight value, where the abscissa represents the longitude and the ordinate represents the latitude on a sphere centered on the panoramic camera. Here, "centered on the panoramic camera" may mean centered on the geometric center of the camera's overall shape or on its center of gravity. In this embodiment, the geometric center of the focal points of the lenses may also serve as the center: with two lenses, the center is the midpoint of the line connecting the two focal points; with three lenses, it is the centroid of the triangle formed by the three focal points; with four lenses, it is the intersection of the diagonals of the quadrilateral formed by the four focal points.
When synthesizing the longitude and latitude map mapping table from the lens longitude and latitude tables, a point that falls inside the images of two adjacent lenses is assigned weights according to its distance from the center of each lens's image: the closer the point is to the center of a lens's image, the larger that lens's weight, and the two weights sum to 1. This effectively eliminates the stitching seam between the two lens images. For example, the weight of each lens at a point can be computed as a weighted average: if the distance from the point to the image center of the first lens is d1 and to that of the second lens is d2, the weight of the first lens is d2/(d1+d2) and the weight of the second lens is d1/(d1+d2). For a point that appears in the image of only one lens, that lens's weight at the point is set to 1, ensuring the sharpness of the full-space image. For a point that falls inside the images of three or more lenses simultaneously, the two lenses whose image centers are closest to the point are selected, and their weights are set from the point's distances to those two image centers as above.
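The weighting rule above can be sketched in a few lines of Python (the function name is illustrative, not from the patent):

```python
def blend_weights(d1, d2):
    """Stitching weights for a point seen by two adjacent lenses.

    d1, d2: distances from the point to the centers of the images
    produced by lens 1 and lens 2. The closer lens gets the larger
    weight, and the two weights sum to 1, as described in the text.
    """
    total = d1 + d2
    # Lens 1's weight grows as d1 shrinks relative to d2, and vice versa.
    return d2 / total, d1 / total
```

A point three times closer to lens 1's image center than to lens 2's thus receives weights (0.75, 0.25), so lens 1 dominates the blended color there.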
And S2, calculating a first three-dimensional coordinate of an image displayed on a screen of a touch display screen of the panoramic camera.
In the linear (rectilinear) projection mode, assuming that the scene in front of the panoramic camera is being framed, the coordinates of the image displayed on the screen of the touch display are calculated as follows.
For the screen coordinate (i, j) of any point on the screen, the first three-dimensional coordinate (X1, Y1, Z1) of its display image is calculated as: X1 = tan(PI * 0.5 - fov * PI / 180 * 0.5) * rayW * 0.5; Y1 = j - rayW * 0.5; Z1 = i - rayH * 0.5. Here, rayW represents the width of the screen, rayH the height of the screen, fov the field of view of the image displayed on the screen, and PI the circumference ratio π.
In this way, the first three-dimensional coordinates of all the points in the touch display screen of the panoramic camera can be obtained.
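Step S2 can be sketched as follows (a minimal Python rendering of the formulas above; the function name and the degree-valued `fov_deg` parameter are illustrative):

```python
import math

def first_coordinate(i, j, rayW, rayH, fov_deg):
    """Back-project screen pixel (i, j) to a point in camera space
    under the rectilinear projection of step S2.

    rayW, rayH: screen width and height in pixels;
    fov_deg: field of view of the displayed image, in degrees.
    """
    # Depth of the image plane: (rayW/2) / tan(fov/2), written here as
    # tan(pi/2 - fov/2) * rayW/2 to match the formula in the text.
    x1 = math.tan(math.pi * 0.5 - math.radians(fov_deg) * 0.5) * rayW * 0.5
    y1 = j - rayW * 0.5   # horizontal offset from screen center
    z1 = i - rayH * 0.5   # vertical offset from screen center
    return x1, y1, z1
```

For a 640x480 screen with a 90° field of view, the center pixel maps to (320, 0, 0): Y1 and Z1 vanish at the screen center, and the depth equals half the screen width because tan(45°) = 1.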
And S3, calculating a second three-dimensional coordinate according to the detected touch direction and angle of the touch display screen and the first three-dimensional coordinate.
In this step, the sliding of a finger or stylus on the screen of the touch display is detected. In this embodiment, sliding left or right on the screen swings the viewing angle about the yaw axis, and sliding up or down swings it about the pitch axis. Since sliding on the screen does not involve rotating the screen itself, there is no roll-axis swing; that is, the roll angle remains 0. Let θ denote the detected radians of left-right sliding and φ the detected radians of up-down sliding. The 3×3 rotation matrix rot can then be calculated as:
rot(0, 0) = cosf(φ) * cosf(θ);
rot(0, 1) = -sinf(θ);
rot(0, 2) = sinf(φ) * cosf(θ);
rot(1, 0) = cosf(φ) * sinf(θ);
rot(1, 1) = cosf(θ);
rot(1, 2) = sinf(φ) * sinf(θ);
rot(2, 0) = -sinf(φ);
rot(2, 1) = 0;
rot(2, 2) = cosf(φ);
for any one of the first three-dimensional coordinates (X1, Y1, Z1), the second three-dimensional coordinate (X2, Y2, Z2) of the display image is calculated by:
X2 =X1* rot(0, 0) + Y1 * rot(0, 1) +Z1 * rot(0, 2);
Y2= X1 * rot(1, 0) + Y1 * rot(1, 1) + Z1 * rot(1, 2);
Z2 =X1 * rot(2, 0) + Y1 * rot(2, 1) + Z1 * rot(2, 2);
Substituting the entries of the 3×3 rotation matrix rot yields the formulas for the second three-dimensional coordinate (X2, Y2, Z2):
X2 = X1 * cosf(φ) * cosf(θ) + Y1 * (-sinf(θ)) + Z1 * sinf(φ) * cosf(θ);
Y2 = X1 * cosf(φ) * sinf(θ) + Y1 * cosf(θ) + Z1 * sinf(φ) * sinf(θ);
Z2 = X1 * (-sinf(φ)) + Y1 * 0 + Z1 * cosf(φ).
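A minimal Python sketch of step S3, applying the expanded formulas above (the function name is illustrative; `theta` and `phi` are the swipe radians defined in the text):

```python
import math

def rotate(p, theta, phi):
    """Apply the yaw/pitch rotation of step S3 to a first coordinate p.

    theta: radians of left-right sliding (yaw);
    phi:   radians of up-down sliding (pitch).
    The roll angle stays 0, as the text notes.
    """
    x1, y1, z1 = p
    x2 = x1 * math.cos(phi) * math.cos(theta) - y1 * math.sin(theta) \
         + z1 * math.sin(phi) * math.cos(theta)
    y2 = x1 * math.cos(phi) * math.sin(theta) + y1 * math.cos(theta) \
         + z1 * math.sin(phi) * math.sin(theta)
    z2 = -x1 * math.sin(phi) + z1 * math.cos(phi)
    return x2, y2, z2
```

As a sanity check, the matrix is orthogonal, so the rotation preserves vector length, and a 90° left-right swipe carries the forward direction (1, 0, 0) to (0, 1, 0).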
and S4, mapping the second three-dimensional coordinate into longitude and latitude.
For any second three-dimensional coordinate (X2, Y2, Z2), the longitude and latitude (fi, theta) calculation method comprises the following steps:
fi=atan2f(Y2,X2);
theta=PI * 0.5-atan2f(Z2,sqrt(X2*X2+Y2*Y2));
When the calculated fi value is less than 0, 2 × PI is added to fi (i.e., fi = fi + 2 × PI) so that fi lies in the range 0 to 2PI. Here PI denotes the circumference ratio π, and atan2f(a, b) is the arctangent of a/b in radians.
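Step S4 can be sketched as follows; the wrap of negative fi values is read here as adding 2π, since that is what keeps fi in [0, 2π) (function name illustrative):

```python
import math

def to_lonlat(x2, y2, z2):
    """Map a second coordinate (X2, Y2, Z2) to (fi, theta) as in step S4:
    fi is the longitude in [0, 2*pi); theta is the angle measured down
    from the vertical axis."""
    fi = math.atan2(y2, x2)
    if fi < 0:
        fi += 2.0 * math.pi  # wrap into [0, 2*pi)
    theta = math.pi * 0.5 - math.atan2(z2, math.hypot(x2, y2))
    return fi, theta
```

For example, the point (1, 0, 0) on the equator maps to fi = 0, theta = π/2, and the "north pole" (0, 0, 1) maps to theta = 0.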
And S5, generating a real-time preview picture on the touch display screen according to the longitude and latitude, the longitude and latitude map mapping table and the acquired real-time image data of each lens.
According to the longitude and latitude obtained from the second three-dimensional coordinate in step S4, the coordinates and weight values of the real-time image data (video) are looked up in the longitude and latitude map mapping table. The corresponding color value at those coordinates, or the weight-blended color value, is then copied into the target picture to obtain the real-time preview picture, which is displayed on the screen of the touch display.
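The per-pixel lookup-and-blend of step S5 can be sketched as below. The `lookup` callable standing in for the mapping table and the record layout `(lens_index, x, y, weight)` are assumptions for illustration, not the patent's data layout:

```python
def preview_pixel(lon, lat, lookup, frames):
    """Produce one preview pixel as in step S5.

    lookup: hypothetical mapping-table access; for a (lon, lat) pair it
            returns a list of (lens_index, x, y, weight) records;
    frames: per-lens real-time images, frames[k][y][x] -> color value.
    Color values of the contributing lenses are blended by their
    weights; a point seen by a single lens contributes with weight 1.
    """
    entries = lookup(lon, lat)
    return sum(w * frames[k][y][x] for k, x, y, w in entries)
```

Because only the pixels of the on-screen view are looked up, the cost per frame is proportional to the screen size, not to a full panoramic stitch, which is the speed advantage the embodiment claims.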
In this embodiment, only the content to be displayed on the touch display of the panoramic camera needs to be stitched; global stitching of the pictures from all lenses is unnecessary. The processing is therefore fast and well suited to real-time framing on the panoramic camera.
Example 2
As shown in fig. 2, a block diagram of the panoramic camera of this embodiment, the panoramic camera includes two fisheye lenses (other numbers are possible) and a touch display screen. The two fisheye lenses are mounted on opposite sides of the panoramic camera, and their fields of view partially overlap to form a 360° panoramic field of view; the touch display screen is substantially rectangular and displays the preview image of the panoramic camera. The panoramic camera further includes: a longitude and latitude map mapping table module for generating a longitude and latitude map mapping table according to the calibration parameters of each lens of the panoramic camera; a first calculation module for calculating a first three-dimensional coordinate of the image displayed on the screen of the touch display; a second calculation module for calculating a second three-dimensional coordinate from the detected touch direction and angle on the touch display and the first three-dimensional coordinate; a mapping module for mapping the second three-dimensional coordinate to a longitude and latitude; and a preview picture module for generating a real-time preview picture on the touch display according to the longitude and latitude, the longitude and latitude map mapping table, and the acquired real-time image data of each lens. The longitude and latitude map mapping table comprises, for any point on a sphere centered on the panoramic camera, the abscissa, ordinate, and weight value of each corresponding lens.
Example 3
A computer-readable storage medium is disclosed in this embodiment, wherein a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements the live view method of the panoramic camera in embodiment 1.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing associated hardware, and the storage medium may be a computer-readable storage medium, such as a ferroelectric Memory (FRAM), a Read Only Memory (ROM), a Programmable Read Only Memory (PROM), an Erasable Programmable Read Only Memory (EPROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a flash Memory, a magnetic surface Memory, an optical disc, or a Compact Disc Read Only Memory (CD-ROM), etc.; or may be various devices including one or any combination of the above memories.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (10)

1. A method for live-viewing of a panoramic camera, comprising:
generating a longitude and latitude map mapping table according to the calibration parameters of each lens of the panoramic camera;
calculating a first three-dimensional coordinate of an image displayed on a screen of a touch display screen of the panoramic camera;
calculating a second three-dimensional coordinate according to the detected touch direction and angle of the touch display screen and the first three-dimensional coordinate;
mapping the second three-dimensional coordinate into longitude and latitude;
generating a real-time preview picture on the touch display screen according to the longitude and latitude, the longitude and latitude map mapping table and the acquired real-time image data of each lens;
the longitude and latitude map mapping table comprises the abscissa, the ordinate and the weight value of each lens corresponding to any point on a spherical surface with the panoramic camera as the center.
2. The method of claim 1, wherein the generating the longitude and latitude map table according to the calibration parameters of each lens of the panoramic camera comprises: the method comprises the steps of calibrating parameters of real-time image data shot by a panoramic camera to obtain calibration parameters, establishing a corresponding lens longitude and latitude table for each lens, and combining the lens calibration parameters and the lens longitude and latitude tables to generate a longitude and latitude map mapping table.
3. The live view method of the panoramic camera according to claim 1, wherein the generating of the longitude and latitude map mapping table by combining the lens calibration parameters and the lens longitude and latitude table specifically comprises: for a point located simultaneously in the images generated by two adjacent lenses, setting a weight according to its distance from the center positions of the images generated by the two lenses.
4. The method for live-viewing of a panoramic camera according to claim 1, wherein the calculating of the first three-dimensional coordinates of the image displayed on the screen of the touch display of the panoramic camera is: for a screen coordinate (i, j) of any point in the screen, a method for calculating a first three-dimensional coordinate (X1, Y1, Z1) of a display image thereof is:
X1=tan(PI * 0.5 - fov * PI / 180 * 0.5) * rayW*0.5;
Y1=(j- rayW*0.5);
Z1=(i- rayH*0.5);
here, rayW represents the width of the screen, rayH represents the height of the screen, fov represents the angle of view of the image displayed on the screen, and PI represents the circumferential ratio PI.
5. The live view method of a panoramic camera according to claim 2, wherein the calculation method of calculating the second three-dimensional coordinate from the detected touch direction and angle of the touch screen and the first three-dimensional coordinate is: for any one of the first three-dimensional coordinates (X1, Y1, Z1), the second three-dimensional coordinate (X2, Y2, Z2) of the display image is calculated by:
X2 = X1 * cosf(φ) * cosf(θ) + Y1 * (-sinf(θ)) + Z1 * sinf(φ) * cosf(θ);
Y2 = X1 * cosf(φ) * sinf(θ) + Y1 * cosf(θ) + Z1 * sinf(φ) * sinf(θ);
Z2 = X1 * (-sinf(φ)) + Y1 * 0 + Z1 * cosf(φ);
where θ is the radian of the detected screen sliding left and right, and φ is the radian of the detected screen sliding up and down.
6. The method of claim 1, wherein the mapping the second three-dimensional coordinates to longitude and latitude is specifically: for any second three-dimensional coordinate (X2, Y2, Z2), the longitude and latitude (fi, theta) calculation method comprises the following steps:
fi=atan2f(Y2,X2);
theta=PI * 0.5-atan2f(Z2,sqrt(X2*X2+Y2*Y2));
where PI denotes the circumference ratio π, and atan2f(a, b) is the arctangent of a/b in radians.
7. The method of claim 1, wherein the generating a real-time preview screen on the touch display screen according to the longitude and latitude, the longitude and latitude map table, and the acquired real-time image data of each lens comprises:
searching each coordinate of the real-time image data corresponding to each lens of the longitude and latitude map according to the obtained longitude and latitude of the second three-dimensional coordinate;
and copying the corresponding color value or the color value after weight calculation in each coordinate to a target graph to obtain a real-time preview picture.
8. A panoramic camera, comprising at least two lenses and a touch display screen, characterized by further comprising:
the longitude and latitude map mapping table module is used for generating a longitude and latitude map mapping table according to the calibration parameters of each lens of the panoramic camera;
the first calculation module is used for calculating a first three-dimensional coordinate of an image displayed on a screen of a touch display screen of the panoramic camera;
the second calculation module is used for calculating a second three-dimensional coordinate according to the detected touch direction and angle of the touch display screen and the first three-dimensional coordinate;
the mapping module is used for mapping the second three-dimensional coordinate into longitude and latitude;
the preview picture module is used for generating a real-time preview picture on the touch display screen according to the longitude and latitude, the longitude and latitude map mapping table and the acquired real-time image data of each lens;
the longitude and latitude map mapping table comprises an abscissa, an ordinate and a weight value.
9. The panoramic camera of claim 8, further comprising:
and the weighting calculation module is used for performing weighted average calculation according to the corresponding specific coordinates of each real-time image data and the distance from the specific coordinates to the center of each real-time image when the abscissa and the ordinate correspond to the specific coordinates of the real-time image data of the plurality of lenses so as to obtain the coordinate values.
10. A computer-readable storage medium, having stored thereon a computer program which, when executed by a processor, implements a live view method of a panoramic camera of any one of claims 1 to 7.
CN202110350854.2A 2021-03-31 2021-03-31 Live view method, panoramic camera, and computer-readable storage medium Pending CN115147268A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110350854.2A CN115147268A (en) 2021-03-31 2021-03-31 Live view method, panoramic camera, and computer-readable storage medium
PCT/CN2022/083566 WO2022206728A1 (en) 2021-03-31 2022-03-29 Real-time framing method, panoramic camera, and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110350854.2A CN115147268A (en) 2021-03-31 2021-03-31 Live view method, panoramic camera, and computer-readable storage medium

Publications (1)

Publication Number Publication Date
CN115147268A (en) 2022-10-04

Family

ID=83404463

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110350854.2A Pending CN115147268A (en) 2021-03-31 2021-03-31 Live view method, panoramic camera, and computer-readable storage medium

Country Status (2)

Country Link
CN (1) CN115147268A (en)
WO (1) WO2022206728A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116503492B (en) * 2023-06-27 2024-06-14 北京鉴智机器人科技有限公司 Binocular camera module calibration method and calibration device in automatic driving system

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4825980B2 (en) * 2007-03-06 2011-11-30 国立大学法人岩手大学 Calibration method for fisheye camera.
KR101383997B1 (en) * 2013-03-08 2014-04-10 홍익대학교 산학협력단 Real-time video merging method and system, visual surveillance system and virtual visual tour system using the real-time video merging
US9940697B2 (en) * 2016-04-15 2018-04-10 Gopro, Inc. Systems and methods for combined pipeline processing of panoramic images
CN107071268A (en) * 2017-01-20 2017-08-18 深圳市圆周率软件科技有限责任公司 A kind of many mesh panorama camera panorama mosaic methods and system
CN107948662A (en) * 2017-12-04 2018-04-20 深圳岚锋创视网络科技有限公司 The method, apparatus and panorama camera of live preview panorama during a kind of shooting

Also Published As

Publication number Publication date
WO2022206728A1 (en) 2022-10-06

Similar Documents

Publication Publication Date Title
US10645284B2 (en) Image processing device, image processing method, and recording medium storing program
WO2021227359A1 (en) Unmanned aerial vehicle-based projection method and apparatus, device, and storage medium
US20210374994A1 (en) Gaze point calculation method, apparatus and device
CN106331527B (en) A kind of image split-joint method and device
CN107133918B (en) Method for generating panorama at any position in three-dimensional scene
CN104778656B (en) Fisheye image correcting method based on spherical perspective projection
WO2017080280A1 (en) Depth image composition method and apparatus
US9467620B2 (en) Synthetic camera lenses
CN111737518A (en) Image display method and device based on three-dimensional scene model and electronic equipment
WO2013069555A1 (en) Image processing device, method, and program
JP2007257100A (en) Method for creating panoramic image
CN109685721B (en) Panoramic picture splicing method, device, terminal and corresponding storage medium
CN114286066A (en) Projection correction method, projection correction device, storage medium and projection equipment
US20220405968A1 (en) Method, apparatus and system for image processing
US20090059018A1 (en) Navigation assisted mosaic photography
WO2022206728A1 (en) Real-time framing method, panoramic camera, and computer readable storage medium
CN112529006B (en) Panoramic picture detection method, device, terminal and storage medium
GB2557212A (en) Methods and apparatuses for determining positions of multi-directional image capture apparatuses
CN115100026B (en) Label coordinate conversion method, device, equipment and storage medium based on target object
TWM594322U (en) Camera configuration system with omnidirectional stereo vision
WO2018150086A2 (en) Methods and apparatuses for determining positions of multi-directional image capture apparatuses
WO2021149509A1 (en) Imaging device, imaging method, and program
WO2022036512A1 (en) Data processing method and device, terminal, and storage medium
JP7439398B2 (en) Information processing equipment, programs and information processing systems
CN113674356A (en) Camera screening method and related device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination