CN111325798A - Camera model correction method and device, AR implementation equipment and readable storage medium - Google Patents


Info

Publication number
CN111325798A
CN111325798A
Authority
CN
China
Prior art keywords
shooting
target
real
camera model
parameter
Prior art date
Legal status
Granted
Application number
CN201811524847.4A
Other languages
Chinese (zh)
Other versions
CN111325798B (en)
Inventor
张鹏国
许红锦
周人弈
Current Assignee
Zhejiang Uniview Technologies Co Ltd
Original Assignee
Zhejiang Uniview Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Uniview Technologies Co Ltd filed Critical Zhejiang Uniview Technologies Co Ltd
Priority to CN201811524847.4A
Publication of CN111325798A
Application granted
Publication of CN111325798B
Legal status: Active


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/006: Mixed reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Studio Devices (AREA)

Abstract

The application provides a camera model correction method and device, an AR implementation device, and a readable storage medium. The method is applied to an AR implementation device connected to a real camera, where the AR implementation device stores a virtual camera model whose shooting operation is synchronized with that of the real camera. The method comprises the following steps: detecting the current imaging magnification value of the real camera in real time; acquiring a target correction shooting parameter matched with the current imaging magnification value of the real camera according to a stored parameter correction curve between the imaging magnification of the real camera and the correction shooting parameters of the virtual camera model; and adjusting the shooting parameter value currently used by the virtual camera model according to the target correction shooting parameter, so that the virtual camera model matches the real camera in real time. This reduces the fusion deviation between the virtual camera model and the real camera, increases AR fusion accuracy, and improves the AR effect.

Description

Camera model correction method and device, AR implementation equipment and readable storage medium
Technical Field
The present application relates to the technical field of AR (Augmented Reality) implementation, and in particular to a camera model correction method and device, an AR implementation device, and a readable storage medium.
Background
With the continuous development of science and technology, the application fields of AR technology are becoming wider, and video monitoring is an important one of them. At present, an AR effect is usually realized by extracting panoramic features from an image acquired by a real camera and matching the extracted panoramic features to a virtual scene constructed by a virtual camera model. Existing AR implementation methods standardize the virtual camera model as much as possible to obtain a better AR effect through that standardization, but they do not account for the fusion deviation between the virtual camera model and the real camera caused by the physical characteristics of the real camera (for example, differences in lens shooting accuracy). As a result, the AR overlay on the final real-time video deviates from the scene, and the AR effect is poor.
Disclosure of Invention
In order to overcome the above defects in the prior art, an object of the present application is to provide a camera model correction method, a camera model correction device, an AR implementation device, and a readable storage medium, where the camera model correction method can correct a virtual camera model according to the current physical characteristics of a real camera, so that the virtual camera model is matched with the real camera in real time, thereby increasing the accuracy of AR fusion and improving the AR effect.
In one aspect, an embodiment of the present application provides a camera model correction method. The method is applied to an Augmented Reality (AR) implementation device connected to a real camera used for capturing images, and a virtual camera model whose shooting operation is synchronized with that of the real camera is stored in the AR implementation device. The method includes:
detecting the current imaging magnification numerical value of the real camera in real time;
acquiring target correction shooting parameters matched with the current imaging magnification value of the real camera according to a stored parameter correction curve between the imaging magnification of the real camera and the correction shooting parameters of the virtual camera model;
and adjusting the current shooting parameter value used by the virtual camera model according to the target correction shooting parameter so as to enable the virtual camera model to be correspondingly matched with the real camera.
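The three steps above can be sketched in code. This is a minimal illustration, assuming the parameter correction curve is stored as (magnification, corrected parameter) pairs and interpolated linearly between them; all names (`lookup_correction`, `correct_virtual_camera`, the `"shooting_parameter"` key) are hypothetical and not from the patent.

```python
# Illustrative sketch of the claimed correction steps; the curve representation
# and function names are assumptions, not taken from the patent.

def lookup_correction(curve, magnification):
    """Interpolate the corrected shooting parameter for a magnification value."""
    pts = sorted(curve)  # (magnification, parameter) pairs, sorted by magnification
    if magnification <= pts[0][0]:
        return pts[0][1]
    if magnification >= pts[-1][0]:
        return pts[-1][1]
    for (m0, p0), (m1, p1) in zip(pts, pts[1:]):
        if m0 <= magnification <= m1:
            t = (magnification - m0) / (m1 - m0)
            return p0 + t * (p1 - p0)

def correct_virtual_camera(virtual_params, curve, magnification):
    """Adjust the virtual camera's current shooting parameter to the curve value."""
    virtual_params["shooting_parameter"] = lookup_correction(curve, magnification)
    return virtual_params
```

In use, the detection step supplies `magnification`, the lookup step walks the stored curve, and the adjustment step overwrites the model's current parameter value.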
Optionally, in an embodiment of the present application, the method further includes:
testing correction shooting parameters corresponding to the virtual camera model under different imaging magnifications of the real camera;
and drawing and forming the parameter correction curve based on the corrected shooting parameters obtained by the test.
Optionally, in this embodiment of the application, the step of testing the corrected shooting parameters of the virtual camera model corresponding to different imaging magnifications of the real camera includes:
after the imaging magnification of the real camera is adjusted each time, acquiring a first shooting position of a target object under the current imaging magnification in a screen coordinate system corresponding to the real camera;
controlling the real camera and the virtual camera model to perform rotary shooting according to the same shooting operation, so that the target object is moved from its corresponding first shooting position to a target first shooting position in the screen coordinate system;
when the target real object is located at the target first shooting position in the screen coordinate system, acquiring a second shooting position, corresponding to the target real object, in a world coordinate system corresponding to the virtual camera model under the current shooting parameters of the virtual camera model;
obtaining a corresponding target second shooting position of the second shooting position in the screen coordinate system according to a conversion relation between the world coordinate and the screen coordinate;
comparing the position of the target first shooting position with the position of the target second shooting position under the current shooting parameters, and adjusting the shooting parameters currently used by the virtual camera model when the positions are not coincident until the target second shooting position under the adjusted shooting parameters is coincident with the target first shooting position;
and taking the shooting parameters used when the target second shooting position coincides with the target first shooting position as the correction shooting parameters corresponding to the current imaging magnification.
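The compare-and-adjust loop above can be realized as a monotone search over the shooting parameter. The sketch below is an assumption about one possible implementation: it supposes the projected target second position moves monotonically with the parameter, and `render_position` is a hypothetical stand-in for projecting the target real object into screen coordinates under a candidate parameter.

```python
# Hypothetical bisection-style search for the shooting parameter at which the
# target second shooting position coincides with the target first one.
# Positions are simplified to one screen axis for illustration.

def calibrate_parameter(target_first_pos, render_position, lo, hi, tol=1e-6):
    """Search [lo, hi] for the parameter whose projected position matches
    target_first_pos; render_position must be increasing in the parameter."""
    for _ in range(200):
        mid = (lo + hi) / 2.0
        pos = render_position(mid)
        if abs(pos - target_first_pos) <= tol:
            return mid  # positions coincide: this is the corrected parameter
        if pos < target_first_pos:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0
```

Repeating this once per imaging magnification yields the (magnification, corrected parameter) data from which the curve is drawn.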
Optionally, in this embodiment of the application, the step of drawing the parameter correction curve based on the corrected shooting parameters obtained by the test includes:
sequencing all the tested corrected shooting parameters according to the imaging magnification value corresponding to each corrected shooting parameter to obtain a plurality of groups of sequenced parameter correction data, wherein each group of parameter correction data comprises the corresponding imaging magnification value and the corrected shooting parameter;
and drawing linear change curves between two adjacent groups of parameter correction data in the sorted multiple groups of parameter correction data, and performing curve adjustment on the drawn linear change curves to obtain the parameter correction curves.
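The sort-then-connect procedure above can be sketched as follows. The segment representation (a dict per interval with slope and intercept) is an assumption for illustration; the patent only requires linear segments between adjacent sorted data groups, optionally followed by manual curve adjustment.

```python
# Illustrative construction of the parameter correction curve from test data:
# sort (magnification, corrected parameter) pairs, then fit a linear segment
# between each pair of adjacent groups.

def build_correction_curve(samples):
    """samples: iterable of (magnification, corrected_parameter) pairs."""
    pts = sorted(samples, key=lambda mp: mp[0])
    segments = []
    for (m0, p0), (m1, p1) in zip(pts, pts[1:]):
        slope = (p1 - p0) / (m1 - m0)
        segments.append({
            "range": (m0, m1),          # magnification interval this segment covers
            "slope": slope,
            "intercept": p0 - slope * m0,
        })
    return segments
```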
In another aspect, an embodiment of the present application provides a camera model correction apparatus. The apparatus is applied to an AR implementation device connected to a real camera used for capturing images, and a virtual camera model whose shooting operation is synchronized with that of the real camera is stored in the AR implementation device. The apparatus includes:
the magnification detection module is used for detecting the current imaging magnification numerical value of the real camera in real time;
the parameter acquisition module is used for acquiring target correction shooting parameters matched with the current imaging magnification value of the real camera according to a stored parameter correction curve between the imaging magnification of the real camera and the correction shooting parameters of the virtual camera model;
and the parameter adjusting module is used for adjusting the shooting parameter value currently used by the virtual camera model according to the target correction shooting parameter so as to enable the virtual camera model to be correspondingly matched with the real camera.
Optionally, in an embodiment of the present application, the apparatus further includes:
the parameter testing module is used for testing the corrected shooting parameters corresponding to the virtual camera model under different imaging magnifications of the real camera;
and the curve drawing module is used for drawing and forming the parameter correction curve based on the corrected shooting parameters obtained by the test.
Optionally, in an embodiment of the present application, the parameter testing module is specifically configured to:
after the imaging magnification of the real camera is adjusted each time, acquiring a first shooting position of a target object under the current imaging magnification in a screen coordinate system corresponding to the real camera;
controlling the real camera and the virtual camera model to perform rotary shooting according to the same shooting operation, so that the target object is moved from its corresponding first shooting position to a target first shooting position in the screen coordinate system;
when the target real object is located at the target first shooting position in the screen coordinate system, acquiring a second shooting position, corresponding to the target real object, in a world coordinate system corresponding to the virtual camera model under the current shooting parameters of the virtual camera model;
obtaining a corresponding target second shooting position of the second shooting position in the screen coordinate system according to a conversion relation between the world coordinate and the screen coordinate;
comparing the position of the target first shooting position with the position of the target second shooting position under the current shooting parameters, and adjusting the shooting parameters currently used by the virtual camera model when the positions are not coincident until the target second shooting position under the adjusted shooting parameters is coincident with the target first shooting position;
and taking the shooting parameters used when the target second shooting position coincides with the target first shooting position as the correction shooting parameters corresponding to the current imaging magnification.
Optionally, in this embodiment of the application, the curve drawing module is specifically configured to:
sequencing all the tested corrected shooting parameters according to the imaging magnification value corresponding to each corrected shooting parameter to obtain a plurality of groups of sequenced parameter correction data, wherein each group of parameter correction data comprises the corresponding imaging magnification value and the corrected shooting parameter;
and drawing linear change curves between two adjacent groups of parameter correction data in the sorted multiple groups of parameter correction data, and performing curve adjustment on the drawn linear change curves to obtain the parameter correction curves.
In another aspect, an embodiment of the present application further provides an AR implementation device, which includes a processor and a non-volatile memory storing computer instructions. When the computer instructions are executed by the processor, the AR implementation device performs any one of the above camera model correction methods. The AR implementation device is connected to a real camera used for capturing images, and stores a virtual camera model whose shooting operation is synchronized with that of the real camera.
As for the storage medium, an embodiment of the present application further provides a readable storage medium containing a computer program. When run, the computer program controls the AR implementation device on which the readable storage medium resides to perform any one of the above camera model correction methods.
Compared with the prior art, the camera model correction method and device, the AR implementation device, and the readable storage medium provided by the embodiments of the present application have the following beneficial effects: the camera model correction method can correct the virtual camera model according to the current physical characteristics of the real camera, so that the virtual camera model matches the real camera in real time, increasing AR fusion accuracy and improving the AR effect. First, the method detects the current imaging magnification value of the real camera in real time. Then, according to a stored parameter correction curve between the imaging magnification of the real camera and the correction shooting parameters of the virtual camera model, it acquires the target correction shooting parameter matched with the current imaging magnification value of the real camera. Finally, it adjusts the shooting parameter value currently used by the virtual camera model according to the target correction shooting parameter, so that the virtual camera model matches the real camera in real time; the fusion deviation between the virtual camera model and the real camera is reduced and AR fusion precision is increased, thereby improving the corresponding AR effect.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered as limiting the scope of the claims; for those skilled in the art, other related drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a block diagram of an AR implementation apparatus according to an embodiment of the present disclosure.
Fig. 2 is a schematic flowchart of a first method for correcting a camera model according to an embodiment of the present disclosure.
Fig. 3 is a second flowchart of a camera model correction method according to an embodiment of the present disclosure.
Fig. 4 is a flowchart illustrating sub-steps included in step S210 shown in fig. 3.
Fig. 5 is a flowchart illustrating sub-steps included in step S220 shown in fig. 3.
Fig. 6 is a first block diagram of a camera model correction apparatus according to an embodiment of the present disclosure.
Fig. 7 is a second block diagram of a camera model correction apparatus according to an embodiment of the present disclosure.
Icon: 10-AR implementing equipment; 11-a memory; 12-a processor; 13-a communication unit; 100-camera model correction means; 130-magnification detection module; 140-parameter acquisition module; 150-parameter adjustment module; 110-a parameter testing module; 120-curve drawing module.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
In the description of the present application, it is noted that the terms "first", "second", "third", and the like are used merely for distinguishing between descriptions and are not intended to indicate or imply relative importance.
In the description of the present application, it is further noted that, unless expressly stated or limited otherwise, the terms "disposed," "mounted," "connected," and "coupled" are to be construed broadly: for example, a connection may be fixed, removable, or integral; it may be mechanical or electrical; and it may be direct, indirect through an intermediate medium, or internal communication between two elements. The specific meanings of the above terms in the present application can be understood by those of ordinary skill in the art on a case-by-case basis.
Some embodiments of the present application will be described in detail below with reference to the accompanying drawings. The embodiments described below and the features of the embodiments can be combined with each other without conflict.
Fig. 1 is a block diagram of an AR implementation apparatus 10 according to an embodiment of the present disclosure. In this embodiment of the present application, the AR implementation device 10 is in communication connection with a real camera for shooting an image through a wired network or a wireless network, a virtual camera model whose shooting operation is synchronous with that of the real camera is stored in the AR implementation device 10, and the AR implementation device 10 implements an AR effect through cooperation between the virtual camera model and the real camera. When the real camera rotates to shoot, the virtual camera model also rotates to shoot, and the rotation direction and the rotation amplitude of the virtual camera model are respectively the same as those of the real camera, namely the shooting operation of the virtual camera model is synchronous with that of the real camera. In this embodiment, the AR implementing device 10 may correct the virtual camera model according to the physical characteristics, such as imaging magnification, currently exhibited by the real camera, so that the virtual camera model is matched with the real camera in real time, the fusion deviation between the virtual camera model and the real camera is reduced, and the AR fusion accuracy is increased, thereby improving the corresponding AR effect. The AR implementation device 10 may be, but is not limited to, a smart phone, a Personal Computer (PC), a tablet PC, a Personal Digital Assistant (PDA), a Mobile Internet Device (MID), and the like.
In the present embodiment, the AR implementing apparatus 10 includes a camera model correcting device 100, a memory 11, a processor 12, and a communication unit 13. The memory 11, the processor 12 and the communication unit 13 are electrically connected to each other directly or indirectly to realize data transmission or interaction. For example, the memory 11, the processor 12 and the communication unit 13 may be electrically connected to each other through one or more communication buses or signal lines.
In this embodiment, the memory 11 is a non-volatile memory. The memory 11 may be configured to store the virtual camera model whose shooting operation is synchronized with that of the real camera, and may further store a parameter correction curve between the imaging magnification of the real camera and the corrected shooting parameters of the virtual camera model. The parameter correction curve represents the adaptation rule by which the shooting parameter currently used by the virtual camera model changes as the imaging magnification of the real camera changes, ensuring that the virtual camera model can be matched with the real camera in real time based on the curve; this reduces the fusion deviation between the virtual camera model and the real camera, increases AR fusion accuracy, and improves the corresponding AR effect. The shooting parameter may be the shooting focal length of the virtual camera model, or may be the shooting angle of view of the virtual camera model. In this embodiment, the memory 11 may also be used to store a program, and the processor 12 executes the program after receiving an execution instruction.
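The patent leaves the choice between shooting focal length and shooting angle of view open. For orientation, the two are related by the standard pinhole-camera formula; this relation is general background, not taken from the patent.

```python
import math

# Standard pinhole-camera relation between focal length and field of view:
# for a sensor of width w and focal length f, horizontal FOV = 2 * atan(w / (2f)).

def fov_from_focal_length(sensor_width_mm, focal_length_mm):
    """Horizontal field of view (degrees) of an ideal pinhole camera."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))
```

For example, a 36 mm wide sensor with an 18 mm focal length gives a 90 degree horizontal field of view, so a correction curve expressed in one parameter can be converted to the other.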
In this embodiment, the processor 12 may be an integrated circuit chip having signal processing capabilities. The Processor 12 may be a general-purpose Processor including a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a Network Processor (NP), and the like. The general purpose processor may be a microprocessor or the processor may be any conventional processor or the like that implements or executes the methods, steps and logic blocks disclosed in the embodiments of the present application.
In this embodiment, the communication unit 13 is configured to establish a communication connection between the AR implementation device 10 and another electronic device through a network, and send and receive data through the network.
In this embodiment, the camera model correcting apparatus 100 includes at least one software functional module that can be stored in the memory 11 in the form of software or firmware or solidified in the operating system of the AR implementing device 10. The processor 12 may be used to execute executable modules stored by the memory 11, such as software functional modules and computer programs included in the camera model correction device 100. The AR implementing apparatus 10 corrects the virtual camera model in real time through the camera model correcting device 100, so that the virtual camera model is matched with the real camera in real time, reduces a fusion deviation between the virtual camera model and the real camera, increases AR fusion accuracy, and thus improves a corresponding AR effect.
It will be appreciated that the block diagram shown in fig. 1 is merely a schematic diagram of one configuration of an AR implementation device 10, and that the AR implementation device 10 may include more or fewer components than shown in fig. 1, or have a different configuration than shown in fig. 1. The components shown in fig. 1 may be implemented in hardware, software, or a combination thereof.
Fig. 2 is a schematic flow chart illustrating a first method for correcting a camera model according to an embodiment of the present disclosure. In this embodiment of the present application, the camera model correction method is applied to an AR implementation device 10, the AR implementation device 10 is connected to a real camera for shooting an image, and a virtual camera model whose shooting operation is synchronized with that of the real camera is stored in the AR implementation device 10. The specific flow and steps of the camera model correction method shown in fig. 2 are explained in detail below.
And step S230, detecting the current imaging magnification numerical value of the real camera in real time.
In this embodiment, the AR implementing device 10 may detect and acquire the current imaging magnification value of the real camera in real time by sending a magnification acquisition request for acquiring the imaging magnification value currently used by the real camera to the real camera in real time, so as to determine whether the imaging magnification value of the real camera changes.
Step S240, obtaining a target correction shooting parameter matched with the current imaging magnification value of the real camera according to a stored parameter correction curve between the imaging magnification of the real camera and the correction shooting parameter of the virtual camera model.
In this embodiment, after the AR implementation apparatus 10 obtains the current imaging magnification value of the real camera, it searches the stored parameter correction curve between the imaging magnification of the real camera and the correction shooting parameters of the virtual camera model for the correction shooting parameter matching that value, and takes it as the target correction shooting parameter.
And step S250, adjusting the shooting parameter value currently used by the virtual camera model according to the target correction shooting parameter so as to enable the virtual camera model to be correspondingly matched with the real camera.
In this embodiment, when the imaging magnification value of the real camera changes and the AR implementation apparatus 10 acquires, from the parameter correction curve, the correction shooting parameter matching the changed imaging magnification value, the AR implementation apparatus 10 corrects the shooting parameter currently used by the virtual camera model by adjusting its value to the value of the target correction shooting parameter. The corrected virtual camera model then matches the real camera in real time, so the deviation between the position coordinates of an object shot by the real camera in the real camera's screen coordinate system and the position coordinates of the same object in the world coordinate system of the corrected virtual camera model becomes smaller. The overall fusion deviation between the virtual camera model and the real camera is thus reduced and AR fusion accuracy is increased, so that after the display page of the AR implementation device 10 is refreshed according to the corrected shooting parameters and the current imaging magnification value, a display picture with a better AR effect is obtained.
Fig. 3 is a schematic flow chart of a camera model correction method according to an embodiment of the present disclosure. In this embodiment of the application, before the step S230, the camera model correction method may further include a step S210 and a step S220.
Step S210, testing the corresponding corrected shooting parameters of the virtual camera model under different imaging magnifications of the real camera.
Optionally, please refer to fig. 4, which is a flowchart illustrating the sub-steps included in step S210 shown in fig. 3. In this embodiment, the step S210 may include substeps S211 to substep S216.
And a substep S211, after adjusting the imaging magnification of the real camera each time, acquiring a first shooting position of the target object under the current imaging magnification in a screen coordinate system corresponding to the real camera.
In this embodiment, the AR implementing device 10 may create an image recognition area at the center of the screen of the real camera, so as to determine an image of a certain target object through the image recognition area, and accordingly obtain the image feature of the target object and the position coordinate of the corresponding first shooting position of the target object in the screen coordinate system corresponding to the real camera. After the AR implementation device 10 adjusts the current imaging magnification of the real camera according to different imaging magnification values each time, a target object of which the current image is located in the image recognition area is determined based on the image recognition area, and a first shooting position of the target object in a screen coordinate system corresponding to the real camera under the current imaging magnification is obtained.
And a substep S212, controlling the real camera and the virtual camera model to perform rotary shooting according to the same shooting operation, so that the target real object is moved from its corresponding first shooting position to a target first shooting position in the screen coordinate system.
In this embodiment, the first target shooting position may be located at an edge of a screen of the real camera, or may be located at an upper left corner of the screen of the real camera, and the specific position information may be configured differently according to requirements.
And a substep S213, when the target real object is located at the target first shooting position in the screen coordinate system, obtaining a corresponding second shooting position of the target real object in the world coordinate system corresponding to the virtual camera model under the shooting parameters currently used by the virtual camera model.
In this embodiment, when the real camera and the virtual camera model perform rotational shooting according to the same shooting operation, the position of the target real object in the world coordinate system corresponding to the virtual camera model necessarily changes under the shooting parameters currently used by the virtual camera model. When the target real object reaches the target first shooting position in the screen coordinate system and stops moving, its position in the world coordinate system under the current shooting parameters of the virtual camera model likewise stops changing; at that moment, its position in the world coordinate system is the second shooting position.
Sub-step S214: obtain the target second shooting position corresponding to the second shooting position in the screen coordinate system according to the conversion relationship between world coordinates and screen coordinates.
In this embodiment, the AR implementation device 10 obtains the target second shooting position corresponding to the second shooting position in the screen coordinate system by performing coordinate conversion on the second shooting position according to the conversion relationship between world coordinates and screen coordinates. Specifically, if the screen coordinate (x0, y0) corresponds to the origin (0, 0) of the world coordinate system, then a first target point (X1, Y1) in the screen coordinate system converts to (X1 - x0, Y1 - y0) in the world coordinate system, and a second target point (X2, Y2) in the world coordinate system converts to (X2 + x0, Y2 + y0) in the screen coordinate system. In one implementation of this embodiment, the AR implementation device 10 makes the screen center coordinates of the real camera in the screen coordinate system correspond to the origin coordinates of the virtual camera model in the world coordinate system.
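The offset-based conversion described above can be sketched as follows; the function names and the example screen size are illustrative assumptions, not part of the patent.

```python
# Sketch of the screen/world coordinate conversion: the world origin (0, 0)
# is assumed to map to the screen point (x0, y0), so conversion is a pure
# translation. All names here are illustrative.

def screen_to_world(sx, sy, x0, y0):
    """Convert a screen coordinate to the world coordinate system."""
    return sx - x0, sy - y0

def world_to_screen(wx, wy, x0, y0):
    """Convert a world coordinate back to the screen coordinate system."""
    return wx + x0, wy + y0

# Example: aligning the screen centre with the world origin, as in one
# implementation above; 960, 540 assumes a 1920x1080 screen.
x0, y0 = 960, 540
wx, wy = screen_to_world(1000, 600, x0, y0)   # -> (40, 60)
sx, sy = world_to_screen(wx, wy, x0, y0)      # round-trips to (1000, 600)
```

The two functions are exact inverses, so a second shooting position obtained in world coordinates can always be compared against the target first shooting position in screen coordinates.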
Sub-step S215: compare the target first shooting position with the target second shooting position under the current shooting parameters, and adjust the shooting parameters currently used by the virtual camera model when the two positions do not coincide, until the target second shooting position under the adjusted shooting parameters coincides with the target first shooting position.
In this embodiment, when the target second shooting position under the current shooting parameters falls within the screen coverage of the real camera but does not coincide with the target first shooting position, the AR implementation device 10 decreases the current shooting field angle of the virtual camera model (or increases its current shooting focal length), obtains the target second shooting position of the target real object again under the adjusted shooting parameters, and repeats the position comparison until the target second shooting position under the adjusted shooting parameters coincides with the target first shooting position.
Conversely, when the target second shooting position under the current shooting parameters falls outside the screen coverage of the real camera and does not coincide with the target first shooting position, the AR implementation device 10 increases the current shooting field angle of the virtual camera model (or decreases its current shooting focal length), obtains the target second shooting position of the target real object again under the adjusted shooting parameters, and repeats the position comparison until the target second shooting position under the adjusted shooting parameters coincides with the target first shooting position.
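The feedback loop of sub-step S215 can be sketched as follows. This is a minimal illustration under assumed interfaces: `project` maps a candidate field angle to the resulting target second shooting position, `inside_screen` tests the screen coverage, and the step size and tolerance are invented for the example; the patent does not specify these details.

```python
# Closed-loop adjustment of the virtual camera's field of view until the
# projected target second shooting position coincides with the target first
# shooting position (within a tolerance). Hypothetical sketch only.

def calibrate_fov(fov, target_first, project, inside_screen,
                  step=0.5, tol=1.0, max_iter=1000):
    """Adjust fov until project(fov) lands on target_first (within tol)."""
    for _ in range(max_iter):
        second = project(fov)                     # current target second position
        dx = second[0] - target_first[0]
        dy = second[1] - target_first[1]
        if (dx * dx + dy * dy) ** 0.5 <= tol:     # positions coincide
            return fov
        if inside_screen(second):
            fov -= step                           # on-screen: shrink field angle
        else:
            fov += step                           # off-screen: widen field angle
    raise RuntimeError("calibration did not converge")

# Toy projection model: the object's distance from the screen centre scales
# inversely with the field angle; the "true" value here is 40 degrees.
project = lambda f: (960 + 940 * 40 / f, 540)
inside = lambda p: 0 <= p[0] <= 1920 and 0 <= p[1] <= 1080
calibrate_fov(45.0, (1900.0, 540.0), project, inside)   # converges to 40.0
```

Starting from a too-wide field angle of 45 degrees, the loop shrinks it in 0.5-degree steps until the projected point reaches the target position near the screen edge.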
Sub-step S216: take the shooting parameter used when the target second shooting position coincides with the target first shooting position as the corrected shooting parameter corresponding to the current imaging magnification.
Referring again to fig. 3, in step S220, the parameter correction curve is drawn and formed based on the corrected shooting parameters obtained by the test.
Optionally, please refer to fig. 5, which is a flowchart illustrating the sub-steps included in step S220 shown in fig. 3. In this embodiment, the step S220 may include a sub-step S221 and a sub-step S222.
Sub-step S221: sort all the tested corrected shooting parameters according to the imaging magnification value corresponding to each corrected shooting parameter, obtaining multiple sorted groups of parameter correction data, where each group of parameter correction data comprises the corresponding imaging magnification value and corrected shooting parameter.
Sub-step S222: draw a linear variation curve between each two adjacent groups of the sorted parameter correction data, and perform curve adjustment on the drawn linear variation curves to obtain the parameter correction curve.
In this embodiment, each linear variation curve is a straight-line segment with the imaging magnification as the independent variable and the corrected shooting parameter as the dependent variable. The AR implementation device 10 obtains the linear variation curve between each two adjacent groups of the sorted parameter correction data, and then obtains a parameter correction curve covering all the linear variation curves by splicing the drawn segments in the same coordinate system. The AR implementation device 10 may sort all the tested corrected shooting parameters in ascending or descending order; the specific sorting manner can be configured as required.
In one implementation of this embodiment, the curve adjustment includes curve smoothing in addition to curve splicing: after splicing the linear variation curves, the AR implementation device 10 obtains the parameter correction curve by smoothing the spliced curve.
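Sub-steps S221 and S222 amount to building a piecewise-linear interpolation table over the tested data points. A minimal sketch (with the smoothing step omitted, and all names and sample values assumed) could look like:

```python
# Build a piecewise-linear correction curve from tested
# (imaging magnification, corrected shooting parameter) pairs.
# Illustrative sketch; sample data below is invented.

def build_correction_curve(samples):
    """samples: iterable of (imaging_magnification, corrected_parameter)."""
    pts = sorted(samples)                       # ascending by magnification
    def curve(mag):
        if mag <= pts[0][0]:
            return pts[0][1]                    # clamp below the tested range
        if mag >= pts[-1][0]:
            return pts[-1][1]                   # clamp above the tested range
        for (m0, p0), (m1, p1) in zip(pts, pts[1:]):
            if m0 <= mag <= m1:                 # linear segment between neighbours
                t = (mag - m0) / (m1 - m0)
                return p0 + t * (p1 - p0)
    return curve

# Invented example: field angle shrinks as magnification grows.
curve = build_correction_curve([(1, 60.0), (2, 34.0), (4, 18.0)])
curve(3)   # midpoint of the 2x-4x segment: 26.0
```

Magnifications that were not tested directly fall on the straight segment joining the two neighbouring tested points, which is exactly the role of the linear variation curves above; a smoothing pass over the spliced curve would then remove the kinks at the joints.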
Fig. 6 is a first block diagram of a camera model correction apparatus 100 according to an embodiment of the present application. In this embodiment, the camera model correction apparatus 100 includes a magnification detection module 130, a parameter acquisition module 140, and a parameter adjustment module 150.
The magnification detection module 130 is configured to detect a current imaging magnification value of the real camera in real time.
In this embodiment, the magnification detection module 130 may execute step S230 in fig. 2, and the detailed description may refer to the above detailed description of step S230.
The parameter obtaining module 140 is configured to obtain a target correction shooting parameter matched with the current imaging magnification value of the real camera according to a stored parameter correction curve between the imaging magnification of the real camera and the correction shooting parameter of the virtual camera model.
In this embodiment, the parameter obtaining module 140 may execute step S240 in fig. 2, and the detailed description may refer to the above detailed description of step S240.
The parameter adjusting module 150 is configured to adjust a shooting parameter value currently used by the virtual camera model according to the target correction shooting parameter, so that the virtual camera model is correspondingly matched with the real camera.
In this embodiment, the parameter adjusting module 150 may execute step S250 in fig. 2, and the detailed description may refer to the above detailed description of step S250.
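Taken together, the three modules implement a simple detect/look-up/apply loop. The sketch below illustrates this flow with stub camera interfaces and an invented correction curve; none of these APIs come from the patent.

```python
# Run-time cooperation of the three modules (steps S230-S250):
# detect the real camera's magnification, look up the corrected parameter
# on the correction curve, and apply it to the virtual camera model.

class RealCameraStub:
    def __init__(self, magnification):
        self.magnification = magnification
    def current_magnification(self):
        return self.magnification              # S230: real-time detection

class VirtualCameraStub:
    def __init__(self):
        self.fov = None
    def set_fov(self, fov):
        self.fov = fov                         # S250: apply corrected value

def correct_model(real_cam, virtual_cam, correction_curve):
    mag = real_cam.current_magnification()     # magnification detection module
    fov = correction_curve(mag)                # S240: parameter acquisition module
    virtual_cam.set_fov(fov)                   # parameter adjustment module
    return fov

# Invented curve for illustration: field angle inversely proportional
# to magnification.
virtual = VirtualCameraStub()
correct_model(RealCameraStub(2.0), virtual, lambda m: 60.0 / m)
virtual.fov   # -> 30.0
```

Running this per frame keeps the virtual camera model matched to the real camera as its zoom changes, which is the real-time behaviour the modules above describe.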
Fig. 7 is a second block diagram of the camera model correction apparatus 100 according to an embodiment of the present application. In this embodiment, the camera model correction apparatus 100 may further include a parameter testing module 110 and a curve drawing module 120.
The parameter testing module 110 is configured to test corrected shooting parameters of the virtual camera model under different imaging magnifications of the real camera.
In this embodiment, the manner of testing the corresponding corrected shooting parameters of the virtual camera model under different imaging magnifications of the real camera by the parameter testing module 110 includes:
after the imaging magnification of the real camera is adjusted each time, acquiring a first shooting position of a target real object under the current imaging magnification in a screen coordinate system corresponding to the real camera;
controlling the real camera and the virtual camera model to perform rotational shooting according to the same shooting operation, so that the target real object moves from its first shooting position to a target first shooting position in the screen coordinate system;
when the target real object is located at the target first shooting position in the screen coordinate system, acquiring a second shooting position, corresponding to the target real object, in a world coordinate system corresponding to the virtual camera model under the current shooting parameters of the virtual camera model;
obtaining a corresponding target second shooting position of the second shooting position in the screen coordinate system according to a conversion relation between the world coordinate and the screen coordinate;
comparing the position of the target first shooting position with the position of the target second shooting position under the current shooting parameters, and adjusting the shooting parameters currently used by the virtual camera model when the positions are not coincident until the target second shooting position under the adjusted shooting parameters is coincident with the target first shooting position;
and taking the shooting parameter used when the target second shooting position coincides with the target first shooting position as the corrected shooting parameter corresponding to the current imaging magnification.
The parameter testing module 110 can perform step S210 in fig. 3 and sub-steps S211 to S216 in fig. 4, and the detailed description can refer to the above detailed description of step S210 and sub-steps S211 to S216.
The curve drawing module 120 is configured to draw and form the parameter correction curve based on the corrected shooting parameters obtained through the test.
In this embodiment, the manner of drawing the parameter correction curve by the curve drawing module 120 based on the corrected shooting parameters obtained by the test includes:
sorting all the tested corrected shooting parameters according to the imaging magnification value corresponding to each corrected shooting parameter to obtain multiple sorted groups of parameter correction data, wherein each group of parameter correction data comprises the corresponding imaging magnification value and corrected shooting parameter;
and drawing a linear variation curve between each two adjacent groups of the sorted parameter correction data, and performing curve adjustment on the drawn linear variation curves to obtain the parameter correction curve.
The curve-plotting module 120 can execute step S220 in fig. 3 and sub-steps S221 and S222 in fig. 4, and the detailed description can refer to the above detailed description of step S220, sub-step S221 and sub-step S222.
An embodiment of the present application further provides a readable storage medium storing a computer program which, when executed, controls the AR implementation device 10 on which the readable storage medium resides to perform the camera model correction method described above. The readable storage medium may be any available medium that the AR implementation device 10 (e.g., a personal computer or server) can access, or a data storage device, such as a server or data center, integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, or magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., a Solid State Disk (SSD)) capable of storing program code.
In summary, the camera model correction method, camera model correction apparatus, AR implementation device, and readable storage medium provided in the embodiments of the present application correct the virtual camera model according to the current physical characteristics of the real camera, so that the virtual camera model matches the real camera in real time, which increases the AR fusion accuracy and improves the AR effect. The method first detects the current imaging magnification value of the real camera in real time; it then acquires, according to a stored parameter correction curve between the imaging magnification of the real camera and the corrected shooting parameters of the virtual camera model, a target corrected shooting parameter matched with the current imaging magnification value of the real camera; finally, it adjusts the shooting parameter value currently used by the virtual camera model according to the target corrected shooting parameter, so that the virtual camera model matches the real camera in real time. This reduces the fusion deviation between the virtual camera model and the real camera, increases the AR fusion accuracy, and thereby improves the corresponding AR effect.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. A camera model correction method applied to an Augmented Reality (AR) implementation device connected to a real camera for capturing an image, the AR implementation device having stored therein a virtual camera model whose capturing operation is synchronized with that of the real camera, the method comprising:
detecting a current imaging magnification value of the real camera in real time;
acquiring target correction shooting parameters matched with the current imaging magnification value of the real camera according to a stored parameter correction curve between the imaging magnification of the real camera and the correction shooting parameters of the virtual camera model;
and adjusting the current shooting parameter value used by the virtual camera model according to the target correction shooting parameter so as to enable the virtual camera model to be correspondingly matched with the real camera.
2. The method of claim 1, further comprising:
testing correction shooting parameters corresponding to the virtual camera model under different imaging magnifications of the real camera;
and drawing and forming the parameter correction curve based on the corrected shooting parameters obtained by the test.
3. The method of claim 2, wherein the step of testing the corresponding corrected shooting parameters of the virtual camera model at different imaging magnifications of the real camera comprises:
after the imaging magnification of the real camera is adjusted each time, acquiring a first shooting position of a target real object under the current imaging magnification in a screen coordinate system corresponding to the real camera;
controlling the real camera and the virtual camera model to perform rotational shooting according to the same shooting operation, so that the target real object moves from its first shooting position to a target first shooting position in the screen coordinate system;
when the target real object is located at the target first shooting position in the screen coordinate system, acquiring a second shooting position, corresponding to the target real object, in a world coordinate system corresponding to the virtual camera model under the current shooting parameters of the virtual camera model;
obtaining a corresponding target second shooting position of the second shooting position in the screen coordinate system according to a conversion relation between the world coordinate and the screen coordinate;
comparing the position of the target first shooting position with the position of the target second shooting position under the current shooting parameters, and adjusting the shooting parameters currently used by the virtual camera model when the positions are not coincident until the target second shooting position under the adjusted shooting parameters is coincident with the target first shooting position;
and taking the shooting parameter used when the target second shooting position coincides with the target first shooting position as the corrected shooting parameter corresponding to the current imaging magnification.
4. The method according to claim 2 or 3, wherein the step of drawing and forming the parameter correction curve based on the corrected shooting parameters obtained by the test comprises:
sorting all the tested corrected shooting parameters according to the imaging magnification value corresponding to each corrected shooting parameter to obtain multiple sorted groups of parameter correction data, wherein each group of parameter correction data comprises the corresponding imaging magnification value and corrected shooting parameter;
and drawing a linear variation curve between each two adjacent groups of the sorted parameter correction data, and performing curve adjustment on the drawn linear variation curves to obtain the parameter correction curve.
5. A camera model correction apparatus applied to an AR implementation device connected to a real camera for capturing an image, the AR implementation device having stored therein a virtual camera model whose capturing operation is synchronized with that of the real camera, the apparatus comprising:
the magnification detection module is used for detecting a current imaging magnification value of the real camera in real time;
the parameter acquisition module is used for acquiring target correction shooting parameters matched with the current imaging magnification value of the real camera according to a stored parameter correction curve between the imaging magnification of the real camera and the correction shooting parameters of the virtual camera model;
and the parameter adjusting module is used for adjusting the shooting parameter value currently used by the virtual camera model according to the target correction shooting parameter so as to enable the virtual camera model to be correspondingly matched with the real camera.
6. The apparatus of claim 5, further comprising:
the parameter testing module is used for testing the corrected shooting parameters corresponding to the virtual camera model under different imaging magnifications of the real camera;
and the curve drawing module is used for drawing and forming the parameter correction curve based on the corrected shooting parameters obtained by the test.
7. The apparatus of claim 6, wherein the parameter testing module is specifically configured to:
after the imaging magnification of the real camera is adjusted each time, acquiring a first shooting position of a target real object under the current imaging magnification in a screen coordinate system corresponding to the real camera;
controlling the real camera and the virtual camera model to perform rotational shooting according to the same shooting operation, so that the target real object moves from its first shooting position to a target first shooting position in the screen coordinate system;
when the target real object is located at the target first shooting position in the screen coordinate system, acquiring a second shooting position, corresponding to the target real object, in a world coordinate system corresponding to the virtual camera model under the current shooting parameters of the virtual camera model;
obtaining a corresponding target second shooting position of the second shooting position in the screen coordinate system according to a conversion relation between the world coordinate and the screen coordinate;
comparing the position of the target first shooting position with the position of the target second shooting position under the current shooting parameters, and adjusting the shooting parameters currently used by the virtual camera model when the positions are not coincident until the target second shooting position under the adjusted shooting parameters is coincident with the target first shooting position;
and taking the shooting parameter used when the target second shooting position coincides with the target first shooting position as the corrected shooting parameter corresponding to the current imaging magnification.
8. The apparatus according to claim 6 or 7, wherein the curve-plotting module is specifically configured to:
sorting all the tested corrected shooting parameters according to the imaging magnification value corresponding to each corrected shooting parameter to obtain multiple sorted groups of parameter correction data, wherein each group of parameter correction data comprises the corresponding imaging magnification value and corrected shooting parameter;
and drawing a linear variation curve between each two adjacent groups of the sorted parameter correction data, and performing curve adjustment on the drawn linear variation curves to obtain the parameter correction curve.
9. An AR implementation device comprising a processor and a non-volatile memory storing computer instructions, which when executed by the processor, perform the camera model correction method of any one of claims 1 to 4, wherein the AR implementation device is connected to a real camera for capturing an image, and the AR implementation device stores therein a virtual camera model whose capturing operation is synchronized with that of the real camera.
10. A readable storage medium, characterized in that the readable storage medium comprises a computer program which, when executed, controls an AR implementing device in which the readable storage medium is located to perform the camera model correction method according to any one of claims 1 to 4.
CN201811524847.4A 2018-12-13 2018-12-13 Camera model correction method, device, AR implementation equipment and readable storage medium Active CN111325798B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811524847.4A CN111325798B (en) 2018-12-13 2018-12-13 Camera model correction method, device, AR implementation equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811524847.4A CN111325798B (en) 2018-12-13 2018-12-13 Camera model correction method, device, AR implementation equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN111325798A true CN111325798A (en) 2020-06-23
CN111325798B CN111325798B (en) 2023-08-18

Family

ID=71172253

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811524847.4A Active CN111325798B (en) 2018-12-13 2018-12-13 Camera model correction method, device, AR implementation equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN111325798B (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112132909A (en) * 2020-09-23 2020-12-25 字节跳动有限公司 Parameter acquisition method and device, media data processing method and storage medium
CN113905145A (en) * 2021-10-11 2022-01-07 浙江博采传媒有限公司 LED circular screen virtual-real camera focus matching method and system
CN114040090A (en) * 2021-08-25 2022-02-11 先壤影视制作(上海)有限公司 Method, device, equipment, storage medium, acquisition part and system for synchronizing virtuality and reality
WO2022040983A1 (en) * 2020-08-26 2022-03-03 南京翱翔智能制造科技有限公司 Real-time registration method based on projection marking of cad model and machine vision
CN114422696A (en) * 2022-01-19 2022-04-29 浙江博采传媒有限公司 Virtual shooting method and device and storage medium
CN116320363A (en) * 2023-05-25 2023-06-23 四川中绳矩阵技术发展有限公司 Multi-angle virtual reality shooting method and system

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102157011A (en) * 2010-12-10 2011-08-17 北京大学 Method for carrying out dynamic texture acquisition and virtuality-reality fusion by using mobile shooting equipment
US20140300635A1 (en) * 2011-11-09 2014-10-09 Sony Corporation Information processing apparatus, display control method, and program
US20150304531A1 (en) * 2012-11-26 2015-10-22 Brainstorm Multimedia, S.L. A method for obtaining and inserting in real time a virtual object within a virtual scene from a physical object
US20170301137A1 (en) * 2016-04-15 2017-10-19 Superd Co., Ltd. Method, apparatus, and smart wearable device for fusing augmented reality and virtual reality
CN108492356A (en) * 2017-02-13 2018-09-04 苏州宝时得电动工具有限公司 Augmented reality system and its control method
CN108520552A (en) * 2018-03-26 2018-09-11 广东欧珀移动通信有限公司 Image processing method, device, storage medium and electronic equipment
CN108537889A (en) * 2018-03-26 2018-09-14 广东欧珀移动通信有限公司 Method of adjustment, device, storage medium and the electronic equipment of augmented reality model
CN108553889A (en) * 2018-03-29 2018-09-21 广州汉智网络科技有限公司 Dummy model exchange method and device
CN108986199A (en) * 2018-06-14 2018-12-11 北京小米移动软件有限公司 Dummy model processing method, device, electronic equipment and storage medium

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102157011A (en) * 2010-12-10 2011-08-17 北京大学 Method for carrying out dynamic texture acquisition and virtuality-reality fusion by using mobile shooting equipment
US20140300635A1 (en) * 2011-11-09 2014-10-09 Sony Corporation Information processing apparatus, display control method, and program
US20150304531A1 (en) * 2012-11-26 2015-10-22 Brainstorm Multimedia, S.L. A method for obtaining and inserting in real time a virtual object within a virtual scene from a physical object
US20170301137A1 (en) * 2016-04-15 2017-10-19 Superd Co., Ltd. Method, apparatus, and smart wearable device for fusing augmented reality and virtual reality
CN108492356A (en) * 2017-02-13 2018-09-04 苏州宝时得电动工具有限公司 Augmented reality system and its control method
CN108520552A (en) * 2018-03-26 2018-09-11 广东欧珀移动通信有限公司 Image processing method, device, storage medium and electronic equipment
CN108537889A (en) * 2018-03-26 2018-09-14 广东欧珀移动通信有限公司 Method of adjustment, device, storage medium and the electronic equipment of augmented reality model
CN108553889A (en) * 2018-03-29 2018-09-21 广州汉智网络科技有限公司 Dummy model exchange method and device
CN108986199A (en) * 2018-06-14 2018-12-11 北京小米移动软件有限公司 Dummy model processing method, device, electronic equipment and storage medium

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022040983A1 (en) * 2020-08-26 2022-03-03 南京翱翔智能制造科技有限公司 Real-time registration method based on projection marking of cad model and machine vision
CN112132909A (en) * 2020-09-23 2020-12-25 字节跳动有限公司 Parameter acquisition method and device, media data processing method and storage medium
CN112132909B (en) * 2020-09-23 2023-12-22 字节跳动有限公司 Parameter acquisition method and device, media data processing method and storage medium
CN114040090A (en) * 2021-08-25 2022-02-11 先壤影视制作(上海)有限公司 Method, device, equipment, storage medium, acquisition part and system for synchronizing virtuality and reality
CN113905145A (en) * 2021-10-11 2022-01-07 浙江博采传媒有限公司 LED circular screen virtual-real camera focus matching method and system
CN114422696A (en) * 2022-01-19 2022-04-29 浙江博采传媒有限公司 Virtual shooting method and device and storage medium
CN116320363A (en) * 2023-05-25 2023-06-23 四川中绳矩阵技术发展有限公司 Multi-angle virtual reality shooting method and system

Also Published As

Publication number Publication date
CN111325798B (en) 2023-08-18

Similar Documents

Publication Publication Date Title
CN111325798B (en) Camera model correction method, device, AR implementation equipment and readable storage medium
US9727775B2 (en) Method and system of curved object recognition using image matching for image processing
US20150103183A1 (en) Method and apparatus for device orientation tracking using a visual gyroscope
EP2920758B1 (en) Rotation of an image based on image content to correct image orientation
WO2014187223A1 (en) Method and apparatus for identifying facial features
CN109840883B (en) Method and device for training object recognition neural network and computing equipment
CN112991180B (en) Image stitching method, device, equipment and storage medium
CN110660102B (en) Speaker recognition method, device and system based on artificial intelligence
CN110062157B (en) Method and device for rendering image, electronic equipment and computer readable storage medium
CN113920502A (en) Cloud deck adjusting method, device, equipment and medium
CN111339884A (en) Image recognition method and related equipment and device
CN113344789B (en) Image splicing method and device, electronic equipment and computer readable storage medium
CN114627244A (en) Three-dimensional reconstruction method and device, electronic equipment and computer readable medium
CN113077524B (en) Automatic calibration method, device and equipment for binocular fisheye camera and storage medium
CN113158773B (en) Training method and training device for living body detection model
CN112258647B (en) Map reconstruction method and device, computer readable medium and electronic equipment
CN112102404A (en) Object detection tracking method and device and head-mounted display equipment
CN111815748A (en) Animation processing method and device, storage medium and electronic equipment
CN108780572A (en) The method and device of image rectification
CN111062374A (en) Identification method, device, system, equipment and readable medium of identity card information
CN116048682A (en) Terminal system interface layout comparison method and electronic equipment
CN113902932A (en) Feature extraction method, visual positioning method and device, medium and electronic equipment
CN115082496A (en) Image segmentation method and device
CN111353929A (en) Image processing method and device and electronic equipment
CN113706429B (en) Image processing method, device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant