CN112132909A - Parameter acquisition method and device, media data processing method and storage medium


Info

Publication number
CN112132909A
CN112132909A (application CN202011011407.6A); granted as CN112132909B
Authority
CN
China
Prior art keywords
electronic device
field angle
parameter acquisition
parameter
model
Prior art date
Legal status
Granted
Application number
CN202011011407.6A
Other languages
Chinese (zh)
Other versions
CN112132909B (en)
Inventor
杨骁
刘晶
吕晴阳
罗琳捷
陈志立
陈怡
连晓晨
王国晖
杨建朝
任龙
刘舒
Current Assignee
ByteDance Inc
Original Assignee
ByteDance Inc
Priority date
Filing date
Publication date
Application filed by ByteDance Inc filed Critical ByteDance Inc
Priority to CN202011011407.6A priority Critical patent/CN112132909B/en
Publication of CN112132909A publication Critical patent/CN112132909A/en
Application granted granted Critical
Publication of CN112132909B publication Critical patent/CN112132909B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06C DIGITAL COMPUTERS IN WHICH ALL THE COMPUTATION IS EFFECTED MECHANICALLY
    • G06C 3/00 Arrangements for table look-up, e.g. menstruation table
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106 Processing image signals
    • H04N 13/122 Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/275 Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/363 Image reproducers using image projection screens

Abstract

A parameter acquisition method, a media data processing method, a parameter acquisition apparatus, and a non-transitory computer-readable storage medium. The parameter acquisition method comprises the following steps: acquiring characteristic information of an electronic device, wherein the electronic device comprises an image shooting device; determining a parameter acquisition rule corresponding to the electronic device according to the characteristic information of the electronic device; and determining conversion parameters corresponding to the electronic device according to the parameter acquisition rule, wherein the conversion parameters comprise an internal reference matrix corresponding to the image shooting device.

Description

Parameter acquisition method and device, media data processing method and storage medium
Technical Field
Embodiments of the present disclosure relate to a parameter acquisition method, a media data processing method, a parameter acquisition apparatus, and a non-transitory computer-readable storage medium.
Background
Short videos have strong social attributes, are easy to create, and are short in duration, which matches users' habits of consuming fragmented content in the mobile internet era. Augmented Reality (AR) technology ingeniously fuses virtual information with the real world. It is widely used in fields such as multimedia, three-dimensional modeling, real-time tracking and registration, intelligent interaction, and sensing: computer-generated virtual information such as text, images, three-dimensional models, music, and video is applied to the real world after simulation, so that real-world information and virtual information complement each other and the real world is enhanced. AR's distinctive virtual-real fusion effect gives it essentially unlimited room for expansion in the short video field.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
At least one embodiment of the present disclosure provides a parameter obtaining method, including: acquiring characteristic information of an electronic device, wherein the electronic device comprises an image shooting device; determining a parameter acquisition rule corresponding to the electronic device according to the characteristic information of the electronic device; and determining conversion parameters corresponding to the electronic device according to the parameter acquisition rule, wherein the conversion parameters comprise an internal reference matrix corresponding to the image shooting device.
At least one embodiment of the present disclosure further provides a parameter obtaining apparatus, including: a memory for non-transitory storage of computer readable instructions; a processor configured to execute the computer-readable instructions, wherein the computer-readable instructions, when executed by the processor, implement the parameter obtaining method according to any embodiment of the present disclosure.
At least one embodiment of the present disclosure further provides a non-transitory computer-readable storage medium, wherein the non-transitory computer-readable storage medium stores computer-readable instructions, and when executed by a processor, the computer-readable instructions implement the parameter obtaining method according to any one of the embodiments of the present disclosure.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements and features are not necessarily drawn to scale.
Fig. 1 is a schematic flow chart of a parameter obtaining method according to at least one embodiment of the present disclosure;
fig. 2 is a schematic diagram of a media data processing method according to an embodiment of the disclosure;
fig. 3 is a schematic block diagram of a parameter obtaining apparatus according to at least one embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a non-transitory computer-readable storage medium provided in at least one embodiment of the present disclosure; and
fig. 5 is a schematic structural diagram of an electronic device according to at least one embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order, and/or performed in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "include" and variations thereof as used herein are inclusive, i.e., "including but not limited to. The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that the modifiers "a", "an", and "the" in the present disclosure are intended to be illustrative rather than limiting; those skilled in the art should understand them as meaning "one or more" unless the context clearly dictates otherwise. "Plurality" means two or more.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
To keep the following description of the embodiments of the present disclosure clear and concise, a detailed description of some known functions and components has been omitted from the present disclosure.
The working principle of the landmark AR special effect comprises the following steps: mapping the reconstructed 3D model of the target object into a virtual three-dimensional space; and then projecting the 3D model from the virtual three-dimensional space onto the screen of a mobile terminal such as a mobile phone and adding the AR special effect, thereby realizing the landmark AR special effect. The projection process requires the internal reference (intrinsic) matrix of the camera in the mobile phone, which is mainly determined by the field angle (fov).
After the 3D model undergoes a rigid transformation, its position in the virtual three-dimensional space, i.e., in the three-dimensional coordinate system determined by the camera of the mobile phone, is obtained. The 3D model then needs to be projected from the virtual three-dimensional space onto the two-dimensional plane of the phone screen. For any point P3 = (x, y, z) in the 3D model, the projection process can be expressed as P = K * P3. Through the projection process, P = (x1, y1, z1) is obtained, and P is normalized to yield the two-dimensional coordinates (x1/z1, y1/z1) on the two-dimensional plane of the screen, where K is called the internal reference matrix and is expressed as:

K = [ fx  s   x0 ]
    [ 0   fy  y0 ]
    [ 0   0   1  ]
where fx and fy are the focal lengths of the camera, x0 and y0 are the optical axis offsets of the camera, and s is the axis skew. In the landmark AR special effect function, the data most prone to inaccuracy is the field angle, and the field angle affects the camera focal length. The focal length is calculated from the field angle as follows (assuming the focal lengths in the x and y directions are the same, i.e., fx = fy = f):

f = max(x0, y0) / tan(fov / 2),

where fov denotes the field angle.
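To make this relationship concrete, the following minimal Python sketch computes the focal length from the field angle; the frame size and the field angle values are illustrative assumptions, not figures from the patent.

```python
import math

def focal_length_from_fov(fov_deg: float, x0: float, y0: float) -> float:
    # f = max(x0, y0) / tan(fov / 2), assuming fx == fy == f.
    return max(x0, y0) / math.tan(math.radians(fov_deg) / 2.0)

# Assumed 1080x1920 portrait frame with the optical axis at the image center.
x0, y0 = 1080 / 2, 1920 / 2
print(focal_length_from_fov(64.0, x0, y0))  # ~1536 px at the nominal angle
print(focal_length_from_fov(57.5, x0, y0))  # smaller fov -> larger f: stretched AR
```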
Therefore, an inaccurate field angle also makes the camera focal length inaccurate, so the position of the 3D model deviates when it is projected onto the two-dimensional plane of the phone screen. When the field angle is too small, the focal length becomes too large, so the AR effect projected on the screen is stretched, and when the phone is rotated, the AR special effect rotates faster than the phone; conversely, when the field angle is too large, the focal length becomes too small, the projected AR effect is compressed, and when the phone is rotated, the AR special effect rotates more slowly than the phone.
If the internal reference matrices of the cameras of all electronic devices adopt fixed values in an application, the fixed internal reference matrix will not coincide with the actual internal reference matrix of each camera, which may cause the following problems: (1) the AR effect may be stretched or compressed; (2) when the phone rotates, the AR special effect rotates faster or slower than the video frame, so the AR special effect cannot be aligned with the landmark building in the video frame.
That is, as long as the internal reference matrices of the cameras of all electronic devices adopt fixed values, the AR effect on each electronic device cannot be aligned with the landmark building in the video frame. For an electronic device running the iOS system, the field angle can in principle be obtained by looking up the parameter table on Apple's official website or by calling an Application Program Interface (API) provided by AVFoundation. However, the landmark AR special effect requires the video anti-shake function to be turned off, and the inventors found that with video anti-shake disabled, the field angle data returned by AVFoundation is inaccurate, so a truly accurate field angle cannot be obtained by directly calling the API provided by AVFoundation.
The problem of the inaccurate internal reference matrix of current mobile phone cameras is briefly described below.
For the iOS system, official documentation provides the field angle under video conditions for different models (the field angle determines the internal reference matrix). In actual tests, the inventors found that, with the video anti-shake function turned off, the actually measured field angle does not coincide with the officially provided one. To obtain the actual, correct field angle, the inventors used calibrateCamera, the standard camera-calibration function in the OpenCV library, to measure the field angles of different phones; examples of the results are shown in Table 1 below. For example, for the iPhone 6, the actually computed field angle differs from the official value by 6.5°.
Table 1
[Table 1 appears as an image in the original publication; it lists measured field angles alongside the officially documented values for several phone models.]
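The patent does not reproduce the inventors' calibration code; the following Python sketch shows one plausible way to measure a field angle with OpenCV's calibrateCamera, assuming a 9x6 inner-corner checkerboard and a folder of frames captured with video anti-shake turned off (both assumptions), then inverting the f = max(x0, y0)/tan(fov/2) relation.

```python
import glob
import math

import cv2
import numpy as np

# Checkerboard geometry and image paths are illustrative assumptions.
pattern = (9, 6)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for path in glob.glob("calib_frames/*.jpg"):
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

ret, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)

# Invert f = max(x0, y0) / tan(fov / 2) to recover the field angle.
f = max(K[0, 0], K[1, 1])
fov = 2 * math.degrees(math.atan(max(K[0, 2], K[1, 2]) / f))
print(f"measured field angle: {fov:.1f} degrees")
```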
The inventors also found during testing that the field angle provided by the software implementing the phone's ARKit function is more accurate.
For the Android system, manufacturers release so many phone models that the field angles of all phones cannot be measured one by one. The Android system provides Camera APIs (including the Camera1 API and the Camera2 API) that can return the phone's field angle; the internal reference matrix of the phone can then be derived from that field angle. During testing, however, the inventors found that the built-in internal reference matrix of many phone models is inaccurate: the field angle is obviously too large or too small, which clearly does not match reality.
At least one embodiment of the present disclosure provides a parameter acquisition method, a media data processing method, a parameter acquisition apparatus, and a non-transitory computer-readable storage medium. The parameter acquisition method comprises the following steps: acquiring characteristic information of an electronic device, wherein the electronic device comprises an image shooting device; determining a parameter acquisition rule corresponding to the electronic device according to the characteristic information of the electronic device; and determining conversion parameters corresponding to the electronic device according to the parameter acquisition rule, wherein the conversion parameters comprise an internal reference matrix corresponding to the image shooting device.
In this parameter acquisition method, the conversion parameters corresponding to the electronic device are determined according to its characteristic information (such as the type of platform, i.e., the operating system, and the model of the electronic device), so that the obtained conversion parameters are more accurate and closer to the device's actual parameters. This overcomes the problem that fixed conversion parameters, or a field angle obtained by calling the device's parameter interface, are inaccurate. Therefore, when the electronic device renders the landmark AR special effect, the AR special effect is aligned more accurately with the landmark building in the video frame, providing the user with a better visual effect.
It should be noted that the parameter acquisition method provided by the embodiments of the present disclosure may be applied to the parameter acquisition apparatus provided by the embodiments of the present disclosure, and the parameter acquisition apparatus may be configured on an electronic device, for example, within an application program of the electronic device. The electronic device may be a personal computer, a mobile terminal, and the like, and the mobile terminal may be a hardware device with any of various operating systems, such as a mobile phone or a tablet computer. For example, the application may be a video application, including but not limited to Douyin and the like.
Embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings, but the present disclosure is not limited to these specific embodiments.
Fig. 1 is a schematic flow chart of a parameter obtaining method according to at least one embodiment of the present disclosure.
For example, as shown in fig. 1, the parameter acquisition method includes steps S10 to S12.
Step S10: acquiring characteristic information of the electronic device;
step S11: determining a parameter acquisition rule corresponding to the electronic device according to the characteristic information of the electronic device;
step S12: and determining conversion parameters corresponding to the electronic device according to the parameter acquisition rules.
For example, the electronic apparatus includes an image capturing device for capturing images and/or video. The image capturing device may include a camera, a video camera, and the like. The image capturing device may be integrated with the electronic apparatus, or it may be provided separately and communicatively connected to the electronic apparatus by wireless (e.g., Bluetooth) or wired means.
For example, in step S10, the characteristic information of the electronic device includes the operating system, the model, and the like of the electronic device. For example, the operating systems may include a first operating system and a second operating system; for example, the first operating system may be any of various versions of the Android system, and the second operating system may be any of various versions of the iOS system. The models may include OPPO R9, OPPO R11, Huawei P40, Huawei nova 7, Honor 30, iPhone 6, iPhone 8, iPhone 11, iPhone SE, iPad mini, and the like.
For example, the feature information of the electronic device may be acquired through an interface inside the electronic device for acquiring the feature information.
For example, in step S11, different electronic devices correspond to different parameter obtaining rules, so that the feature parameters obtained based on the parameter obtaining rules are more accurate and closer to the actual feature parameters of the electronic devices.
For example, the conversion parameter includes an internal reference matrix corresponding to the image capturing device, and the internal reference matrix is determined by the angle of view of the image capturing device, so that when the actual angle of view corresponding to the image capturing device is obtained, the actual internal reference matrix corresponding to the image capturing device can be obtained.
For example, in some embodiments, when the characteristic information of the electronic device indicates that the operating system of the electronic device is the first operating system, step S12 may include: acquiring a preset field angle provided by a field angle interface of the electronic device according to the parameter acquisition rule; in response to the preset field angle being within a set field angle range, taking the preset field angle as the target field angle according to the parameter acquisition rule; in response to the preset field angle not being within the set field angle range, taking a default field angle as the target field angle according to the parameter acquisition rule; and determining the conversion parameters according to the target field angle.
For example, in the embodiment of the present disclosure, when the operating system of the electronic device is the first operating system, the parameter acquisition rule represents: acquiring a preset field angle provided by a field angle interface (i.e. an application program interface of the image capturing device, such as the Camera API) of the electronic device, and taking the preset field angle as a target field angle when the preset field angle is within a set field angle range, and taking a default field angle as a target field angle when the preset field angle is not within the set field angle range; and determining conversion parameters according to the target field angle.
For example, the field angle range is set to a range of 45 degrees to 80 degrees. It should be noted that the present disclosure is not limited to this, and the set field angle range may be set according to actual circumstances.
For example, the default field angle is calculated from statistics over the preset field angles provided by the field angle interfaces of many electronic devices running the first operating system. In some embodiments, the default field angle is obtained by collecting the field angles returned by the Camera API of a large number of first-operating-system devices on the market, removing the field angles that fall outside the set field angle range, and averaging (e.g., arithmetically averaging) the rest. For example, in some embodiments, a default field angle between 63 and 65 degrees (e.g., 64 degrees) gives better results; the present disclosure is not limited thereto, and the default field angle may be determined according to the actual situation.
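A minimal Python sketch of this rule follows; the helper names and the example principal point are ours, while the 45-80 degree range and the 64-degree default come from the embodiments above.

```python
import math

SET_FOV_RANGE = (45.0, 80.0)  # the set field angle range from the embodiment
DEFAULT_FOV = 64.0            # default field angle within the 63-65 degree band

def target_fov_first_os(preset_fov: float) -> float:
    # Trust the interface-provided field angle only when it is plausible;
    # otherwise fall back to the statistically derived default.
    lo, hi = SET_FOV_RANGE
    return preset_fov if lo <= preset_fov <= hi else DEFAULT_FOV

def intrinsic_from_fov(fov_deg: float, x0: float, y0: float, s: float = 0.0):
    # Build K from the target field angle, assuming fx == fy == f.
    f = max(x0, y0) / math.tan(math.radians(fov_deg) / 2.0)
    return [[f, s, x0], [0.0, f, y0], [0.0, 0.0, 1.0]]

# A device whose field angle interface returns an implausible 120 degrees
# falls back to the default field angle.
K = intrinsic_from_fov(target_fov_first_os(120.0), x0=540.0, y0=960.0)
```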
For example, in some embodiments, when the characteristic information of the electronic device indicates that the operating system of the electronic device is the second operating system, step S12 may include: in response to the electronic device having an ARKit function, acquiring, according to the parameter acquisition rule, the internal reference matrix of the image capturing device provided by the ARKit software implementing the ARKit function in the electronic device as the conversion parameter; and in response to the electronic device not having the ARKit function and the model of the electronic device falling into a preset model list, acquiring, according to the parameter acquisition rule, the measured field angle corresponding to the electronic device from the field angle lookup table as the target field angle, and determining the conversion parameters according to the target field angle.
For example, in some embodiments of the present disclosure, when the operating system of the electronic device is the second operating system, the parameter obtaining rule represents: when the electronic device has an ARKit function, acquiring an internal reference matrix of an image shooting device provided by ARKit software for realizing the ARKit function in the electronic device as a conversion parameter; and when the electronic device does not have the ARKit function and the model of the electronic device falls into a preset model list, acquiring a measured field angle corresponding to the electronic device from the field angle lookup table as a target field angle, and determining conversion parameters according to the target field angle.
For example, in this embodiment, all the measured field angles in the field angle lookup table are field angles measured and calculated with the calibrateCamera function in the OpenCV library for all electronic devices that have the second operating system but lack the ARKit function. The preset model list is the list of models corresponding to all the measured field angles in the field angle lookup table.
At present, for the landmark AR special effect, among all electronic devices with the second operating system, devices of model iPhone 8 and later support and can turn on the ARKit function. Therefore, for a device of model iPhone 8 or later, the internal reference matrix of the image capturing device provided by the ARKit software implementing the ARKit function in the device is acquired directly as the conversion parameter. Devices older than the iPhone 8 do not support the ARKit function, so their field angles can be measured in advance to build the field angle lookup table. In this case, the preset model list may represent the list of models older than the iPhone 8.
For example, in other embodiments, when the feature information of the electronic device indicates that the operating system of the electronic device is the second operating system and the model of the electronic device falls into the preset model list, step S12 may include: and according to the parameter acquisition rule, acquiring a measurement field angle corresponding to the electronic device from the field angle lookup table as a target field angle, and determining the conversion parameter according to the target field angle.
For example, in some embodiments of the present disclosure, when the operating system of the electronic device is the second operating system, the parameter obtaining rule indicates: and directly acquiring a measurement field angle corresponding to the electronic device from the field angle lookup table as a target field angle, and determining conversion parameters according to the target field angle.
For example, the field angle lookup table is constructed by measuring in advance the field angles of electronic devices that have the second operating system and whose models fall into the preset model list. For example, in this embodiment, all the measured field angles in the field angle lookup table are field angles measured and calculated with the calibrateCamera function in the OpenCV library for such devices. In this case, the preset model list may be a list of all models with the second operating system.
For example, even for an electronic device that has the second operating system and the ARKit function, the field angles of all devices with the second operating system can be measured in advance to construct the field angle lookup table; in use, the corresponding field angle is then simply looked up by device model, without having the ARKit software that implements the ARKit function provide the internal reference matrix of the image capturing device. A field angle measured with the calibrateCamera function in the OpenCV library is accurate and close to the device's actual field angle, so when the device renders the landmark AR special effect based on a field angle from the lookup table, the AR special effect can be aligned more accurately with the landmark building in the video frame, giving the user a better visual effect of the AR special effect.
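A sketch of this rule for the second operating system follows; the lookup-table entries are hypothetical placeholders, not the patent's measurements, and the helper names are ours.

```python
import math

def intrinsic_from_fov(fov_deg: float, x0: float, y0: float):
    # Same construction as in the previous sketch: fx == fy == f.
    f = max(x0, y0) / math.tan(math.radians(fov_deg) / 2.0)
    return [[f, 0.0, x0], [0.0, f, y0], [0.0, 0.0, 1.0]]

# Placeholder entries for models without the ARKit path.
FOV_LOOKUP = {"iPhone 6": 58.0, "iPhone 7": 59.0}

def conversion_params_second_os(model: str, has_arkit: bool, arkit_k=None):
    if has_arkit and arkit_k is not None:
        return arkit_k  # the ARKit software supplies K directly
    if model in FOV_LOOKUP:  # the model falls into the preset model list
        return intrinsic_from_fov(FOV_LOOKUP[model], x0=540.0, y0=960.0)
    raise KeyError(f"no measured field angle recorded for model {model!r}")
```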
For example, the internal reference matrix corresponding to the image capturing device is expressed as:

K = [ fx  s   x0 ]
    [ 0   fy  y0 ]
    [ 0   0   1  ]

where K is the internal reference matrix, fx and fy are the focal lengths corresponding to the image capturing device, x0 and y0 are the optical axis offsets of the image capturing device, and s is the axis skew. Assuming the focal lengths in the x and y directions are the same, i.e., fx = fy = f, the focal length corresponding to the image capturing device can be expressed as:

fx = fy = max(x0, y0) / tan(fov / 2),

where max() takes the maximum value, tan() takes the tangent, and fov denotes the target field angle.
At least one embodiment of the present disclosure further provides a media data processing method. Fig. 2 is a schematic diagram of a media data processing method according to an embodiment of the disclosure.
For example, the media data processing method may be applied to an electronic device, as shown in fig. 2, and the media data processing method includes:
step S20: obtaining a three-dimensional model of the reconstructed object;
step S21: mapping the three-dimensional model into a virtual three-dimensional space;
step S22: acquiring a conversion parameter corresponding to the electronic device according to the parameter acquisition method;
step S23: and projecting the three-dimensional model from the virtual three-dimensional space to a display screen of the electronic device to display the three-dimensional model based on the corresponding conversion parameters of the electronic device.
For example, the media data may include video, images, text, and the like.
For example, in step S20, a three-dimensional model of the object may be reconstructed by any of various suitable three-dimensional reconstruction techniques. For example, the three-dimensional reconstruction technique may include a vision-based SfM (structure from motion) algorithm or the like.
For example, the object may be an outdoor object, such as a landmark building, or an indoor object, such as a table, a cabinet, or the like.
For example, the three-dimensional model may be a three-dimensional point cloud model of the object.
For example, the image capture device may be a camera, and the three-dimensional model may be mapped onto the display screen in a manner similar to the camera calibration. For example, in the process of mapping the three-dimensional model to the display screen, the camera coordinate system is a coordinate system established with reference to the camera, the image coordinate system and the pixel coordinate system are both coordinate systems established with reference to the display screen, the world coordinate system is a coordinate system in which the three-dimensional model is located, and the world coordinate system is a coordinate system set in the process of reconstructing the three-dimensional model of the object. For example, the origin of the camera coordinate system may be located on the camera optical center (i.e., the projection center), and the origin of the image coordinate system may be located on the intersection of the camera's primary optical axis and the imaging plane. The Z-axis of the camera coordinate system may be the main optical axis of the camera, and the X-axis and the Y-axis of the camera coordinate system are parallel to the X-axis and the Y-axis of the image coordinate system, respectively. The X-axis and Y-axis of the image coordinate system are also parallel to the U-axis and V-axis of the pixel coordinate system, respectively. The pixel coordinates of each point in the pixel coordinate system represent the number of columns and rows of pixels.
In mapping the three-dimensional model to the display screen, the three-dimensional model is first converted from the world coordinate system into the camera coordinate system, then from the camera coordinate system into the image coordinate system, and finally from the image coordinate system into the pixel coordinate system.
For example, the virtual three-dimensional space may represent the space described by the camera coordinate system. The virtual three-dimensional space may be a normalized virtual three-dimensional space, that is, the ranges of the virtual three-dimensional space on the X-axis, the Y-axis, and the Z-axis of the camera coordinate system are all -1 to 1.
For example, a three-dimensional model of an object is in the world coordinate system.
For example, in step S21, the three-dimensional model is converted from the world coordinate system into the camera coordinate system, and in step S23, the three-dimensional model is converted from the camera coordinate system into the image coordinate system and then into the pixel coordinate system.
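As an illustration of the world-to-camera step, the following sketch uses hypothetical extrinsic parameters R and t; they are assumptions for the example, not values from the patent.

```python
import numpy as np

# Hypothetical extrinsics: the camera axes are assumed aligned with the
# world axes, and the model sits five units in front of the camera along
# its main optical axis.
R = np.eye(3)
t = np.array([0.0, 0.0, 5.0])

def world_to_camera(p_world):
    # Step S21: world coordinate system -> camera coordinate system.
    return R @ np.asarray(p_world, dtype=float) + t
```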
For example, in step S23, for an arbitrary point A1 = (x, y, z) in the three-dimensional model, point A1 is converted into the image coordinate system by a projection transformation expressed by the following equation:

A2 = K * A1.

After the projection transformation, A2 = (x1, y1, z1) is obtained; normalizing by the z1 value then yields the two-dimensional coordinates (x1/z1, y1/z1) in the image coordinate system of the display screen, which are subsequently converted into the pixel coordinate system.
For example, K is the internal reference matrix obtained by the parameter acquisition method provided by at least one embodiment of the present disclosure, so K is more accurate and closer to the actual internal reference matrix of the electronic device. For example, K is expressed as:

K = [ fx  s   x0 ]
    [ 0   fy  y0 ]
    [ 0   0   1  ]

where fx and fy are the focal lengths corresponding to the image capturing device, fx = fy = max(x0, y0) / tan(fov / 2), fov denotes the target field angle, x0 and y0 are the optical axis offsets of the image capturing device, and s is the axis skew.
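A minimal sketch of this projection step, assuming the example K below (computed for a 64-degree field angle on a 1080x1920 frame with zero axis skew):

```python
import numpy as np

def project_to_screen(K: np.ndarray, a1) -> tuple:
    # A2 = K * A1, then normalize by z1 to land on the screen plane.
    x1, y1, z1 = K @ np.asarray(a1, dtype=float)
    return (x1 / z1, y1 / z1)

# Assumed K: 64-degree field angle, optical axis at the center of a
# 1080x1920 frame, zero axis skew.
K = np.array([[1536.3, 0.0, 540.0],
              [0.0, 1536.3, 960.0],
              [0.0, 0.0, 1.0]])
print(project_to_screen(K, (0.1, -0.2, 2.0)))  # two-dimensional screen coords
```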
For example, in some embodiments, the media data processing method further comprises: acquiring a virtual special effect model; and superposing the virtual special effect model on the three-dimensional model for displaying, thereby realizing augmented reality display.
For example, in some embodiments, the virtual effect model is an augmented reality effect model, which may include virtual effects such as text, images, three-dimensional models, music, video, and so forth.
At least one embodiment of the present disclosure further provides a parameter obtaining apparatus, and fig. 3 is a schematic block diagram of a parameter obtaining apparatus provided in at least one embodiment of the present disclosure.
For example, as shown in fig. 3, the parameter acquiring device 30 includes a processor 300 and a memory 310. It should be noted that the components of the parameter obtaining apparatus 30 shown in fig. 3 are only exemplary and not limiting, and the parameter obtaining apparatus 30 may have other components according to the actual application requirement.
For example, the processor 300 and the memory 310 may be in direct or indirect communication with each other.
For example, the processor 300 and the memory 310 may communicate over a network connection. The network may include a wireless network, a wired network, and/or any combination of wireless and wired networks. The processor 300 and the memory 310 may also communicate with each other via a system bus, which is not limited by the present disclosure.
For example, memory 310 is used to store computer readable instructions non-transiently. The processor 300 is configured to execute computer readable instructions, and when the computer readable instructions are executed by the processor 300, the parameter obtaining method according to any of the above embodiments is implemented. For specific implementation and related explanation of each step of the parameter obtaining method, reference may be made to the above embodiments of the parameter obtaining method, which are not described herein again.
For example, the processor 300 and the memory 310 may be located on a server side (or cloud side).
For example, the processor 300 may control other components in the parameter acquisition device 30 to perform desired functions. The processor 300 may be a Central Processing Unit (CPU), a Network Processor (NP), etc.; but may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic device, discrete hardware components. The Central Processing Unit (CPU) may be an X86 or ARM architecture, etc.
For example, memory 310 may include any combination of one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. Volatile memory can include, for example, Random Access Memory (RAM), cache memory (or the like). The non-volatile memory may include, for example, Read Only Memory (ROM), a hard disk, an Erasable Programmable Read Only Memory (EPROM), a portable compact disc read only memory (CD-ROM), USB memory, flash memory, and the like. One or more computer-readable instructions may be stored on the computer-readable storage medium and executed by the processor 300 to implement the various functions of the parameter acquisition device 30. Various application programs and various data and the like can also be stored in the storage medium.
For example, for the detailed description of the process of the parameter obtaining apparatus 30 executing the parameter obtaining method, reference may be made to the related description in the embodiment of the parameter obtaining method, and repeated descriptions are omitted here.
Fig. 4 is a schematic diagram of a non-transitory computer-readable storage medium according to at least one embodiment of the disclosure. For example, as shown in FIG. 4, the storage medium 400 is a non-transitory computer-readable storage medium on which one or more computer-readable instructions 410 may be stored non-transitorily. For example, the computer-readable instructions 410, when executed by a processor, may perform one or more steps of the parameter acquisition method described above.
For example, the storage medium 400 may be applied to the parameter acquisition apparatus 30 described above. For example, the storage medium 400 may include the memory 310 in the parameter acquisition device 30.
For example, the description of the storage medium 400 may refer to the description of the memory 310 in the embodiment of the parameter obtaining apparatus 30, and repeated descriptions are omitted.
Fig. 5 shows a schematic structural diagram of an electronic device (for example, the electronic device may include the parameter obtaining apparatus or the electronic apparatus described in the above embodiments) 600 suitable for implementing the embodiments of the present disclosure. The electronic devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., car navigation terminals), and the like, and fixed terminals such as digital TVs, desktop computers, and the like. The electronic device shown in fig. 5 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 5, the electronic device 600 may include a processing device (e.g., a central processing unit, a graphics processor, etc.) 601, which may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 602 or a program loaded from a storage device 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the electronic device 600 are also stored. The processing device 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.

Generally, the following devices may be connected to the I/O interface 605: input devices 606 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 607 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage devices 608 including, for example, magnetic tape, hard disk, etc.; and a communication device 609. The communication device 609 may allow the electronic device 600 to communicate with other devices wirelessly or by wire to exchange data. While fig. 5 illustrates an electronic device 600 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.

In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program carried on a non-transitory computer-readable medium, the computer program containing program code for performing the method illustrated by the flowchart. In such embodiments, the computer program may be downloaded and installed from a network through the communication device 609, or installed from the storage device 608, or installed from the ROM 602. The computer program, when executed by the processing device 601, performs the above-described functions defined in the methods of the embodiments of the present disclosure.
It should be noted that in the context of this disclosure, a computer-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable medium may be a computer readable signal medium or a computer readable storage medium or any combination of the two. The computer-readable storage medium may be, for example, but is not limited to: an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, clients and servers may communicate using any currently known or future-developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with digital data communication in any form or medium (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future-developed network.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages or a combination thereof, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or by hardware. The name of a unit does not, in some cases, constitute a limitation on the unit itself.
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
According to one or more embodiments of the present disclosure, a parameter obtaining method includes: acquiring characteristic information of an electronic device, wherein the electronic device comprises an image shooting device; determining a parameter acquisition rule corresponding to the electronic device according to the characteristic information of the electronic device; and determining conversion parameters corresponding to the electronic device according to the parameter acquisition rule, wherein the conversion parameters comprise an internal reference matrix corresponding to the image shooting device.
According to one or more embodiments of the present disclosure, the feature information of the electronic device includes an operating system and a model of the electronic device.
According to one or more embodiments of the present disclosure, when the feature information of the electronic device indicates that the operating system of the electronic device is the first operating system, determining a conversion parameter corresponding to the electronic device according to a parameter acquisition rule includes: acquiring a preset field angle provided by a field angle interface of the electronic device according to the parameter acquisition rule: responding to the preset field angle within the set field angle range, and taking the preset field angle as a target field angle according to the parameter acquisition rule; in response to the fact that the preset field angle is not within the set field angle range, taking the default field angle as a target field angle according to the parameter acquisition rule; and determining conversion parameters according to the target field angle.
According to one or more embodiments of the present disclosure, the field angle range is set to a range of 45 degrees to 80 degrees.
According to one or more embodiments of the present disclosure, the default angle of view is calculated by counting and calculating a preset angle of view provided by angle of view interfaces of a plurality of electronic devices having the first operating system, and is 63 to 65 degrees.
According to one or more embodiments of the present disclosure, when the feature information of the electronic device indicates that the operating system of the electronic device is the second operating system, determining a conversion parameter corresponding to the electronic device according to a parameter acquisition rule includes: responding to that the electronic device has an ARKit function, and acquiring an internal reference matrix of an image shooting device provided by ARKit software for realizing the ARKit function in the electronic device as a conversion parameter according to a parameter acquisition rule; and responding to the fact that the electronic device does not have the ARKit function and the model of the electronic device falls into a preset model list, acquiring a measured field angle corresponding to the electronic device from the field angle lookup table as a target field angle according to the parameter acquisition rule, and determining conversion parameters according to the target field angle.
According to one or more embodiments of the present disclosure, when the feature information of the electronic device indicates that the operating system of the electronic device is the second operating system and the model of the electronic device falls into the preset model list, determining a conversion parameter corresponding to the electronic device according to a parameter obtaining rule includes: and according to the parameter acquisition rule, acquiring a measurement field angle corresponding to the electronic device from the field angle lookup table as a target field angle, and determining the conversion parameter according to the target field angle.
According to one or more embodiments of the present disclosure, the viewing angle lookup table is constructed by measuring in advance the viewing angle of the electronic device having the second operating system and the model falling into a preset model list.
According to one or more embodiments of the present disclosure, the internal reference matrix corresponding to the image capturing device is expressed as:

K = [ fx  s   x0 ]
    [ 0   fy  y0 ]
    [ 0   0   1  ]

where K is the internal reference matrix, fx and fy are the focal lengths corresponding to the image capturing device, x0 and y0 are the optical axis offsets of the image capturing device, s is the axis skew,

fx = fy = max(x0, y0) / tan(fov / 2),

where max() takes the maximum value, tan() takes the tangent, and fov denotes the target field angle.
According to one or more embodiments of the present disclosure, an image photographing device includes a camera.
According to one or more embodiments of the present disclosure, a media data processing method is applied to an electronic device and includes: obtaining a three-dimensional model of a reconstructed object; mapping the three-dimensional model into a virtual three-dimensional space; acquiring the conversion parameters corresponding to the electronic device according to the parameter acquisition method provided by any embodiment of the present disclosure; and projecting the three-dimensional model from the virtual three-dimensional space onto a display screen of the electronic device for display, based on the conversion parameters corresponding to the electronic device.
According to one or more embodiments of the present disclosure, the media data processing method further includes: acquiring a virtual special effect model; and superposing the virtual special effect model on the three-dimensional model for displaying.
According to one or more embodiments of the present disclosure, the virtual special effect model is an augmented reality special effect model.
According to one or more embodiments of the present disclosure, a parameter obtaining apparatus includes: a memory for non-transitory storage of computer readable instructions; a processor for executing the computer readable instructions, wherein the computer readable instructions, when executed by the processor, implement the parameter obtaining method according to any embodiment of the present disclosure.
According to one or more embodiments of the present disclosure, a non-transitory computer-readable storage medium stores computer-readable instructions which, when executed by a processor, implement a parameter acquisition method provided according to any one of the embodiments of the present disclosure.
The foregoing description covers only the preferred embodiments of the present disclosure and explains the technical principles employed. It will be appreciated by those skilled in the art that the scope of the disclosure is not limited to technical solutions formed by the particular combinations of the features described above, but also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the concept of the disclosure, for example, technical solutions in which the above features are replaced with (but not limited to) features with similar functions disclosed in this disclosure.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
The following points should also be noted with respect to the present disclosure:
(1) The drawings of the embodiments of the present disclosure relate only to the structures involved in these embodiments; for other structures, reference may be made to common designs.
(2) For clarity, the thicknesses and dimensions of layers or structures may be exaggerated in the drawings used to describe the embodiments of the present disclosure. It will be understood that when an element such as a layer, film, region, or substrate is referred to as being "on" or "under" another element, it can be directly "on" or "under" the other element, or intervening elements may be present.
(3) Without conflict, embodiments of the present disclosure and features of the embodiments may be combined with each other to arrive at new embodiments.
The above description covers only specific embodiments of the present disclosure, but the scope of protection of the present disclosure is not limited thereto and shall be subject to the scope of the claims.

Claims (15)

1. A parameter acquisition method, comprising:
acquiring feature information of an electronic device, wherein the electronic device comprises an image capturing device;
determining a parameter acquisition rule corresponding to the electronic device according to the feature information of the electronic device; and
determining a conversion parameter corresponding to the electronic device according to the parameter acquisition rule, wherein the conversion parameter comprises an internal reference matrix corresponding to the image capturing device.
2. The parameter acquisition method according to claim 1, wherein the feature information of the electronic device comprises an operating system and a model of the electronic device.
3. The parameter acquisition method according to claim 2, wherein, when the feature information of the electronic device indicates that the operating system of the electronic device is a first operating system,
determining the conversion parameter corresponding to the electronic device according to the parameter acquisition rule comprises:
acquiring a preset field angle provided by a field angle interface of the electronic device according to the parameter acquisition rule;
in response to the preset field angle being within a set field angle range, taking the preset field angle as a target field angle according to the parameter acquisition rule; in response to the preset field angle not being within the set field angle range, taking a default field angle as the target field angle according to the parameter acquisition rule; and
determining the conversion parameter according to the target field angle.
4. The parameter acquisition method according to claim 3, wherein the set field angle range is a range of 45 degrees to 80 degrees.
5. The parameter acquisition method according to claim 3, wherein the default field angle is obtained by statistically processing preset field angles provided by field angle interfaces of a plurality of electronic devices having the first operating system, and the default field angle is between 63 degrees and 65 degrees.
6. The parameter acquisition method according to claim 2, wherein, when the feature information of the electronic device indicates that the operating system of the electronic device is a second operating system,
determining the conversion parameter corresponding to the electronic device according to the parameter acquisition rule comprises:
in response to the electronic device having an ARKit function, acquiring, according to the parameter acquisition rule, an internal reference matrix of the image capturing device provided by ARKit software implementing the ARKit function in the electronic device as the conversion parameter; and
in response to the electronic device not having the ARKit function and the model of the electronic device falling into a preset model list, acquiring, according to the parameter acquisition rule, a measured field angle corresponding to the electronic device from a field angle lookup table as a target field angle, and determining the conversion parameter according to the target field angle.
7. The parameter acquisition method according to claim 2, wherein when the feature information of the electronic device indicates that the operating system of the electronic device is a second operating system and the model of the electronic device falls into a preset model list,
determining the conversion parameter corresponding to the electronic device according to the parameter acquisition rule comprises:
acquiring, according to the parameter acquisition rule, a measured field angle corresponding to the electronic device from a field angle lookup table as a target field angle, and determining the conversion parameter according to the target field angle.
8. The parameter acquisition method according to claim 6 or 7, wherein the field angle lookup table is constructed by measuring in advance the field angles of electronic devices that have the second operating system and whose models fall into the preset model list.
9. The parameter acquisition method according to any one of claims 3 to 7, wherein the internal reference matrix corresponding to the image capturing device is represented as:

K = | fx  s   x0 |
    | 0   fy  y0 |
    | 0   0   1  |

wherein K is the internal reference matrix, fx and fy are the focal lengths corresponding to the image capturing device, x0 and y0 are the optical axis offsets of the image capturing device, and s is the axis skew, with

fx = fy = max(x0, y0) / tan(fov/2),

wherein max() denotes taking the maximum value, tan() denotes taking the tangent value, and fov denotes the target field angle.
10. The parameter acquisition method according to any one of claims 1 to 7, wherein the image capturing device comprises a camera.
11. A media data processing method applied to an electronic device, wherein the media data processing method comprises:
obtaining a three-dimensional model of a reconstructed object;
mapping the three-dimensional model into a virtual three-dimensional space;
acquiring a conversion parameter corresponding to the electronic device according to the parameter acquisition method according to any one of claims 1 to 10; and
based on the conversion parameter corresponding to the electronic device, projecting the three-dimensional model from the virtual three-dimensional space onto a display screen of the electronic device to display the three-dimensional model.
12. The media data processing method of claim 11, further comprising:
acquiring a virtual special effect model;
superimposing the virtual special effect model on the three-dimensional model for display.
13. The media data processing method according to claim 12, wherein the virtual special effects model is an augmented reality special effects model.
14. A parameter acquisition apparatus, comprising:
a memory for non-transitory storage of computer readable instructions;
a processor for executing the computer readable instructions, wherein the computer readable instructions, when executed by the processor, implement the parameter acquisition method according to any one of claims 1 to 10.
15. A non-transitory computer readable storage medium, wherein the non-transitory computer readable storage medium stores computer readable instructions which, when executed by a processor, implement the parameter acquisition method according to any one of claims 1 to 10.
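Taken together, claims 3 to 7 describe a dispatch over the feature information of the electronic device. The sketch below is a non-authoritative illustration of that flow, not the claimed method itself: query_fov_interface and get_arkit_intrinsics are hypothetical stand-ins for platform calls (the latter standing in for the internal reference matrix that ARKit software would supply), the 64-degree default is an assumed value within the 63-65 degree band of claim 5, the resolution arguments are placeholders, and the helpers internal_reference_matrix and lookup_measured_field_angle are the sketches given in the embodiments above.

```python
SET_FIELD_ANGLE_RANGE = (45.0, 80.0)  # degrees, per claim 4
DEFAULT_FIELD_ANGLE = 64.0            # degrees, assumed within the 63-65 band

def target_field_angle_first_os(preset_fov: float) -> float:
    """Claims 3-5: use the preset field angle if it lies within the set
    range, otherwise fall back to the statistically derived default."""
    low, high = SET_FIELD_ANGLE_RANGE
    return preset_fov if low <= preset_fov <= high else DEFAULT_FIELD_ANGLE

def conversion_parameter(os_name, model, has_arkit,
                         query_fov_interface, get_arkit_intrinsics,
                         width=1920.0, height=1080.0):
    """Dispatch on the feature information, as in claims 3 and 6."""
    if os_name == "first_os":
        fov = target_field_angle_first_os(query_fov_interface())
        return internal_reference_matrix(width, height, fov)
    if os_name == "second_os":
        if has_arkit:
            return get_arkit_intrinsics()  # K provided directly by ARKit
        fov = lookup_measured_field_angle(model)
        if fov is not None:                # model falls into the preset list
            return internal_reference_matrix(width, height, fov)
    raise ValueError("no parameter acquisition rule for this device")
```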
CN202011011407.6A 2020-09-23 2020-09-23 Parameter acquisition method and device, media data processing method and storage medium Active CN112132909B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011011407.6A CN112132909B (en) 2020-09-23 2020-09-23 Parameter acquisition method and device, media data processing method and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011011407.6A CN112132909B (en) 2020-09-23 2020-09-23 Parameter acquisition method and device, media data processing method and storage medium

Publications (2)

Publication Number Publication Date
CN112132909A true CN112132909A (en) 2020-12-25
CN112132909B CN112132909B (en) 2023-12-22

Family

ID=73839191

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011011407.6A Active CN112132909B (en) 2020-09-23 2020-09-23 Parameter acquisition method and device, media data processing method and storage medium

Country Status (1)

Country Link
CN (1) CN112132909B (en)

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10402938B1 (en) * 2016-03-31 2019-09-03 Gopro, Inc. Systems and methods for modifying image distortion (curvature) for viewing distance in post capture
CN106803286A (en) * 2017-01-17 2017-06-06 湖南优象科技有限公司 Mutual occlusion real-time processing method based on multi-view image
CN107607295A (en) * 2017-09-30 2018-01-19 华勤通讯技术有限公司 A kind of visual field angle measuring device and method
CN108510541A (en) * 2018-03-28 2018-09-07 联想(北京)有限公司 A kind of information adjusting method, electronic equipment and computer readable storage medium
CN110610523A (en) * 2018-06-15 2019-12-24 杭州海康威视数字技术股份有限公司 Automobile look-around calibration method and device and computer readable storage medium
CN111325798A (en) * 2018-12-13 2020-06-23 浙江宇视科技有限公司 Camera model correction method and device, AR implementation equipment and readable storage medium
CN110047039A (en) * 2019-02-28 2019-07-23 中国人民解放军军事科学院国防科技创新研究院 A kind of redundancy visual field full-view image construction method of Virtual reality interaction
CN110221690A (en) * 2019-05-13 2019-09-10 Oppo广东移动通信有限公司 Gesture interaction method and device, storage medium, communication terminal based on AR scene
CN111656403A (en) * 2019-06-27 2020-09-11 深圳市大疆创新科技有限公司 Method and device for tracking target and computer storage medium
CN110675348A (en) * 2019-09-30 2020-01-10 杭州栖金科技有限公司 Augmented reality image display method and device and image processing equipment
CN111246089A (en) * 2020-01-14 2020-06-05 Oppo广东移动通信有限公司 Jitter compensation method and apparatus, electronic device, computer-readable storage medium
CN111461994A (en) * 2020-03-30 2020-07-28 苏州科达科技股份有限公司 Method for obtaining coordinate transformation matrix and positioning target in monitoring picture
CN111508033A (en) * 2020-04-20 2020-08-07 腾讯科技(深圳)有限公司 Camera parameter determination method, image processing method, storage medium, and electronic apparatus

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113067984A (en) * 2021-03-30 2021-07-02 Oppo广东移动通信有限公司 Binocular shooting correction method, binocular shooting correction device and electronic equipment
CN113067984B (en) * 2021-03-30 2023-01-17 Oppo广东移动通信有限公司 Binocular shooting correction method, binocular shooting correction device and electronic equipment

Also Published As

Publication number Publication date
CN112132909B (en) 2023-12-22


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant