CN111986333B - Image generation method, device, terminal and storage medium based on augmented reality - Google Patents


Publication number
CN111986333B
CN111986333B (application CN202010906480.3A)
Authority
CN
China
Prior art keywords
image
positioning
mobile terminal
augmented reality
data
Prior art date
Legal status (assumed; not a legal conclusion)
Active
Application number
CN202010906480.3A
Other languages
Chinese (zh)
Other versions
CN111986333A (en)
Inventor
陈彪
Current Assignee (as listed; may be inaccurate)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (assumed; not a legal conclusion)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202010906480.3A
Publication of CN111986333A
Application granted
Publication of CN111986333B

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 — Manipulating 3D models or images for computer graphics
    • G06T19/006 — Mixed reality
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application discloses an image generation method, device, terminal and storage medium based on augmented reality, belonging to the field of computer technology. The application provides a platform device that independently acquires positioning images of a photographed object, derives the spatial position information of the object from those positioning images, combines this information with an appearance image of the object acquired by the platform device, processes the whole into standardized data, and transmits the standardized data to a mobile terminal, which renders and displays an augmented reality image. Because the platform device acquires both the spatial position information and the appearance image of the photographed object, the mobile terminal does not need to acquire the spatial position information itself. An augmented reality application therefore only needs to process standardized data during development, which reduces both the difficulty of developing the application and the difficulty of deploying it across mobile terminals of different models.

Description

Image generation method, device, terminal and storage medium based on augmented reality
Technical Field
The embodiment of the application relates to the technical field of computers, in particular to an image generation method, device, terminal and storage medium based on augmented reality.
Background
AR (augmented reality) technology fuses virtual information with the real world for display. It involves multimedia, three-dimensional modeling, real-time tracking and registration, intelligent interaction, sensing, and the like.
In the related art, a device integrating AR technology first performs data acquisition on a real scene through a camera in an AR head-mounted display, then transmits the data to a processor for analysis and reconstruction, and finally displays the result through the AR head-mounted display.
Disclosure of Invention
The embodiment of the application provides an image generation method, device, terminal and storage medium based on augmented reality. The technical scheme is as follows:
According to an aspect of the present application, there is provided an image generation method based on augmented reality, applied to a platform device, the method comprising:
acquiring at least two positioning images including a photographed object;
acquiring spatial position information of the photographed object according to the at least two positioning images;
obtaining an appearance image of the photographed object;
performing standardization processing on the spatial position information and the appearance image to obtain standardized data having an association relationship; and
transmitting the standardized data to a mobile terminal, wherein the standardized data is rendered in the mobile terminal and then displayed as an augmented reality image, the augmented reality image being an image formed by fusing the appearance image with a virtual image.
According to another aspect of the present application, there is provided an image generating apparatus based on augmented reality, applied to a platform device, the apparatus comprising:
a first acquisition module, used for acquiring at least two positioning images including a photographed object;
an information calculation module, used for calculating spatial position information of the photographed object according to the at least two positioning images;
a second acquisition module, used for acquiring an appearance image of the photographed object;
a standardization module, used for performing standardization processing on the spatial position information and the appearance image to obtain standardized data having an association relationship; and
a data transmission module, used for transmitting the standardized data to a mobile terminal, wherein the standardized data is rendered in the mobile terminal and then displayed as an augmented reality image, the augmented reality image being an image formed by fusing the appearance image with a virtual image.
According to another aspect of the present application, there is provided a terminal comprising a processor and a memory having stored therein at least one instruction that is loaded and executed by the processor to implement an augmented reality based image generation method as provided by the various aspects of the present application.
According to another aspect of the present application, there is provided a computer readable storage medium having stored therein at least one instruction that is loaded and executed by a processor to implement an augmented reality based image generation method as provided by the various aspects of the present application.
According to one aspect of the present application, a computer program product is provided that includes computer instructions stored in a computer-readable storage medium. A processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them, causing the computer device to perform the methods provided in the various optional implementations of the augmented reality-based image generation aspect described above.
The application provides a platform device that independently acquires positioning images of a photographed object, derives the spatial position information of the object from those positioning images, combines this information with an appearance image of the object acquired by the platform device, processes the whole into standardized data, and transmits the standardized data to a mobile terminal, which renders and displays an augmented reality image. Because the platform device acquires both the spatial position information and the appearance image of the photographed object, the mobile terminal does not need to acquire the spatial position information itself. An augmented reality application therefore only needs to process standardized data during development, which reduces both the difficulty of developing the application and the difficulty of deploying it across mobile terminals of different models.
Drawings
In order to more clearly describe the technical solutions of the embodiments of the present application, the drawings required for the description of the embodiments of the present application will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person of ordinary skill in the art.
Fig. 1 is a block diagram of an image display apparatus provided in an exemplary embodiment of the present application;
FIG. 2 is a schematic front view of a platform device according to an embodiment of the present application;
FIG. 3 is a schematic back view of a platform device according to an embodiment of the present application;
FIG. 4 is a schematic back view of a platform device according to another embodiment of the present application;
FIG. 5 is a schematic diagram of data interaction between a platform device and an image display device according to an embodiment of the present application;
FIG. 6 is a flowchart of an image generation method based on augmented reality according to an exemplary embodiment of the present application;
FIG. 7 is a flowchart of an image generation method based on augmented reality according to another exemplary embodiment of the present application;
fig. 8 is a block diagram illustrating a structure of an image generating apparatus based on augmented reality according to an exemplary embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail with reference to the accompanying drawings.
When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the application. Rather, they are merely examples of apparatus and methods consistent with aspects of the application as detailed in the accompanying claims.
In the description of the present application, it should be understood that the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. It should also be noted that, unless explicitly specified and limited otherwise, the terms "mounted," "connected," and "coupled" are to be construed broadly: a connection may be fixed, detachable, or integral; mechanical or electrical; direct, or indirect through an intermediate medium. The specific meaning of the above terms in the present application will be understood in specific cases by those of ordinary skill in the art. Furthermore, in the description of the present application, unless otherwise indicated, "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships are possible; for example, "A and/or B" may mean: A exists alone, A and B exist together, or B exists alone. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship.
In order that the scheme shown in the embodiment of the present application can be easily understood, several terms appearing in the embodiment of the present application are described below.
Positioning camera: used for acquiring positioning images. When used for calculating spatial position information, positioning images are used in groups of at least two. For example, a set of positioning images may include two images, P1 and P2, or three images, P1, P2 and P3. It should be noted that the number of images in a set of positioning images may be odd or even, which is not limited in the embodiment of the present application.
Alternatively, a positioning image need not be an image normally viewed by the user; it may contain only reference pixels for measuring depth of field. In this example, the positioning image is used inside the platform device to calculate spatial position information and is not presented to the user. It should be noted that the positioning cameras are cameras installed in the platform device, and when positioning images are collected, the positioning cameras need to capture them at the same time. Optionally, each positioning camera captures one positioning image at the target moment.
Image acquisition camera: for collecting appearance information of a photographed object. Illustratively, the image capture camera may be an RGB (Red Green Blue) camera.
In a possible implementation, the image acquisition camera and the positioning camera arranged in the platform device are different cameras. The appearance image of the shot object is an image acquired by an image acquisition camera.
In another possible implementation, no image acquisition camera is provided in the platform device; the appearance image of the photographed object is captured by a positioning camera. In this scenario, the positioning camera also serves as an RGB camera.
Standardization processing: sorting data according to a preset format so that the image data or the spatial position information conforms to the preset format. The preset format may be designed for any of easy storage, convenient transmission, or resistance to leakage, which is not limited in the embodiment of the present application.
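The patent does not specify the preset format itself. As an illustration only, the following is a minimal sketch of one possible standardized record, assuming a fixed binary header that associates one appearance-image frame with one spatial position; every field name and the layout are hypothetical:

```python
import struct

# Hypothetical preset format (not specified in the patent): a fixed
# little-endian header associating an appearance-image frame with a
# spatial position, so the mobile terminal can parse it uniformly.
HEADER = struct.Struct("<IHHfff")  # frame id, width, height, X, Y, Z


def standardize(frame_id, width, height, position, rgb_bytes):
    """Bundle spatial position information and appearance-image bytes
    into one standardized record with an association relationship."""
    x, y, z = position
    return HEADER.pack(frame_id, width, height, x, y, z) + rgb_bytes


def parse(record):
    """Inverse of standardize(): recover the fields and pixel data."""
    frame_id, w, h, x, y, z = HEADER.unpack_from(record)
    return frame_id, (w, h), (x, y, z), record[HEADER.size:]
```

A real format would likely carry per-pixel or per-feature depth rather than a single position, but the round-trip shape — pack on the platform device, parse on the mobile terminal — would be the same.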
Data transmission interface: used for connecting the platform device and the image display device so that data interaction can be carried out between them. In one aspect, the data transmission interface is configured to transmit data from the platform device to the image display device; in another aspect, it is configured to transmit data from the image display device to the platform device. In an actual implementation, these two functions may be provided according to the requirements of the platform device: for example, the platform device may only be able to send data to the image display device through the data transmission interface, or it may include both functions.
In one possible implementation, the data transmission interface may be implemented as a wired interface. The wired interface may be a data-only interface, a power-only interface, or an interface that transmits both data and power.
When the wired interface can transmit both data and power, it may include a plurality of contacts, of which designated contacts are used for transmitting data and the remaining contacts are used for transmitting electric power.
In another possible implementation, the data transmission interface may also be implemented as a wireless air interface.
The augmented reality-based image generation method shown in the embodiment of the application can be applied to a platform device, which may be a game handle with a groove on its front surface or another type of handle. The surface of the platform device opposite to the groove is provided with two positioning cameras, at least one image acquisition camera, and the like. A processor is also provided inside the platform device. The processor can receive, through a bus, the positioning images acquired by the positioning cameras as well as the appearance image of the photographed object acquired by the image acquisition camera.
Referring to fig. 1, fig. 1 is a block diagram of an image display device according to an exemplary embodiment of the present application. As shown in fig. 1, the image display device 100 includes a processor 120 and a memory 140, where at least one instruction is stored in the memory 140, and the instruction is loaded and executed by the processor 120 to implement the augmented reality-based image generation method according to the various method embodiments of the present application.
Processor 120 may include one or more processing cores. The processor 120 connects various parts within the overall terminal 100 using various interfaces and lines, and performs the functions of the terminal 100 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 140 and invoking data stored in the memory 140. Alternatively, the processor 120 may be implemented in at least one hardware form of digital signal processing (DSP), field-programmable gate array (FPGA), or programmable logic array (PLA). The processor 120 may integrate one or a combination of a central processing unit (CPU), a graphics processing unit (GPU), a modem, and the like. The CPU mainly handles the operating system, user interface, application programs, and the like; the GPU renders and draws the content to be displayed by the display screen; the modem handles wireless communication. It will be appreciated that the modem may instead not be integrated into the processor 120 and be implemented by a separate chip.
The memory 140 may include random access memory (RAM) or read-only memory (ROM). Optionally, the memory 140 includes a non-transitory computer-readable storage medium. Memory 140 may be used to store instructions, programs, code sets, or instruction sets. The memory 140 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing an operating system, instructions for at least one function (such as a touch function, a sound playing function, or an image playing function), and instructions for implementing the various method embodiments described below; the data storage area may store data referred to in the following method embodiments.
In one possible implementation, the mobile terminal may be a smart phone or another device having a processor and memory. Alternatively, the mobile terminal may be placed in the recess provided in the platform device. Illustratively, the recess is provided with a spring catch, a magnetic catch, or another form of catch so that the mobile terminal is firmly secured in the recess of the platform device.
Referring to fig. 2, fig. 2 is a schematic front view of a platform apparatus according to an embodiment of the application. In fig. 2, the platform device 200 includes a recess 210 and a function key 220, and the front surface may be a first side of the platform device. It should be noted that, when the data transmission interface is a wired interface, the wired interface 230 may be disposed on an inner wall of the groove.
In one possible approach, the wired interface 230 is disposed at a designated location on the inner wall of the recess 210. Illustratively, a mobile terminal may be disposed in the recess.
In another possible way, the position of the wired interface 230 on the inner wall of the recess 210 is flexibly adjustable. In this design, the wired interface 230 is connected to a bus inside the platform device by a metal rail surrounding the inner wall of the recess 210.
When the data transmission interface is a wireless interface, the antenna of the wireless interface may be disposed inside the housing in which the groove is located, or the antenna of the wireless interface may be disposed inside the housing of the inner wall.
Referring to fig. 3, fig. 3 is a schematic back view of a platform apparatus according to an embodiment of the application. In fig. 3, the platform apparatus 200 includes a positioning camera 241 and a positioning camera 242, and an image capturing camera 250.
The minimum distance between the positioning camera 241 and the positioning camera 242 is a distance threshold. The distance threshold is the smallest spacing that allows the platform device to determine spatial position information; for example, it may be a specified value such as 5 cm, 6 cm, or 7 cm. If the distance between the two positioning cameras is smaller than the distance threshold, the platform device cannot accurately measure spatial position information.
On the other hand, the longest line segment in the plane where the two positioning cameras are arranged is the diagonal of that rectangular area. Thus, in this example, the upper limit of the camera spacing is set to the length of the diagonal of the surface of the platform device on which the positioning cameras are disposed.
In one possible arrangement, the image capture camera 250 is disposed between the positioning camera 241 and the positioning camera 242, on the center line of the back of the platform device; the back may also be referred to as the second side.
Referring to fig. 4, fig. 4 is a schematic back view of a platform device according to another embodiment of the application. In fig. 4, the platform device 200 includes a positioning camera 241, a positioning camera 242, and an image capture camera 250. In this embodiment, the positioning camera 241 and the positioning camera 242 are disposed at two diagonally opposite corners. In addition, a holding area 261 and a holding area 262 are designed into the platform device so that, when holding the platform device 200, the user does not cover the positioning cameras and cause occlusion.
Referring to fig. 5, fig. 5 is a schematic diagram of data interaction between a platform device and a mobile terminal according to an embodiment of the present application. In fig. 5, data interaction is performed between the platform device 200 and the mobile terminal 100 through a data transmission interface.
In one aspect, for platform device 200, the platform device has a positioning camera, an image capture camera, function keys (for interaction), data processing capabilities, and other functions. Other functions may include a vibration function, a sound-producing function, a flashing function, or the like, among others.
On the other hand, for the mobile terminal 100, the mobile terminal has an AR system platform, an image display function, a virtual object generation function, a data processing function, and a man-machine interaction function.
Referring to FIG. 6, FIG. 6 is a flowchart of an image generation method based on augmented reality according to an exemplary embodiment of the present application. The method can be applied to the platform device shown above: the platform device further includes a processor and a data transmission interface, is provided on its first side with a groove for accommodating the mobile terminal, and is provided on its second side with two positioning cameras and at least one image acquisition camera. In FIG. 6, the augmented reality-based image generation method includes:
At step 610, at least two positioning images including a subject are acquired.
In the embodiment of the application, the platform device can acquire at least two positioning images including the photographed object. In one image acquisition method, the positioning images used for acquiring spatial position information of the same photographed object are captured at the same moment. For example, at time A, the platform device can acquire positioning image P3 through the first positioning camera and positioning image P4 through the second positioning camera. The positioning image P3 and the positioning image P4 belong to the same set of positioning images used for locating the photographed object.
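The simultaneous-capture requirement in the example above (P3 and P4 both taken at time A) can be sketched as a pairing step, assuming each captured frame is tagged with its camera index and timestamp; the tuple layout and tolerance are illustrative, not from the patent:

```python
def group_by_moment(captures, tolerance_s=0.005):
    """Pair frames from the two positioning cameras into sets by
    capture time. `captures` is a list of (camera_id, timestamp_s,
    frame) tuples, with camera_id 0 or 1; frames whose timestamps
    differ by at most `tolerance_s` form one positioning-image set."""
    cam0 = sorted((c for c in captures if c[0] == 0), key=lambda c: c[1])
    cam1 = sorted((c for c in captures if c[0] == 1), key=lambda c: c[1])
    sets, j = [], 0
    for _, t0, f0 in cam0:
        # Skip second-camera frames that are too old to match t0.
        while j < len(cam1) and cam1[j][1] < t0 - tolerance_s:
            j += 1
        if j < len(cam1) and abs(cam1[j][1] - t0) <= tolerance_s:
            sets.append((f0, cam1[j][2]))
            j += 1
    return sets
```

In practice this pairing would be done in hardware (a shared shutter trigger) rather than by timestamp matching, but the grouping semantics are the same.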
Step 620, acquiring spatial position information of the photographed object according to at least two positioning images.
In the embodiment of the application, the platform device can acquire at least two positioning images through the two positioning cameras and process them to obtain the spatial position information of the photographed object. In one possible approach, a binocular vision algorithm is built into the platform device. After the images captured by the two positioning cameras are acquired, the binocular vision algorithm processes them according to its prescribed steps and finally obtains the spatial position information of the photographed object.
It should be noted that the binocular vision method may perform positioning calibration for each pixel of the photographed object. For example, a specified pixel point has the coordinates (x1, y1) in the first positioning image and (x2, y2) in the second positioning image. Based on the binocular vision algorithm, the depth of the pixel point can be calculated from these two coordinate points and pre-calibrated camera parameters. Combined with the position of the pixel point in the horizontal and vertical directions, this yields the spatial position of the pixel point in space.
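For rectified cameras, the per-pixel depth computation described above reduces to the standard stereo disparity relation Z = f·B/d. A minimal sketch, assuming a known focal length in pixels, baseline in meters, and principal point (cx, cy) — none of these calibration values appear in the patent:

```python
def depth_from_disparity(x1, x2, focal_length_px, baseline_m):
    """Depth Z of a pixel from its horizontal coordinates in two
    rectified positioning images: Z = f * B / d, with d = x1 - x2."""
    disparity = x1 - x2
    if disparity <= 0:
        raise ValueError("images must be rectified and the point at finite depth")
    return focal_length_px * baseline_m / disparity


def spatial_position(x1, y1, x2, focal_length_px, baseline_m, cx, cy):
    """Back-project the pixel into 3-D camera coordinates (X, Y, Z),
    combining depth with the horizontal and vertical pixel position."""
    z = depth_from_disparity(x1, x2, focal_length_px, baseline_m)
    x = (x1 - cx) * z / focal_length_px
    y = (y1 - cy) * z / focal_length_px
    return (x, y, z)
```

With a 6 cm baseline and a 700 px focal length, a 42 px disparity places the pixel 1 m from the cameras, illustrating why a too-small baseline (below the distance threshold) makes depth estimates inaccurate: the disparity shrinks toward the noise floor.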
In one possible expression, the spatial location information will be represented in the form of coordinates.
In step 630, an appearance image of the subject is acquired.
In the embodiment of the application, the platform equipment can acquire the appearance image of the shot object in real time through the image acquisition camera. The object to be photographed may be a person, an animal, a plant, an article, a building, or the like.
In step 640, standardization processing is performed on the spatial position information and the appearance image to obtain standardized data having an association relationship.
In the embodiment of the application, after the platform device acquires the spatial position information and the appearance image, it can process this information to obtain standardized data. The standardization may include filtering or otherwise processing the data. In one possible approach, the standardization processing aims at least one of reducing latency when transmitting to the mobile terminal, reducing the amount of data, or improving imaging quality.
By way of example, the standardization processing may be cured (hard-wired) into the hardware of the platform device. Alternatively, it may be implemented by a software processing flow provided in the platform device.
In step 650, the standardized data is transmitted to the mobile terminal, where the standardized data is used for being rendered in the mobile terminal and displayed as an augmented reality image, and the augmented reality image is an image formed by fusing the appearance image and the virtual image.
In an embodiment of the application, the standardized data is sent to the mobile terminal. The mobile terminal may be another terminal installed in the platform device; when the platform device includes a storage groove, the mobile terminal can be clamped in it. The mobile terminal and the platform device can communicate in a wired or wireless manner.
The communication channel between the mobile terminal and the platform device may support both wired and wireless communication, and the user can choose according to convenience of use and the required transmission rate. For example, in one possible implementation, wired communication may occupy the external charging port of the mobile terminal and thus affect its charging; in this scenario, the mobile terminal may select wireless communication to transmit data. In another possible implementation, the transmission rate of wireless communication is lower than that of wired communication; if the user requires a higher communication rate, the user may select wired communication as the manner of communication between the platform device and the mobile terminal.
Illustratively, in one possible data transmission mode, the platform device transmits standardized data to the mobile terminal through the data transmission interface. Wherein the standardized data is used for an image displayed in an augmented reality form after being rendered in the mobile terminal.
A third-party application may run in the mobile terminal. The third-party application can integrate a corresponding AR system platform, which can generate virtual objects and combine them with the standardized data to synthesize and display images. For example, the AR system platform in the mobile terminal can obtain, from the standardized data, the spatial positions of the pixels to be displayed. On this basis, the AR system platform can superimpose the generated three-dimensional virtual object on the photographed object to form a new augmented reality image. It should be noted that, as the user operates, the form of the three-dimensional virtual object may change accordingly, so that the augmented reality image changes.
For example, a virtual pet superimposed on a real table surface grows larger after the user operates the virtual pet to feed it virtual food.
In summary, the augmented reality-based image generation method provided in this embodiment may be applied to a platform device. The platform device obtains at least two positioning images including a photographed object, obtains the spatial position information of the photographed object from those images, acquires an appearance image of the photographed object, performs standardization processing on the spatial position information and the appearance image, and transmits the resulting standardized data to a mobile terminal, where it is rendered and displayed as an augmented reality image. Because the platform device acquires both the spatial position information and the appearance image of the photographed object, the mobile terminal does not need to acquire the spatial position information itself. An augmented reality application therefore only needs to process standardized data during development, which reduces both the difficulty of developing the application and the difficulty of deploying it across mobile terminals of different models.
Based on the solution disclosed in the previous embodiment, the platform device can also implement the augmented reality-based image generation method in other ways; please refer to the following embodiments.
Referring to fig. 7, fig. 7 is a flowchart of an augmented reality-based image generation method according to another exemplary embodiment of the present application. The method may be applied to the platform device described above. In fig. 7, the augmented reality-based image generation method includes:
Step 711, acquiring at least one positioning image through each of the two positioning cameras.
In the embodiment of the application, the platform device includes two positioning cameras, and the distance between them is greater than or equal to a distance threshold and less than or equal to the length of the diagonal of the surface of the platform device on which the positioning cameras are arranged.
Step 712, acquiring spatial position information of the photographed object according to the at least two positioning images.
Illustratively, the platform device may be provided with a recess on the front side and two positioning cameras on the back side. The user can place the mobile terminal in the recess of the platform device and establish data communication between the platform device and the mobile terminal. After the user selects a photographed object, or after the platform device selects one automatically by focusing, the platform device acquires at least one positioning image through each of the two positioning cameras at the same time. In one possible implementation scenario, the first positioning camera and the second positioning camera each capture a positioning image at the same moment. The two positioning images are used to calculate depth information of the photographed object.
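Depth recovery from the two simultaneously captured positioning images follows the standard stereo relation depth = focal length × baseline / disparity. The sketch below is illustrative only; the focal length, baseline, and disparity figures are assumptions, not values from this application:

```python
def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
    """Estimate the depth of a point on the photographed object from the
    pixel disparity between the two positioning images (standard stereo
    triangulation relation)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# A point seen 40 px apart by two cameras 0.1 m apart, with an 800 px
# focal length, lies at 800 * 0.1 / 40 = 2.0 m.
depth = depth_from_disparity(800.0, 0.1, 40.0)
```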
In another possible scenario, the platform device can acquire at least two positioning images through each of the two positioning cameras. For example, at a first moment, the first positioning camera acquires the positioning image P11 and the second positioning camera acquires the positioning image P12; at a second moment, the first positioning camera acquires the positioning image P21 and the second positioning camera acquires the positioning image P22. In this scenario, the platform device may preliminarily determine the spatial position information of a target pixel from the positioning images P11 and P12, and may then calibrate that spatial position information using the positioning images P21 and P22. Specifically, if the relative distance between the first spatial position information calculated from P11 and P12 and the second spatial position information calculated from P21 and P22 is greater than an error threshold, the average of the first and second spatial position information is used as the spatial position information of the target pixel. If the relative distance is less than or equal to the error threshold, the first spatial position information continues to be used as the spatial position information of the target pixel.
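The calibration rule just described (average the two estimates when they disagree by more than the error threshold, otherwise keep the first) can be sketched as follows, assuming spatial position information is a 3-tuple of coordinates and an illustrative threshold value:

```python
import math

ERROR_THRESHOLD = 0.05  # metres; illustrative value, not specified in the application

def calibrate_position(first_pos, second_pos, threshold=ERROR_THRESHOLD):
    """Combine two spatial-position estimates of the same target pixel.

    If the estimates differ by more than the error threshold, their average
    is used; otherwise the first estimate is kept, as described above."""
    distance = math.dist(first_pos, second_pos)
    if distance > threshold:
        return tuple((a + b) / 2 for a, b in zip(first_pos, second_pos))
    return first_pos
```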
In the present application, because the positioning cameras are provided in the platform device, the platform device can acquire positioning images and spatial position information quickly. It can therefore calibrate a single pixel multiple times, improving its ability to obtain accurate spatial position information for each pixel.
In step 713, an appearance image of the subject is acquired.
In the embodiment of the application, the platform device can acquire the appearance image of the photographed object. The augmented reality image finally generated by the mobile terminal is presented to the user for viewing; therefore, the appearance image captured by the platform device needs to include the color information, relevant optical characteristics, and the like of the photographed object.
In one possible implementation, the platform device is provided with a camera dedicated to capturing the appearance image of the photographed object, and the image acquired by this camera reflects the appearance of the photographed object. In this implementation, dedicated components perform dedicated functions, which avoids the complicated software control that would result from a single component carrying too many functions. In this design, the platform device may acquire the positioning images and the appearance image of the photographed object at the same specified moment.
In another possible implementation, the same camera in the platform device collects both the positioning image and the appearance image of the photographed object. In this processing flow, the platform device can instruct the camera to acquire the positioning image and the appearance image in quick succession, which avoids a mismatch between the two images caused by displacement of the platform device.
Step 714, the spatial position information and the appearance image are normalized, so as to obtain normalized data with association relation.
In the embodiment of the application, the platform device can perform standardization processing on the spatial position information and the appearance image according to a preset image processing mode.
As one possible implementation, the platform device can provide a plurality of image normalization modes for different purposes, matched to its imaging capability. The image normalization modes include a high-fidelity mode and a fluent display mode. In the high-fidelity mode, the platform device determines the spatial position of every pixel of the photographed object, so that each detail in the final augmented reality image carries depth information and a finer stereoscopic impression is presented. In the fluent display mode, the platform device samples the pixels of the photographed object at a larger stride, which greatly increases the speed of generating standardized data, so the final augmented reality image has a higher frame rate and appears smoother.
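The difference between the two normalization modes can be sketched as a sampling stride over the object's pixels; the mode names and stride values below are illustrative assumptions, not values from this application:

```python
def sample_pixels(pixels, mode):
    """Sample the pixel positions of the photographed object according to the
    image normalization mode.

    'hifi' keeps every pixel (finer stereoscopic detail); 'fluent' samples at
    a larger stride so standardized data is produced faster."""
    stride = 1 if mode == "hifi" else 4  # stride of 4 is an assumed example
    return pixels[::stride]
```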
In another possible implementation, the platform device can also provide multiple parameter sets for the appearance image of the photographed object. In one possible design, the camera of the platform device that shoots the appearance image adopts a periscope lens module, so that the platform device achieves a longer focal-length adjustment range through lateral optical refraction while keeping a fixed thickness.
In another possible implementation, the spatial position information and the corresponding appearance image are processed into one data packet or one group of data packets. The mobile terminal identifies such a packet, or group of packets, as a single whole object.
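The association between the spatial position information and the corresponding appearance image inside one data packet might look like the following sketch, in which the field names, shared packet id, and serialization format are all illustrative assumptions:

```python
from dataclasses import dataclass, field
import json
import uuid

@dataclass
class StandardizedPacket:
    """One unit of standardized data: spatial position information and the
    corresponding appearance image, bound together so the mobile terminal
    can treat them as a single whole object."""
    spatial_positions: list   # e.g. [(x, y, z), ...] per target pixel
    appearance_image: bytes   # encoded appearance image of the subject
    packet_id: str = field(default_factory=lambda: uuid.uuid4().hex)

    def serialize(self) -> bytes:
        # JSON header carrying the association, followed by the image bytes.
        header = {"packet_id": self.packet_id,
                  "positions": self.spatial_positions,
                  "image_len": len(self.appearance_image)}
        return json.dumps(header).encode() + b"\n" + self.appearance_image
```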
In step 715, in response to the connection request of the mobile terminal, a corresponding data transmission protocol is obtained from the connection request.
In the embodiment of the application, when the mobile terminal is placed in the groove of the platform device, the platform device can respond to a connection request between the mobile terminal and the platform device. In one possible manner, the connection request is sent through a wireless signal; in another possible manner, it is transmitted through a cable, which is not limited by the embodiment of the present application.
It should be noted that the platform device is designed to connect to a wide variety of mobile terminals. Therefore, different data transmission protocols can be designed for different types of mobile terminals. Illustratively, the data transmission protocol is sent by the mobile terminal to the platform device. In another possible manner, the platform device downloads the data transmission protocol from the cloud via a third party application in the mobile terminal.
As a possible implementation, the data transmission protocol may be any one of the Bluetooth protocol, the wireless fidelity (Wi-Fi) protocol, or the ZigBee protocol.
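Extracting the data transmission protocol from a connection request can be sketched as follows; the request structure, the protocol names, and the Bluetooth fallback are assumptions for illustration only:

```python
SUPPORTED_PROTOCOLS = {"bluetooth", "wifi", "zigbee"}  # assumed protocol names

def protocol_from_request(connection_request: dict) -> str:
    """Obtain the data transmission protocol carried in the mobile terminal's
    connection request; fall back to Bluetooth when the requested protocol is
    not supported (the fallback policy is an assumption)."""
    requested = connection_request.get("protocol", "").lower()
    return requested if requested in SUPPORTED_PROTOCOLS else "bluetooth"
```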
Step 716, transmitting the standardized data to the mobile terminal according to the data transmission protocol.
The connection request of the mobile terminal is a connection request generated in a third party application, and the third party application is an augmented reality application capable of processing standardized data.
In one possible manner, data transmission between the mobile terminal and the platform device is performed over Bluetooth. The platform device transmits the standardized data to the mobile terminal over this data connection, and a third party application in the mobile terminal receives and decodes the standardized data.
It should be noted that the third party application can be designed around standardized data during development, giving it the capability to decode standardized data so that it can process the data directly once the platform device establishes a data connection.
Optionally, the third party application is an application for generating an augmented reality image from standardized data. In some application scenarios, the third party application may be one of a gaming application, a social application, a short video application, a beauty application, a camera application, a travel application, a map application, a teaching application, or an office application. It should be noted that the foregoing description of the third party application is merely illustrative and does not limit the possible types of third party application.
Step 717, in response to the first key command acting on the function key, receives corresponding key information.
In the embodiment of the application, the platform device can further be provided with a function key that the user can press to operate. When the function key is pressed and triggers a first key instruction, the platform device receives the corresponding key information. Different keys correspond to different key information, and the key information may include a press identifier and a lift identifier.
For a key that includes a pressure sensor, the key information may also include a pressure value.
At step 718, standardized control instructions are generated by the processor.
In the platform device, the processor can convert the key information transmitted by the function key into a standardized control instruction.
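The conversion of raw key information into a standardized control instruction can be sketched as follows; the instruction layout (a dictionary with type, key, action, and optional pressure fields) is an illustrative assumption, not a format defined in this application:

```python
def standardize_key_event(key_id, action, pressure=None):
    """Convert raw key information (press/lift identifier plus an optional
    pressure value from a pressure-sensing key) into a standardized control
    instruction that a third party AR application can interpret."""
    if action not in ("press", "lift"):
        raise ValueError("unknown key action")
    instruction = {"type": "control", "key": key_id, "action": action}
    if pressure is not None:
        instruction["pressure"] = float(pressure)
    return instruction
```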
Step 719, sending the standardized control instruction to the mobile terminal through the data transmission interface.
It should be noted that the third party application in the mobile terminal is configured to receive the standardized control instruction. Because the third party application establishes the correspondence between standardized control instructions and their operations during the development stage, it can execute the corresponding operations quickly. In one possible implementation, the third party application keeps the standardized control instructions and their corresponding operations in memory so that the operations can be performed quickly.
In step 720, in response to the second key command acting on the function key, the second key command is identified by the processor.
In the embodiment of the application, the first key instruction and the second key instruction are different key instructions. The platform device can identify the second key instruction through the processor, and after identification the processor can determine the actual type of the second key instruction.
In step 721, in response to the second key instruction being a debug instruction, a debug mode is entered.
In this embodiment, the debug mode is entered when the processor in the platform device identifies the second key instruction as a debug instruction. It should be noted that, in the debug mode, certain important parameters in the platform device can be modified.
In step 722, in debug mode, setup data is received, the setup data being used to indicate parameters for acquiring the appearance image and/or parameters for acquiring spatial location information.
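The key-instruction handling in steps 720 to 722 can be sketched as a small state machine. This is a minimal illustration; the instruction name "debug", the parameter keys, and the default values are assumptions, not specified in this application:

```python
class PlatformDevice:
    """Minimal sketch of the debug-mode flow: a debug instruction switches the
    device into debug mode, and only then is setting data accepted."""

    def __init__(self):
        self.debug_mode = False
        # Assumed parameters for acquiring the appearance image and the
        # spatial position information.
        self.params = {"appearance_resolution": "1080p",
                       "positioning_rate_hz": 30}

    def handle_key_instruction(self, instruction):
        # Steps 720/721: identify the instruction; a debug instruction
        # switches the device into debug mode.
        if instruction == "debug":
            self.debug_mode = True

    def receive_setting_data(self, setting_data):
        # Step 722: setting data is only accepted while in debug mode; it
        # indicates parameters for acquiring the appearance image and/or
        # the spatial position information.
        if not self.debug_mode:
            raise PermissionError("setting data only accepted in debug mode")
        self.params.update(setting_data)
```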
In the embodiment of the application, the platform device can provide a debugging function through the function key, which makes it convenient for AR application developers to test with the platform device.
It should be noted that, the data transmission interface of the platform device is a wired interface, and the wired interface is used for transmitting data and/or transmitting electric energy.
In the embodiment of the application, the debug mode can be entered through the function key. The platform device can then receive setting data, making it convenient for a developer of the third party application to develop a well-performing application by tuning appropriate setting data.
In one possible application, the mobile terminal is placed and fixed in the recess of the platform device. The platform device obtains spatial position information through the positioning cameras and obtains an appearance image through the image acquisition camera. The processor in the platform device processes this information into standardized data and transmits it to the mobile terminal, which generates a virtual part according to an algorithm in the third party application and displays it together with the original appearance image.
In summary, this embodiment enables the practical deployment of AR technology with one platform device and one mobile terminal, which removes the obstacle to popularizing AR applications caused by camera differences among mobile terminals, standardizes the image data required by AR technology, and reduces the difficulty of popularizing AR applications.
Optionally, the augmented reality-based image generation method provided by this embodiment also allows the AR device to be miniaturized, and provides a platform device with broad support for AR technology, so that application developers can write AR-related applications to the same standard.
The following are examples of the apparatus of the present application that may be used to perform the method embodiments of the present application. For details not disclosed in the embodiments of the apparatus of the present application, please refer to the embodiments of the method of the present application.
Referring to fig. 8, fig. 8 is a block diagram illustrating an image generating apparatus based on augmented reality according to an exemplary embodiment of the present application. The image generating apparatus based on augmented reality can be implemented as a whole or a part of a terminal by software, hardware or a combination of the two, and is applied to a platform device, and the apparatus comprises:
a first acquiring module 810, configured to acquire at least two positioning images including a photographed object;
An information calculating module 820, configured to calculate spatial position information of the photographed object according to at least two of the positioning images;
A second obtaining module 830, configured to obtain an appearance image of the photographed object;
the normalization module 840 is configured to perform normalization processing on the spatial location information and the appearance image, to obtain normalized data with an association relationship;
the data transmission module 850 is configured to transmit the standardized data to a mobile terminal, where the standardized data is rendered in the mobile terminal and then displayed as an augmented reality image, the augmented reality image being an image formed by fusing the appearance image with a virtual image.
In an alternative embodiment, the first acquiring module 810 is configured to acquire at least one positioning image through each of the two positioning cameras; the platform device includes two positioning cameras, and the distance between them is greater than or equal to a distance threshold and less than or equal to the length of the diagonal of the surface of the platform device on which the positioning cameras are arranged.
In an alternative embodiment, the data transmission module 850 is configured to, in response to a connection request of the mobile terminal, obtain a corresponding data transmission protocol from the connection request, and transmit the standardized data to the mobile terminal according to the data transmission protocol.
In an alternative embodiment, the third party application to which the apparatus relates is an application for generating an augmented reality image from the standardized data.
In an alternative embodiment, the platform device to which the apparatus relates is further provided with a function key, and the apparatus includes an execution module configured to generate a standardized control instruction in response to a first key instruction acting on the function key, and to send the standardized control instruction to the mobile terminal through the data transmission interface.
In an alternative embodiment, the execution module is configured to respond to a second key instruction acting on the function key, and identify, by the processor, the second key instruction; responding to the second key instruction as a debugging instruction, and entering a debugging mode; in the debug mode, setting data for indicating parameters for acquiring the appearance image and/or parameters for acquiring the spatial position information are received.
In an alternative embodiment, the data transmission interface to which the apparatus relates is a wired interface for transmitting data and/or transmitting power.
In summary, this embodiment enables the practical deployment of AR technology with one platform device and one mobile terminal, which removes the obstacle to popularizing AR applications caused by camera differences among mobile terminals, standardizes the image data required by AR technology, and reduces the difficulty of popularizing AR applications.
Optionally, the augmented reality-based image generation method provided by this embodiment also allows the AR device to be miniaturized, and provides a platform device with broad support for AR technology, so that application developers can write AR-related applications to the same standard.
Embodiments of the present application also provide a computer readable medium storing at least one instruction that is loaded and executed by the processor to implement the augmented reality based image generation method of the various embodiments described above.
It should be noted that: the image generating apparatus based on augmented reality provided in the above embodiment is only exemplified by the division of the above functional modules when executing the image generating method based on augmented reality, and in practical application, the above functional allocation may be performed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules to complete all or part of the functions described above. In addition, the image generating device based on augmented reality provided in the foregoing embodiment and the image generating method embodiment based on augmented reality belong to the same concept, and detailed implementation processes of the image generating device based on augmented reality are detailed in the method embodiment, and are not repeated here.
The foregoing embodiment numbers of the present application are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program for instructing relevant hardware, where the program may be stored in a computer readable storage medium, and the storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above embodiments are merely exemplary embodiments of the present application and are not intended to limit the present application, and any modifications, equivalent substitutions, improvements, etc. that fall within the spirit and principles of the present application should be included in the scope of the present application.

Claims (9)

1. An augmented reality-based image generation method, characterized in that it is applied to a platform device, the platform device being a handle provided with a groove, the groove being used for placing a mobile terminal, the surface of the platform device opposite the groove being provided with two positioning cameras and at least one image acquisition camera, the image acquisition camera being arranged between the two positioning cameras, the distance between the two positioning cameras being greater than or equal to a distance threshold and less than or equal to the length of the diagonal of the surface of the platform device on which the positioning cameras are located, the method comprising:
respectively acquiring at least one positioning image through the two positioning cameras;
acquiring spatial position information of a shot object according to at least two positioning images;
Obtaining an appearance image of the shot object;
carrying out standardization processing on the spatial position information and the appearance image to obtain standardized data with association relation;
and transmitting the standardized data to the mobile terminal, wherein the standardized data is used for being displayed as an augmented reality image after being rendered in the mobile terminal, and the augmented reality image is an image formed by fusing the appearance image with the virtual image.
2. The method of claim 1, wherein said transmitting said standardized data to said mobile terminal comprises:
responding to a connection request of the mobile terminal, and acquiring a corresponding data transmission protocol from the connection request;
and transmitting the standardized data to the mobile terminal according to the data transmission protocol.
3. The method of claim 2, wherein a third party application is an application for generating an augmented reality image from the standardized data.
4. The method of claim 1, wherein in the platform device, a function key is further provided, the method further comprising:
Generating a standardized control instruction in response to a first key instruction acting on the function key;
and sending the standardized control instruction to the mobile terminal through a data transmission interface.
5. The method according to claim 4, wherein the method further comprises:
identifying, by a processor, a second key instruction acting on the function key in response to the second key instruction;
responding to the second key instruction as a debugging instruction, and entering a debugging mode;
in the debug mode, setting data for indicating parameters for acquiring the appearance image and/or parameters for acquiring the spatial position information are received.
6. Method according to claim 4 or 5, characterized in that the data transmission interface is a wired interface for transmitting data and/or transmitting electrical energy.
7. An augmented reality-based image generation apparatus, characterized in that it is applied to a platform device, the platform device being a handle provided with a groove, the groove being used for placing a mobile terminal, the surface of the platform device opposite the groove being provided with two positioning cameras and at least one image acquisition camera, the image acquisition camera being arranged between the two positioning cameras, the distance between the two positioning cameras being greater than or equal to a distance threshold and less than or equal to the length of the diagonal of the surface of the platform device on which the positioning cameras are located, the apparatus comprising:
the first acquisition module is used for respectively acquiring at least one positioning image through the two positioning cameras;
the information calculation module is used for calculating the spatial position information of the shot object according to at least two positioning images;
the second acquisition module is used for acquiring the appearance image of the shot object;
the standardized module is used for carrying out standardized processing on the spatial position information and the appearance image to obtain standardized data with association relation;
And the data transmission module is used for transmitting the standardized data to the mobile terminal, wherein the standardized data is used for being rendered in the mobile terminal and then displayed as an augmented reality image, and the augmented reality image is an image formed by fusing the appearance image with the virtual image.
8. A platform device, characterized in that it comprises a processor for performing the augmented reality based image generation method according to any one of claims 1 to 6;
The platform device is a handle provided with a groove, the groove is used for placing a mobile terminal, the surface of the platform device opposite the groove is provided with two positioning cameras and at least one image acquisition camera, the image acquisition camera is arranged between the two positioning cameras, and the distance between the two positioning cameras is greater than or equal to a distance threshold and less than or equal to the length of the diagonal of the surface of the platform device on which the positioning cameras are located.
9. A computer readable storage medium having stored therein program instructions, which when executed by a processor, implement the augmented reality based image generation method of any one of claims 1 to 6.
CN202010906480.3A 2020-09-01 2020-09-01 Image generation method, device, terminal and storage medium based on augmented reality Active CN111986333B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010906480.3A CN111986333B (en) 2020-09-01 2020-09-01 Image generation method, device, terminal and storage medium based on augmented reality

Publications (2)

Publication Number Publication Date
CN111986333A CN111986333A (en) 2020-11-24
CN111986333B true CN111986333B (en) 2024-05-03

Family

ID=73447366

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010906480.3A Active CN111986333B (en) 2020-09-01 2020-09-01 Image generation method, device, terminal and storage medium based on augmented reality

Country Status (1)

Country Link
CN (1) CN111986333B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150114058A (en) * 2014-03-31 2015-10-12 홍익대학교세종캠퍼스산학협력단 On/off line 3D printing robot battle game system invoked augmented reality and Robot battle game method using the same
CN107222529A (en) * 2017-05-22 2017-09-29 北京邮电大学 Augmented reality processing method, WEB modules, terminal and cloud server
CN208049372U (en) * 2018-04-16 2018-11-06 上海触展智能设备有限公司 A kind of AR double screens all-in-one machine
CN109567309A (en) * 2018-12-06 2019-04-05 广景视睿科技(深圳)有限公司 A kind of air navigation aid and intelligent shoe based on augmented reality
WO2019179331A1 (en) * 2018-03-22 2019-09-26 腾讯科技(深圳)有限公司 Augmented reality implementation method, apparatus and system, and computer device and storage medium
CN111385627A (en) * 2018-12-29 2020-07-07 中兴通讯股份有限公司 Augmented reality device, control method thereof and computer-readable storage medium


Also Published As

Publication number Publication date
CN111986333A (en) 2020-11-24


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant