CN108364324B - Image data processing method and device and electronic terminal - Google Patents

Image data processing method and device and electronic terminal

Info

Publication number
CN108364324B
Authority
CN
China
Prior art keywords
image
texture
processing
client
data
Prior art date
Legal status
Active
Application number
CN201810060339.9A
Other languages
Chinese (zh)
Other versions
CN108364324A (en)
Inventor
汤锦鹏
马妙魁
Current Assignee
Hangzhou Chengyun Technology Innovation Co.,Ltd.
Original Assignee
Hangzhou Orange Cloud Technology Innovation Service Co ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Orange Cloud Technology Innovation Service Co ltd
Priority to CN201810060339.9A
Publication of CN108364324A
Application granted
Publication of CN108364324B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/40 Analysis of texture
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/90 Details of database functions independent of the retrieved data types
    • G06F 16/95 Retrieval from the web
    • G06F 16/958 Organisation or management of web site content, e.g. publishing, maintaining pages or automatic linking
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 2D [Two Dimensional] image generation
    • G06T 11/001 Texturing; Colouring; Generation of texture or colour

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Generation (AREA)
  • Processing Or Creating Images (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

Embodiments of the present invention provide an image data processing method, an image data processing device and an electronic terminal. The image data processing method includes: the web page end calls an augmented reality (AR) algorithm of the client according to an image drawing instruction, and calls a texture binding method of the client through the web page code execution context; through the texture binding method, the web page end triggers the client to bind the texture data of an image acquired by an image acquisition device to a pre-activated target texture unit; the AR algorithm performs AR processing on the acquired image, and the result of the AR algorithm processing is stored in a designated shader; the web page end obtains the texture data of the image from the target texture unit and the result of the AR algorithm processing from the shader; and the web page end performs image processing on the image according to the texture data and the result of the AR algorithm processing.

Description

Image data processing method and device and electronic terminal
Technical Field
The embodiment of the invention relates to the technical field of computers, in particular to an image data processing method and device and an electronic terminal.
Background
Augmented Reality (AR) is a technology that "seamlessly" integrates real-world information and virtual-world information: information that is otherwise difficult to experience in the real world (visual information, sound information, and the like) is simulated and superimposed onto real information, so that the real environment and virtual objects coexist in the same picture in real time. With the development of AR technology, WebAR, which delivers AR through web pages, is also widely used. However, for both AR and WebAR, the image data acquired by an image acquisition device such as a camera is the basis for AR processing.
For WebAR, in order to let a web page augment the real scene captured by the camera, a common approach is to have the Native end (the client) draw the camera image data directly and place it underneath the web page view, such as a WebView layer. In this approach, however, the web page end cannot obtain or process the camera image data; adding a filter to the image from the web page, or performing other image processing there, requires modifying the Native-end code, which sacrifices WebAR's advantages of dynamic deployment and modification.
Therefore, how to effectively enable the web page end to process image data, for example to perform AR processing, has become an urgent problem to be solved.
Disclosure of Invention
In view of this, embodiments of the present invention provide an image data processing method, an image data processing device, and an electronic terminal, so as to effectively implement processing of image data by a web page.
According to a first aspect of the embodiments of the present invention, there is provided an image data processing method, including: the web page end calls an augmented reality (AR) algorithm of the client according to an image drawing instruction, and calls a texture binding method of the client through the web page code execution context; through the texture binding method, the web page end triggers the client to bind the texture data of an image acquired by an image acquisition device to a pre-activated target texture unit; the AR algorithm performs AR processing on the acquired image, and the result of the AR algorithm processing is stored in a designated shader; the web page end obtains the texture data of the image from the target texture unit and the result of the AR algorithm processing from the shader; and the web page end performs image processing on the image according to the texture data and the result of the AR algorithm processing.
According to a second aspect of the embodiments of the present invention, there is provided an image data processing apparatus arranged at the web page end, the apparatus including: a calling module, configured to call an augmented reality (AR) algorithm of the client according to an image drawing instruction and to call a texture binding method of the client through the web page code execution context; a triggering module, configured to trigger, through the texture binding method, the client to bind the texture data of an image acquired by an image acquisition device to a pre-activated target texture unit, where the AR algorithm performs AR processing on the acquired image and the result of the AR algorithm processing is stored in a designated shader; an obtaining module, configured to obtain the texture data of the image from the target texture unit and the result of the AR algorithm processing from the shader; and a processing module, configured to perform image processing on the image according to the texture data and the result of the AR algorithm processing.
According to a third aspect of the embodiments of the present invention, there is provided an electronic terminal, including: a processor, a memory, a communication interface and a communication bus, where the processor, the memory and the communication interface communicate with one another through the communication bus; the memory is configured to store at least one executable instruction, and the executable instruction causes the processor to perform the operations corresponding to the image data processing method according to the first aspect.
According to the solution provided by the embodiments of the present invention, on the one hand, the web page end and the client use the same texture unit: when the web page end calls the client's texture binding method, the client stores the texture data of the image acquired by the image acquisition device in that texture unit, namely the target texture unit, and the web page end then obtains the stored texture data directly from the target texture unit. On the other hand, while the acquired image is being handled through the texture binding method, the web page end also directly calls the client's AR algorithm to process the acquired image, and the processing result is stored in the designated shader. The web page end can therefore obtain the data and information of the image to be drawn directly from the video memory of the GPU, and process the acquired image at the web page end.
With the solution of the embodiments of the present invention, first, the web page end can directly obtain the texture data of the acquired image and then perform image processing on it, such as AR processing, according to the processing result of the client's AR algorithm; this is achieved entirely at the web page end without modifying the client code, so WebAR's advantages of dynamic deployment and flexible modification are preserved. Second, the web page end and the client exchange the image's texture data directly through the target texture unit, which improves the speed and efficiency of data processing. Third, the AR processing of the acquired image is executed by the client, whose execution speed is high, so the speed requirement of AR processing can be met effectively and the overall speed and efficiency of image data processing are improved.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some of the embodiments described herein, and a person skilled in the art can obtain other drawings based on these drawings.
FIG. 1 is a flowchart illustrating steps of a method for processing image data according to a first embodiment of the present invention;
FIG. 2 is a flowchart illustrating steps of a method for processing image data according to a second embodiment of the present invention;
FIG. 3 is a block diagram of an image data processing apparatus according to a third embodiment of the present invention;
FIG. 4 is a block diagram of an image data processing apparatus according to a fourth embodiment of the present invention;
FIG. 5 is a schematic structural diagram of an electronic terminal according to a fifth embodiment of the present invention.
Detailed Description
In order to enable those skilled in the art to better understand the technical solutions in the embodiments of the present invention, these technical solutions are described below clearly and completely with reference to the drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. All other embodiments obtained by a person skilled in the art based on the embodiments of the present invention shall fall within the protection scope of the embodiments of the present invention.
The following further describes specific implementation of the embodiments of the present invention with reference to the drawings.
Example one
Referring to fig. 1, a flowchart illustrating steps of an image data processing method according to a first embodiment of the present invention is shown.
The image data processing method of the present embodiment includes the steps of:
Step S102: the web page end calls an AR algorithm of the client according to the image drawing instruction, and calls a texture binding method of the client through the web page code execution context.
With the development of computer technology, an application (APP) usually embeds a component or object that implements browser functions, such as a WebView component. Such a component or object implements the corresponding browser functions through HTML pages and the JavaScript language; this is the web page end in the embodiments of the present invention, which may also be called the Web end. The APP implements its other, non-browser functions through other languages, such as Objective-C, Java, or other languages different from JavaScript; this is the client in the embodiments of the present invention, which may also be called the Native end.
The solution of the embodiments of the present invention is suitable for application scenarios in which the client and the web page end cannot directly exchange the image data acquired by the image acquisition device, such as WebAR based on the iOS platform. It is not limited to WebAR, however; other similar scenarios are equally applicable.
In the embodiments of the present invention, the client may be provided with one or more AR algorithms, such as an AR algorithm for face detection, an AR algorithm for detecting attributes of a target object, and so on. A specific AR algorithm may be implemented by those skilled in the art in any appropriate manner according to the actual situation; the embodiments of the present invention impose no limitation on this. The AR algorithms set in the client are executed by the client code and can be called directly or indirectly by the web page end.
The client's texture binding method is usually called and executed by the client itself, but in the embodiments of the present invention it is pre-registered in a web page code execution context, such as JSContext, so that the web page end can call it. Taking JSContext as an example: JSContext is a client-side utility class that encapsulates the JavaScript code execution context of the web page end. Through JSContext, a client method (Native code) can be exposed to the web page end's JavaScript; when the JavaScript calls that method, the Native code executes in the same program context. Those skilled in the art should understand that JSContext is merely an example; other similar execution contexts are equally applicable to the solution provided by the embodiments of the present invention.
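From the web page side, such a bridged call can be sketched as below. This is a minimal illustration under stated assumptions: the client is assumed to have registered its Native texture binding method in the page's JSContext under a bridge object, and the names `bridge`, `bindTexture` and `drawFrame` are hypothetical, not identifiers from the patent.

```javascript
// Hypothetical sketch: the Native end is assumed to have exposed a
// texture-binding function into the page's JavaScript context (on the
// Objective-C side, roughly: registering a block on the JSContext).
// From the web page end it then looks like an ordinary JavaScript call.
function drawFrame(bridge) {
  // Calling the exposed method runs the Native code synchronously in the
  // same program context as this JavaScript, as the description notes.
  const ok = bridge.bindTexture(); // client copies the camera image into the active texture unit
  if (!ok) {
    throw new Error('native texture binding failed');
  }
  return ok;
}
```

In a real page, `bridge` would be whatever object the client injected through JSContext; here it only stands in for that injected interface.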
In the embodiments of the present invention, the texture binding method is used by the client to bind the texture data of an image acquired by an image acquisition device, such as a camera, to a designated currently active texture unit, namely the target texture unit in the embodiments of the present invention, so that the web page end can obtain and use it. In this way, the web page end and the client operate on and use data in the same video memory.
When the web page end needs to perform image processing, such as AR image processing, generation of an image drawing instruction is triggered; for example, when a certain AR processing option is selected, a corresponding image drawing instruction is generated, and that instruction triggers the corresponding image data processing operations, such as setting the rendering context and calling the client's texture binding method through the web page code execution context.
It should be noted that, in the embodiments of the present invention, the step in which the web page end calls the client's AR algorithm and the step in which the web page end calls the client's texture binding method through the web page code execution context may be executed in any order or in parallel.
Step S104: through the client's texture binding method, the web page end triggers the client to bind the texture data of the image acquired by the image acquisition device to a pre-activated target texture unit; the AR algorithm performs AR processing on the acquired image, and the result of the AR algorithm processing is stored in a designated shader.
A texture unit is a unit in the video memory for processing texture data. As described above, the texture binding method binds the image's texture data to the currently active texture unit, so the target texture unit needs to be activated first so that the texture data can be written into it. The target texture unit can be activated by a person skilled in the art in any suitable way, e.g., by calling the gl.activeTexture() method of WebGL.
After the web page end calls the client's AR algorithm and the client executes the corresponding code to perform AR processing on the image acquired by the image acquisition device, the processing result can be stored in a designated shader. Shaders mostly reside in the video memory and are used for image rendering. In many scenarios, the result of the AR algorithm is keypoint data obtained by performing keypoint detection on a target object in the image, such as keypoint coordinates; in a feasible manner, such a result can therefore be stored in a shader that holds positions, such as a vertex shader. This is not limiting, however: the result of some AR algorithms may be a simple detection result indicating whether the target object has a certain attribute, which may be independent of coordinates or position. In that case the result is not restricted to a position-holding shader such as a vertex shader, and may also be stored in another suitable shader, such as one that specifies how texture data is drawn, e.g., a fragment shader.
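As a concrete illustration of keypoint results feeding a vertex shader, the sketch below converts pixel-space keypoints (the kind a face keypoint algorithm might return) into the clip-space `Float32Array` that WebGL's `gl.bufferData()` expects. The function name, keypoint format and coordinate convention are assumptions for illustration, not the patent's actual interface.

```javascript
// Convert pixel-space keypoints into WebGL clip-space coordinates and
// flatten them into the typed array that gl.bufferData() accepts.
function keypointsToClipSpace(keypoints, width, height) {
  const out = new Float32Array(keypoints.length * 2);
  keypoints.forEach((p, i) => {
    out[2 * i] = (p.x / width) * 2 - 1;      // x: [0, width]  -> [-1, 1]
    out[2 * i + 1] = 1 - (p.y / height) * 2; // y: [0, height] -> [1, -1] (flipped)
  });
  return out;
}
// Upload sketch (requires a real GL context):
//   gl.bufferData(gl.ARRAY_BUFFER, keypointsToClipSpace(kps, w, h), gl.STATIC_DRAW);
```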
It should be noted that, in the embodiments of the present invention, the step in which the web page end triggers, through the client's texture binding method, the client to bind the texture data of the acquired image to the pre-activated target texture unit, and the step in which the acquired image is processed by the AR algorithm and the result of the AR algorithm processing is stored in the designated shader, may be executed in any order or in parallel.
Step S106: the web page end obtains the texture data of the image from the target texture unit, and obtains the result of the AR algorithm processing from the shader.
Because the web page end and the client operate on data in the same video memory through the target texture unit, once the client has written texture data into the target texture unit, the web page end can read that texture data from it. The web page end can likewise obtain the AR algorithm processing result from the designated shader, for subsequent image rendering and processing at the web page end.
It should be noted that, in the embodiments of the present invention, the step in which the web page end obtains the texture data of the image from the target texture unit and the step in which the web page end obtains the result of the AR algorithm processing from the shader may be executed in any order or in parallel.
Step S108: and the webpage end carries out image processing on the image acquired by the image acquisition equipment according to the texture data and the result of the AR algorithm processing.
Through the above steps, the web page end can process the image acquired by the image acquisition device, for example with filter processing or AR processing. When image processing needs to be added or modified, only the web page code needs to be changed; there is no need to modify the client code or to perform operations such as upgrading the client version.
The web page end can implement image processing according to the result of the AR algorithm processing and the texture data of the image, for example drawing virtual glasses at the positions of a person's eyes in the image, or drawing a hat at the position of the person's head, according to the processing result of a face keypoint detection AR algorithm.
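Drawing virtual glasses at the eyes, for instance, reduces to computing an overlay placement from two eye keypoints. The sketch below is one plausible way to do it; the keypoint format, the function name and the width factor of 2 are illustrative assumptions, not the patent's method.

```javascript
// Given left/right eye keypoints from a face keypoint algorithm, compute
// where, how large, and at what rotation to draw a virtual-glasses overlay.
function glassesPlacement(leftEye, rightEye) {
  const dx = rightEye.x - leftEye.x;
  const dy = rightEye.y - leftEye.y;
  return {
    centerX: (leftEye.x + rightEye.x) / 2,   // midpoint between the eyes
    centerY: (leftEye.y + rightEye.y) / 2,
    width: Math.hypot(dx, dy) * 2,           // factor 2: glasses wider than the eye span (assumed)
    angle: Math.atan2(dy, dx),               // head roll, in radians
  };
}
```

The returned placement would then drive the web page end's WebGL draw call for the overlay.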
It can be seen from this embodiment that, on the one hand, the web page end and the client use the same texture unit: when the web page end calls the client's texture binding method, the client stores the texture data of the image acquired by the image acquisition device in that texture unit, namely the target texture unit, and the web page end then obtains the stored texture data directly from the target texture unit. On the other hand, while the acquired image is being handled through the texture binding method, the web page end also directly calls the client's AR algorithm to process the acquired image, and the processing result is stored in the designated shader. The web page end can therefore obtain the data and information of the image to be drawn directly from the video memory of the GPU, and process the acquired image at the web page end.
With the solution of this embodiment, first, the web page end can directly obtain the texture data of the acquired image and then perform image processing on it, such as AR processing, according to the processing result of the client's AR algorithm; this is achieved entirely at the web page end without modifying the client code, so WebAR's advantages of dynamic deployment and flexible modification are preserved. Second, the web page end and the client exchange the image's texture data directly through the target texture unit, which improves the speed and efficiency of data processing. Third, the AR processing of the acquired image is executed by the client, whose execution speed is high, so the speed requirement of AR processing can be met effectively and the overall speed and efficiency of image data processing are improved.
The image data processing method of the present embodiment may be performed by any suitable terminal device having data processing capabilities, including but not limited to: mobile terminals such as tablet computers and mobile phones, and desktop computers.
Example two
Referring to fig. 2, a flowchart illustrating steps of an image data processing method according to a second embodiment of the present invention is shown.
The image data processing method of the present embodiment includes the steps of:
Step S202: setting a texture binding method of the client in the web page code execution context.
The embodiments of the present invention take as an example the case where the client uses OpenGL ES and the web page end uses WebGL. WebGL is a cross-platform 3D graphics API implemented in the JavaScript language on top of OpenGL ES 2.0. On this basis, in one example, the client obtains the JSContext of the web page and registers a texture binding method in it, such as a bindTexture() method, whose body is executed by the client code. When the bindTexture() method is called, the client obtains the texture data of the image acquired by the camera and binds that texture data to a GL texture unit.
JSContext is a client-side utility class that encapsulates the JavaScript code execution context of the web page end. Through JSContext, a client method can be provided to the web page end's JavaScript; when the JavaScript calls that method, the Native code executes in the same program context. That is, through JSContext, the client code and the web page code execute in the same rendering context.
It should be noted that this step may be executed only once, at an initial time, for example when the APP is shipped or when the program is updated or upgraded, and the result can be reused subsequently.
Step S204: the web page end obtains a rendering context according to the image drawing instruction, and sets a vertex shader, a fragment shader and a texture unit in the rendering context; in addition, it obtains the algorithm information of the client's AR algorithm corresponding to the image drawing instruction.
Setting the texture unit may consist of assigning the texture unit a label (index).
When the web page end needs to draw an image, for example when AR processing is to be performed on the image acquired by the image acquisition device, an image drawing instruction is triggered. The web page end first performs the corresponding rendering context setup according to the image drawing instruction, and then calls the client's texture binding method through the web page code execution context to obtain texture data for processing. Meanwhile, after receiving the image drawing instruction and setting the rendering context, the web page end determines the algorithm information of the client AR algorithm corresponding to the image drawing instruction, for subsequent calling. The algorithm information of the AR algorithm includes but is not limited to: interface information, parameter information, address information, and the like of the AR algorithm. For example, when the image drawing instruction indicates drawing virtual glasses on a face, the web page end determines that the AR algorithm for face keypoint detection needs to be called, and first obtains the relevant algorithm information of that AR algorithm so that it can be called.
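The instruction-to-algorithm lookup could be as simple as a table on the web page end. Everything in this sketch is a hypothetical illustration of the interface, parameter and address information mentioned above: the table contents, the instruction names and the native:// addresses are all assumptions.

```javascript
// Hypothetical registry mapping an image drawing instruction to the
// algorithm information of the client AR algorithm it requires.
const AR_ALGORITHMS = {
  drawGlasses: { iface: 'detectFaceKeypoints',    params: { maxFaces: 1 }, address: 'native://ar/face' },
  labelObject: { iface: 'detectObjectAttributes', params: { topK: 5 },     address: 'native://ar/object' },
};

function algorithmInfoFor(instruction) {
  const info = AR_ALGORITHMS[instruction];
  if (!info) throw new Error('no AR algorithm registered for: ' + instruction);
  return info; // { iface, params, address } used for the subsequent call
}
```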
Starting from OpenGL ES 2.0, the rendering process can be controlled by a vertex shader and a fragment shader, making the rendering pipeline programmable. Because WebGL is implemented on top of OpenGL ES 2.0, in a feasible scheme the web page end can obtain the GLContext (rendering context) of the web page's canvas object and configure WebGL's rendering context according to the standard WebGL usage, including: setting the vertex shader, setting the fragment shader, setting the texture sampler uniform variable used in the fragment shader to the corresponding texture unit index, e.g., 0, and so on. As long as the web page end and the client are in the same rendering context (GLContext), the web page end, using WebGL, and the client, using OpenGL, can operate on data in the same video memory.
The Canvas object represents an HTML canvas element. WebGL defines a set of APIs to support graphics and image drawing, and JavaScript can use these APIs to draw on the Canvas object. GLContext is the rendering context used when WebGL performs drawing and rendering: the vertex shader in the GLContext can be set to determine the positions at which texture data is to be drawn, the fragment shader can be set to determine how the texture data is drawn, and the label of the texture unit indicates the address at which the client and the web page end jointly operate on and use the texture data.
Through the setup in this step, the client and the web page end can share image data through the OpenGL texture unit, namely the target texture unit.
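The setup described in steps S204 and S206 can be sketched as a single WebGL routine. The shader sources and function below are a minimal illustration under the assumptions stated in the text (the fragment shader's sampler uniform is pointed at texture unit 0, which is then activated as the target texture unit); they are not the patent's actual shaders, and the names are hypothetical.

```javascript
// Minimal full-screen textured-quad shaders (illustrative only).
const VERT_SRC = `
attribute vec2 a_position;        // draw positions (e.g. derived from AR keypoints)
varying vec2 v_texCoord;
void main() {
  v_texCoord = a_position * 0.5 + 0.5;
  gl_Position = vec4(a_position, 0.0, 1.0);
}`;

const FRAG_SRC = `
precision mediump float;
uniform sampler2D u_texture;      // bound to texture unit 0, shared with the client
varying vec2 v_texCoord;
void main() { gl_FragColor = texture2D(u_texture, v_texCoord); }`;

function setupRenderingContext(gl) {
  const compile = (type, src) => {
    const s = gl.createShader(type);
    gl.shaderSource(s, src);
    gl.compileShader(s);
    return s;
  };
  const program = gl.createProgram();
  gl.attachShader(program, compile(gl.VERTEX_SHADER, VERT_SRC));
  gl.attachShader(program, compile(gl.FRAGMENT_SHADER, FRAG_SRC));
  gl.linkProgram(program);
  gl.useProgram(program);
  // Point the fragment shader's sampler at texture unit 0, then activate
  // that unit: it becomes the "target texture unit" shared with the client.
  gl.uniform1i(gl.getUniformLocation(program, 'u_texture'), 0);
  gl.activeTexture(gl.TEXTURE0);
  return program;
}
```

In a browser, `gl` would come from `canvas.getContext('webgl')`; production code would also check compile and link status.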
It should be noted that this step may be executed once at the beginning of one image data processing task and then used continuously throughout that task, until the next task begins. For example, the rendering context set in this step during one AR service is used continuously within that AR service; when another AR service or other image data processing operation is needed, the rendering context is set up again for the corresponding operation.
It should also be noted that, in the embodiments of the present invention, the step in which the web page end obtains the rendering context and sets the vertex shader, the fragment shader and the texture unit in it, and the step in which the web page end obtains the algorithm information of the client AR algorithm corresponding to the image drawing instruction, may be executed in any order or in parallel.
Step S206: a texture unit in the rendered context is activated and the activated texture unit is taken as a target texture unit.
For example, the web page end calls the gl.activeTexture() method and passes in the texture unit set in step S204, e.g., the texture unit GL_TEXTURE0 with label 0. When the web page end calls WebGL's gl.activeTexture(texture) method, the client is triggered to set the web page end's rendering context as the currently activated rendering context, at which point the client can process and draw texture data.
Through the above steps S202 to S206, on the one hand, a cross-end programmable rendering pipeline between the web page end and the client is established and configured; through this pipeline, the web page end and the client can operate on and use the same texture unit. On the other hand, the web page end also obtains the algorithm information of the AR algorithm, so that subsequent calls can be made accurately using this information.
Based on the above setup, the process by which the web page end obtains image data from the client and processes it is as follows:
Step S208: the web page end calls the client's texture binding method through the web page code execution context, and directly calls the AR algorithm corresponding to the image drawing instruction according to the obtained algorithm information of the AR algorithm.
For example, the client's texture binding method, the bindTexture() method, is called through the web page code execution context JSContext. In addition, this embodiment takes a face keypoint detection algorithm as the example AR algorithm.
It should be noted that, in the embodiments of the present invention, the step in which the web page end calls the client's texture binding method through the web page code execution context and the step in which the web page end directly calls the AR algorithm corresponding to the image drawing instruction according to the obtained algorithm information may be executed in any order or in parallel.
Step S210: through the texture binding method, the web page end triggers the client to bind the texture data of the image acquired by the image acquisition device to the pre-activated target texture unit; the AR algorithm performs AR processing on the acquired image, and the result of the AR algorithm processing is stored in the vertex shader in the video memory.
In this embodiment, a vertex shader is taken as the example of the shader used to store the result of the AR algorithm processing, i.e. a location used for storing texture data to be rendered.
For example, on the one hand, the web page end calls the bindTexture method, which executes Native code; the Native code executes in the same rendering context as the web page end, and the currently activated texture unit is the texture unit activated in step S206. After the bindTexture method is called, the client acquires the image data collected by the camera, then calls the glTexImage2D() method to copy the collected image data to the currently activated texture unit, and then the Native code call ends. On the other hand, the web page end directly calls the face key point detection algorithm of the client to detect key points of the face in the image, and the coordinates of the detected face key points are stored in a vertex shader in the video memory.
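The client-side sequence — unit activated in advance, texture bound, camera frame copied in with glTexImage2D — can be sketched with a mocked GL object. The mock and the stand-in frame data are purely illustrative; a real implementation would issue the corresponding OpenGL ES calls.

```javascript
// Minimal mock of the GL calls the client makes, to show the order:
// the texture unit was activated earlier (step S206); bindTexture then
// targets it, and texImage2D copies the camera frame into it.
function makeMockGL() {
  return {
    units: {},            // texture-unit label -> uploaded image data
    active: null,
    activeTexture(unit) { this.active = unit; },
    bindTexture(_target, _tex) { /* binds to the currently active unit */ },
    texImage2D(_target, _level, _format, data) { this.units[this.active] = data; },
  };
}

const gl = makeMockGL();
gl.activeTexture('TEXTURE7');          // done in advance (step S206)

// Native code runs when the web page end calls the texture binding method:
gl.bindTexture('TEXTURE_2D', 'cameraTexture');
const cameraFrame = new Uint8Array([255, 0, 0, 255]); // stand-in RGBA pixel
gl.texImage2D('TEXTURE_2D', 0, 'RGBA', cameraFrame);

// The frame now sits in the pre-activated target unit, where the web page's
// WebGL code (same rendering context) can sample it directly.
console.log(gl.units['TEXTURE7'].length); // 4
```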
In some application scenarios, the web page end may also, through the texture binding method, trigger the client to perform image processing on the image acquired by the image acquisition device and bind the texture data of the processed image to the pre-activated target texture unit. Optionally, the image processing comprises filter processing. For example, after the client acquires the image data collected by the camera, a Native filter or other image processing may be applied to the acquired image data, and the glTexImage2D() method is then called to copy the processed image data to the currently activated texture unit. In this case, subsequent image rendering and processing at the web page end is based on the processed image data.
It should be noted that, in the embodiment of the present invention, the step in which the web page end triggers, through the texture binding method, the client to bind the texture data of the image acquired by the image acquisition device to the pre-activated target texture unit, and the step in which the web page end performs AR algorithm processing on the acquired image through the AR algorithm and stores the result of the AR algorithm processing into the vertex shader in the video memory, may be executed in either order or in parallel.
In addition, in one feasible mode, the web page end may also store the image acquired by the image acquisition device to a set position of the client by calling an image data storage method of the client. The image data storage method may be any suitable method usable for data storage or image data storage; those skilled in the art may implement it in any appropriate manner according to actual needs, which is not limited in the embodiment of the present invention. Likewise, the set position of the client may be chosen by those skilled in the art according to actual needs. That is, while the texture data of the acquired image is written into the target texture unit, the image may also be stored locally at the client. This storing step may be executed at any time after the image acquisition device acquires the image and before the image is synthesized, and is not limited to being executed at the same time the texture data is bound to the target texture unit. Executing it at the same time as the binding, however, makes the overall image data processing flow more compact and reasonable, and reduces development and implementation cost.
Step S212: and the webpage end respectively acquires texture data of the image from the target texture unit and acquires the result processed by the AR algorithm from the vertex shader.
For example, the webpage side obtains texture data from the target texture unit through the JS code, and obtains the processing result of the AR algorithm from the vertex shader.
The step of obtaining the texture data of the image from the target texture unit and the step of obtaining the result of the AR algorithm processing from the vertex shader may be executed in either order or in parallel.
Step S214: and the webpage end carries out image processing on the image acquired by the image acquisition equipment according to the texture data and the result of AR algorithm processing.
After the client executes the texture binding method and the AR algorithm and stores the corresponding processing results into the target texture unit and the vertex shader respectively, the code flow returns to the web page end. The web page end can then continue to execute its own code logic: it obtains the texture data of the image from the target texture unit, obtains the result of the AR algorithm processing, such as the coordinates of face key points, from the vertex shader, performs rendering processing of the image, such as filter processing, mapping (sticker) processing or other image processing, and finally renders the image.
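The web page end's final step — combining the texture data with the AR result to drive rendering — can be sketched as the planning of per-keypoint draw commands, e.g. centring a sticker on each detected face key point. The function and coordinate values below are illustrative assumptions, not the patent's actual rendering code.

```javascript
// Convert each detected face keypoint into a draw command centred on it.
function planStickerDraws(keypoints, stickerSize) {
  return keypoints.map(([x, y]) => ({
    x: x - stickerSize / 2,   // top-left corner so sticker is centred
    y: y - stickerSize / 2,
    w: stickerSize,
    h: stickerSize,
  }));
}

// Keypoints as they might be read back from the vertex shader storage
// (illustrative values: two eyes and a mouth).
const faceKeypoints = [[120, 80], [200, 80], [160, 140]];
const draws = planStickerDraws(faceKeypoints, 40);
console.log(draws.length); // 3
```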
By the steps, the image data acquired by the image acquisition equipment is transmitted to the webpage end for WebGL rendering, and the webpage end can perform any appropriate rendering processing on the image data according to the processing result of the AR algorithm and the actual image processing requirement. In addition, if the client performs filter processing on the acquired image through the texture binding method and the webpage side also performs filter image processing on the acquired image, the image processing of overlapping the client filter and the webpage side filter can be realized, and the effect of taking both performance and dynamic image processing into consideration is achieved. In addition, the webpage end can process the collected image, so that the enhancement of the WebAR to the real scene can be effectively realized, namely the WebAR effect is effectively realized.
Therefore, the cross-end programmable rendering pipeline completes the flow from the client to the webpage end and then to the final rendering of the image data acquired by the image acquisition equipment, and both the client and the webpage end can perform programmable control on the cross-end programmable rendering pipeline. In specific practice, vertex and texture calculation which consumes CPU resources and is not suitable for JavaScript script operation can be put into a texture binding method, processing is carried out by a client, and a GPU filter needing vertex shader and fragment shader participation is put into a webpage for processing, so that the image data processing speed and efficiency are improved.
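As an illustration of the "GPU filter kept on the web page side" part of this division of work, a minimal WebGL fragment-shader filter is sketched below. A grayscale filter is chosen purely as an example; the shader source and the helper function are assumptions for the sketch, not the patent's actual filter.

```javascript
// Fragment shader the web page end might apply to the texture the client
// uploaded into the shared target texture unit.
const grayscaleFragmentShader = `
  precision mediump float;
  uniform sampler2D uTexture;   // samples the pre-activated target texture unit
  varying vec2 vTexCoord;
  void main() {
    vec4 c = texture2D(uTexture, vTexCoord);
    float g = dot(c.rgb, vec3(0.299, 0.587, 0.114)); // ITU-R BT.601 luma weights
    gl_FragColor = vec4(vec3(g), c.a);
  }
`;

// The same weighting, applied on the CPU for a single pixel, shows what the
// shader computes per fragment:
function luma([r, g, b]) {
  return 0.299 * r + 0.587 * g + 0.114 * b;
}
console.log(luma([1, 1, 1]).toFixed(3)); // "1.000"
```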
Through the process, a cross-end programmable rendering pipeline scheme is realized, the OpenGL ES codes of the client and the WebGL codes of the webpage end can be executed in the same rendering context, namely the client and the webpage end are in the same rendering pipeline, and the same texture data can be processed. At the moment, the client can submit the texture data of the image to the target texture unit, the webpage end directly renders the texture data of the image by using WebGL, and further AR rendering is carried out according to the result of AR algorithm processing, so that the performance problem of image data processing is solved, and the dynamic property of the webpage is maintained.
Compared with the alternative in which the client defines a corresponding interface, the web page end JS obtains image data stored as a TypedArray by calling that interface, and then submits the TypedArray data to WebGL for drawing, the scheme provided by the embodiment of the present invention does not require the client to package each frame of image data into a TypedArray, which greatly improves data processing performance and can effectively meet the requirements of WebAR processing. Moreover, on the iOS platform, packaging image data into a TypedArray and transmitting it to the web page end can, under certain conditions, only be realized on firmware of iOS 10.0 or above, whereas the scheme of the embodiment of the present invention has no such firmware requirement, thereby greatly improving the flexibility of image data processing and reducing processing cost.
In addition, if the user has further operation requirements on the image processed by the web page end, for example when the web-page-processed image, such as the AR-processed image, is to be stored, shared, reproduced (restored) or otherwise operated on, the following optional operations may be performed.
Step S216: and the webpage end acquires and stores the process data of image processing performed by the webpage end on the image acquired by the image acquisition equipment.
Wherein the process data is used to record the process of the image processing, including but not limited to: the order of steps of image processing, execution time, data and methods used in each step, and the like.
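One possible shape of this process data is a per-step record of order, execution time, and the method and data used, as sketched below. The field names and recorder structure are assumptions for the sketch.

```javascript
// Records the image-processing process: step order, execution time,
// and the method/data used at each step.
function makeProcessRecorder() {
  const steps = [];
  return {
    record(method, data) {
      steps.push({ order: steps.length + 1, time: Date.now(), method, data });
    },
    dump() { return steps; },
  };
}

const recorder = makeProcessRecorder();
recorder.record('filter', { type: 'grayscale' });
recorder.record('sticker', { keypointIndex: 2, asset: 'hat.png' });
console.log(recorder.dump().map(s => s.method).join(',')); // "filter,sticker"
```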
Step S218: and calling a data synthesis method of the client by the webpage end, acquiring the image acquired by the image acquisition equipment from the set position by the data synthesis method, and acquiring the process data.
The data synthesis method is realized through a client, namely, called by a webpage end and executed through a client code. The specific implementation of the data synthesis method may be implemented by those skilled in the art in any appropriate manner according to actual needs, and the embodiment of the present invention is not limited thereto, for example, the data synthesis is implemented by a video coding manner.
As described in step S210, the image acquired by the image acquisition device is also locally stored in the client, and based on the image and the process data locally stored in the client, the restoration of the image processing can be realized.
Step S220: and the webpage end carries out synthesis processing on the acquired image and the acquired process data through a data synthesis method to generate a synthesis result.
For example, the client locally stores a video frame including a plurality of frames of face images used in a certain image processing process, and process data of an AR processing process performed on the video frame, such as AR mapping processing performed on each frame of face images, and the like, and after acquiring the data, synthesizes the AR mapping processing of each frame with a corresponding frame of face images, thereby restoring the AR mapping to the video frame including the face images.
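The pairing at the heart of this synthesis — each locally stored frame combined with its recorded per-frame AR mapping — can be sketched as follows. Real synthesis would go through video encoding as the text notes; this only shows the frame-to-process-data pairing logic, with all names illustrative.

```javascript
// Pair each stored video frame with the recorded AR mapping for that frame,
// producing composited frames that restore the AR effect.
function synthesize(frames, perFrameMappings) {
  if (frames.length !== perFrameMappings.length) {
    throw new Error('frame count and process-data entries must match');
  }
  return frames.map((frame, i) => ({
    frame,
    overlay: perFrameMappings[i], // AR sticker applied to this frame
  }));
}

const frames = ['frame0', 'frame1', 'frame2'];
const mappings = [
  { sticker: 'hat', x: 10 },
  { sticker: 'hat', x: 12 },
  { sticker: 'hat', x: 14 },
];
const result = synthesize(frames, mappings);
console.log(result.length); // 3
```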
It should be noted that the synthesis process may adopt an off-line synthesis mode, which not only saves network resources, but also ensures a faster synthesis speed. When the off-line synthesis mode is adopted, after the synthesis result is generated, the synthesis result is sent to the webpage end.
Step S222: and the webpage end receives an operation instruction for the synthesis result and operates the synthesis result according to the operation instruction.
The applications at the client and the web page end may provide the user with one or more operation options for the synthesis result, such as a save option, a save-as option, a sharing option, a sending option, and the like. When the user selects an option, the corresponding operation is performed on the synthesis result. For example, when the user selects the sharing option, the synthesis result may be shared to a target object specified by the user, such as a circle of friends. The experience of the user using WebAR is thereby further improved.
Through the above process, after data synthesis (for example, synthesis in a video coding manner) is performed through the client, the synthesized multi-frame image data or video is transmitted back to the web page, and then corresponding operations, such as the above sharing operation, the local saving operation, the uploading operation, and the like, are performed through the web page according to the selection of the user.
Therefore, according to the embodiment, on one hand, the same texture unit is used by the webpage end and the client end, when the texture binding method of the client end is called by the webpage end, the client end stores the texture data of the image acquired by the image acquisition equipment into the texture unit, namely the target texture unit, and then the webpage end directly obtains the texture data stored by the client end from the target texture unit. On the other hand, when the acquired image is processed by the texture binding method, the webpage end also directly calls an AR algorithm of the client end to process the acquired image, and a processing result is stored in the set shader. Therefore, the webpage end can directly acquire the related data and information of the image to be drawn from the video memory of the image processor GPU so as to process the acquired image at the webpage end.
By adopting the scheme of the embodiment, on one hand, the webpage end can directly acquire the texture data of the acquired image, and further image processing such as AR processing and the like can be carried out on the acquired image according to the processing result of the AR algorithm of the client end, the processing can be realized through the webpage end without modifying the client end code, and the advantages of dynamic deployment and random modification of WebAR are maintained; on the other hand, the webpage end and the client can directly carry out the interaction of the texture data of the image through the target texture unit, so that the data processing speed and efficiency are improved; on the other hand, the AR processing of the collected image is executed by the client, the execution speed of the client is high, the speed requirement of the AR processing can be effectively met, and the overall speed and efficiency of the image data are improved.
The image data processing method of this embodiment may be performed by any suitable terminal device having data processing capability, including but not limited to: mobile terminals such as tablet computers and mobile phones, and desktop computers.
EXAMPLE III
Referring to fig. 3, there is shown a block diagram of an image data processing apparatus according to a third embodiment of the present invention.
The image data processing apparatus of this embodiment is disposed at a web page side, and the apparatus includes: the calling module 302 is used for calling an AR algorithm of the client according to the image drawing instruction, and executing a texture binding method of the client through a webpage code; the triggering module 304 is configured to trigger the client to bind texture data of an image acquired by the image acquisition device to a target texture unit activated in advance by using a texture binding method; performing AR algorithm processing on the image acquired by the image acquisition equipment through the AR algorithm, and storing the result of the AR algorithm processing into a set shader; an obtaining module 306, configured to obtain texture data of an image from a target texture unit, and obtain a result of processing by an AR algorithm from the shader; and the processing module 308 is configured to perform image processing on the image according to the texture data and the result of the AR algorithm processing.
According to the embodiment, on one hand, the same texture unit is used by the webpage end and the client end, when the texture binding method of the client end is called by the webpage end, the client end stores the texture data of the image acquired by the image acquisition equipment into the texture unit, namely the target texture unit, and then the webpage end directly obtains the texture data stored by the client end from the target texture unit. On the other hand, when the acquired image is processed by the texture binding method, the webpage end also directly calls an AR algorithm of the client end to process the acquired image, and a processing result is stored in the set shader. Therefore, the webpage end can directly acquire the related data and information of the image to be drawn from the video memory of the image processor GPU so as to process the acquired image at the webpage end.
By adopting the scheme of the embodiment, on one hand, the webpage end can directly acquire the texture data of the acquired image, and further image processing such as AR processing and the like can be carried out on the acquired image according to the processing result of the AR algorithm of the client end, the processing can be realized through the webpage end without modifying the client end code, and the advantages of dynamic deployment and random modification of WebAR are maintained; on the other hand, the webpage end and the client can directly carry out the interaction of the texture data of the image through the target texture unit, so that the data processing speed and efficiency are improved; on the other hand, the AR processing of the collected image is executed by the client, the execution speed of the client is high, the speed requirement of the AR processing can be effectively met, and the overall speed and efficiency of the image data are improved.
Example four
Referring to fig. 4, there is shown a block diagram of an image data processing apparatus according to a fourth embodiment of the present invention.
The image data processing apparatus of this embodiment is disposed at a web page side, and the apparatus includes: a calling module 402, configured to call an AR algorithm of the client according to the image drawing instruction, and execute a texture binding method of the client through a web code execution context; a triggering module 404, configured to trigger the client to bind texture data of an image acquired by the image acquisition device to a pre-activated target texture unit by using a texture binding method; performing AR algorithm processing on the image acquired by the image acquisition equipment through an AR algorithm, and storing the result of the AR algorithm processing into a set shader; an obtaining module 406, configured to obtain texture data of an image from a target texture unit, and obtain a result of processing by an AR algorithm from the shader; and the processing module 408 is configured to perform image processing on the image according to the texture data and the result of the AR algorithm processing.
Optionally, the image data processing apparatus of this embodiment further includes: the setting module 410 is configured to obtain a rendering context and set a vertex shader, a fragment shader, and a texture unit in the rendering context before the invoking module 402 executes the texture binding method of the context invoking client through the web page code.
Optionally, the setup module 410 sets a label to a texture unit in the rendering context.
Optionally, the image data processing apparatus of this embodiment further includes: and an activation module 412, configured to activate a texture unit in the rendering context after the setup module 410 sets the vertex shader, the fragment shader, and the texture unit in the rendering context, and use the activated texture unit as a target texture unit.
Optionally, the set shader is a vertex shader.
Optionally, the image data processing apparatus of this embodiment further includes: and the storage module 414 is used for storing the image acquired by the image acquisition device to the set position of the client by calling the image data storage method of the client.
Optionally, the storage module 414 is further configured to obtain and save process data of the image processing after the processing module 408 performs the image processing on the image.
Optionally, the image data processing apparatus of this embodiment further includes: a synthesizing module 416, configured to, after the storing module 414 obtains and stores the process data of the image processing, invoke a data synthesizing method of the client, obtain the image from the set position by the data synthesizing method, and obtain the process data; and synthesizing the acquired image and the acquired process data by a data synthesis method to generate a synthesis result.
Optionally, the image data processing apparatus of this embodiment further includes: an operation module 418, configured to receive an operation instruction on the synthesis result after the synthesis module 416 generates the synthesis result, and perform an operation on the synthesis result according to the operation instruction.
The image data processing apparatus of this embodiment is used to implement the corresponding image data processing method in the foregoing method embodiments, and has the beneficial effects of the corresponding method embodiments, which are not described herein again.
EXAMPLE five
Referring to fig. 5, a schematic structural diagram of an electronic terminal according to a fifth embodiment of the present invention is shown, and the specific embodiment of the present invention does not limit the specific implementation of the electronic terminal.
As shown in fig. 5, the electronic terminal may include: a processor (processor)502, a Communications Interface 504, a memory 506, and a communication bus 508.
Wherein:
the processor 502, communication interface 504, and memory 506 communicate with one another via a communication bus 508.
A communication interface 504 for communicating with other electronic terminals or servers.
The processor 502 is configured to execute the program 510, and may specifically execute relevant steps in the above-described embodiment of the image data processing method.
In particular, program 510 may include program code that includes computer operating instructions.
The processor 502 may be a central processing unit (CPU), or an Application Specific Integrated Circuit (ASIC), or one or more integrated circuits configured to implement embodiments of the present invention. The electronic terminal comprises one or more processors, which may be of the same type, such as one or more CPUs, or of different types, such as one or more CPUs and one or more ASICs.
And a memory 506 for storing a program 510. The memory 506 may comprise high-speed RAM memory, and may also include non-volatile memory (non-volatile memory), such as at least one disk memory.
The program 510 may specifically be used to cause the processor 502 to perform the following operations: instructing the webpage end to call an AR algorithm of the client according to the image drawing instruction, and executing a texture binding method of the context calling client through a webpage code; the method comprises the steps that a webpage end is indicated to trigger a client to bind texture data of an image collected by an image collecting device into a target texture unit which is activated in advance through a texture binding method; performing AR algorithm processing on the image acquired by the image acquisition equipment through the AR algorithm, and storing the result of the AR algorithm processing into a set shader; indicating a webpage end to respectively acquire texture data of an image from a target texture unit and acquiring an AR algorithm processing result from the shader; and indicating the webpage end to perform image processing on the image according to the texture data and the result of the AR algorithm processing.
In an alternative embodiment, the program 510 is further configured to enable the processor 502 to instruct the webpage side to obtain the rendering context and set a vertex shader, a fragment shader, and a texture unit in the rendering context before instructing the webpage side to execute the texture binding method of the context call client through the webpage code.
In an alternative embodiment, the program 510 is further configured to cause the processor 502, when instructing the web page end to set a texture unit in the rendering context, to set a label for the texture unit.
In an alternative embodiment, the program 510 is further configured to enable the processor 502 to activate a texture unit in the rendering context after instructing the webpage end to set the vertex shader, the fragment shader, and the texture unit in the rendering context, and to take the activated texture unit as the target texture unit.
In an alternative embodiment, the set shader is the vertex shader.
In an alternative embodiment, the program 510 is further configured to enable the processor 502 to instruct the web page to store the image captured by the image capturing device to a set location of the client by calling an image data storage method of the client.
In an alternative embodiment, the program 510 is further configured to enable the processor 502 to instruct the web page to obtain and save the process data of the image processing after the image processing is performed on the image.
In an alternative embodiment, the program 510 is further configured to enable the processor 502 to instruct the web page side to call a data synthesis method of the client side after acquiring and saving the process data of the image processing, acquire the image from the set position by the data synthesis method, and acquire the process data; and synthesizing the acquired image and the acquired process data by a data synthesis method to generate a synthesis result.
In an alternative embodiment, the program 510 is further configured to enable the processor 502 to instruct the web page side to receive an operation instruction on the synthesized result after generating the synthesized result, and operate on the synthesized result according to the operation instruction.
For specific implementation of each step in the program 510, reference may be made to corresponding steps and corresponding descriptions in units in the foregoing embodiments of the image data processing method, and details are not described here again. It may be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described devices and modules may refer to the corresponding process descriptions in the foregoing method embodiments, and are not described herein again.
According to the embodiment, on one hand, the same texture unit is used by the webpage end and the client end, when the texture binding method of the client end is called by the webpage end, the client end stores the texture data of the image acquired by the image acquisition equipment into the texture unit, namely the target texture unit, and then the webpage end directly obtains the texture data stored by the client end from the target texture unit. On the other hand, when the acquired image is processed by the texture binding method, the webpage end also directly calls an AR algorithm of the client end to process the acquired image, and a processing result is stored in the set shader. Therefore, the webpage end can directly acquire the related data and information of the image to be drawn from the video memory of the image processor GPU so as to process the acquired image at the webpage end.
By adopting the electronic terminal of the embodiment, on one hand, the webpage end can directly acquire the texture data of the acquired image, and further image processing such as AR processing and the like can be carried out on the acquired image according to the processing result of the AR algorithm of the client end, and the processing can be realized through the webpage end without modifying the client end code, so that the advantages of dynamic deployment and random modification of WebAR are maintained; on the other hand, the webpage end and the client can directly carry out the interaction of the texture data of the image through the target texture unit, so that the data processing speed and efficiency are improved; on the other hand, the AR processing of the collected image is executed by the client, the execution speed of the client is high, the speed requirement of the AR processing can be effectively met, and the overall speed and efficiency of the image data are improved.
It should be noted that, according to the implementation requirement, each component/step described in the embodiment of the present invention may be divided into more components/steps, and two or more components/steps or partial operations of the components/steps may also be combined into a new component/step to achieve the purpose of the embodiment of the present invention.
The above-described method according to an embodiment of the present invention may be implemented in hardware, firmware, or as software or computer code storable in a recording medium such as a CD ROM, a RAM, a floppy disk, a hard disk, or a magneto-optical disk, or as computer code originally stored in a remote recording medium or a non-transitory machine-readable medium downloaded through a network and to be stored in a local recording medium, so that the method described herein may be stored in such software processing on a recording medium using a general-purpose computer, a dedicated processor, or programmable or dedicated hardware such as an ASIC or FPGA. It will be appreciated that the computer, processor, microprocessor controller or programmable hardware includes memory components (e.g., RAM, ROM, flash memory, etc.) that can store or receive software or computer code that, when accessed and executed by the computer, processor or hardware, implements the image data processing methods described herein. Further, when a general-purpose computer accesses code for implementing the image data processing method shown herein, execution of the code converts the general-purpose computer into a special-purpose computer for executing the image data processing method shown herein.
Those of ordinary skill in the art will appreciate that the various illustrative elements and method steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present embodiments.
The above embodiments are only for illustrating the embodiments of the present invention and not for limiting the embodiments of the present invention, and those skilled in the art can make various changes and modifications without departing from the spirit and scope of the embodiments of the present invention, so that all equivalent technical solutions also belong to the scope of the embodiments of the present invention, and the scope of patent protection of the embodiments of the present invention should be defined by the claims.

Claims (19)

1. An image data processing method, comprising:
the web page side calls an AR algorithm of the client according to an image drawing instruction, and invokes a texture binding method of the client through the execution context via web page code;
the web page side triggers the client, through the texture binding method, to bind texture data of an image captured by an image acquisition device into a pre-activated target texture unit, perform AR algorithm processing on the captured image through the AR algorithm, and store the result of the AR algorithm processing into a set shader;
the web page side acquires the texture data of the image from the target texture unit and the result of the AR algorithm processing from the shader, respectively;
and the web page side performs image processing on the image according to the texture data and the result of the AR algorithm processing.
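Purely as an illustrative sketch of the flow in claim 1 (the `client` bridge object, its method names, and the returned fields are invented for illustration; the claim does not prescribe a concrete API):

```javascript
// Hypothetical web-side flow: trigger the native client to bind the camera
// frame into a pre-activated texture unit and run its AR algorithm, then
// read both artifacts back for drawing. All names here are assumptions.
function handleDrawInstruction(client, frame) {
  client.bindTexture(frame);      // client binds the frame's texture data
  client.runArAlgorithm(frame);   // client stores its result for the shader

  const textureUnit = client.targetTextureUnit; // where the texture data sits
  const arResult = client.shaderResult;         // the AR processing result
  return { textureUnit, arResult };             // inputs to image processing
}

// Mock client standing in for the native side:
const mockClient = {
  targetTextureUnit: 0,
  bindTexture(f) { this.boundFrame = f; },
  runArAlgorithm(f) { this.shaderResult = { frameWidth: f.width }; },
};

const out = handleDrawInstruction(mockClient, { width: 640, height: 480 });
```

The shape to note is that the web page only triggers the client and then reads back two artifacts, the bound texture and the AR result, before drawing.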
2. The method of claim 1, wherein before the web page side invokes the texture binding method of the client through the execution context via web page code, the method further comprises:
the web page side obtains a rendering context, and sets a vertex shader, a fragment shader, and a texture unit in the rendering context.
3. The method of claim 2, wherein setting a texture unit in the rendering context comprises:
setting a label for the texture unit.
4. The method of claim 2 or 3, wherein after setting the vertex shader, the fragment shader, and the texture unit in the rendering context, the method further comprises:
activating a texture unit in the rendering context and taking the activated texture unit as the target texture unit.
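In a browser, the setup and activation steps of claims 2-4 map naturally onto standard WebGL call names; the sketch below runs against a minimal stub `gl` object so it is self-contained (the shader sources and the choice of unit 0 are assumptions, not taken from the claims):

```javascript
// Claim-2/4-style setup sketched with standard WebGL call names. In a real
// page `gl` would come from canvas.getContext('webgl'); here a stub records
// the calls so the sketch can run anywhere.
function setupRenderingContext(gl) {
  const vs = gl.createShader(gl.VERTEX_SHADER);     // vertex shader
  gl.shaderSource(vs, 'attribute vec2 aPos; void main(){gl_Position=vec4(aPos,0.0,1.0);}');
  gl.compileShader(vs);

  const fs = gl.createShader(gl.FRAGMENT_SHADER);   // fragment shader
  gl.shaderSource(fs, 'precision mediump float; uniform sampler2D uTex; void main(){gl_FragColor=texture2D(uTex, vec2(0.5));}');
  gl.compileShader(fs);

  gl.activeTexture(gl.TEXTURE0);                    // activate texture unit 0
  return { vs, fs, targetUnit: gl.TEXTURE0 };       // unit 0 becomes the target
}

// Minimal stub recording which calls were made:
const calls = [];
const stubGl = {
  VERTEX_SHADER: 35633, FRAGMENT_SHADER: 35632, TEXTURE0: 33984,
  createShader(type) { calls.push(['createShader', type]); return { type }; },
  shaderSource(s, src) { s.src = src; },
  compileShader(s) { calls.push(['compileShader', s.type]); },
  activeTexture(unit) { calls.push(['activeTexture', unit]); },
};

const ctx = setupRenderingContext(stubGl);
```

The activated unit returned here plays the role of the claimed target texture unit: it is the agreed location into which the client later binds the image's texture data.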
5. The method according to claim 2 or 3, wherein the set shader is the vertex shader.
6. The method according to any one of claims 1-3, wherein the method further comprises:
the web page side stores the image captured by the image acquisition device to a set position of the client by calling an image data storage method of the client.
7. The method of claim 6, wherein after the image processing of the image, the method further comprises:
the web page side acquires and saves process data of the image processing.
8. The method of claim 7, wherein after the web page side acquires and saves the process data of the image processing, the method further comprises:
the web page side calls a data synthesis method of the client, and acquires, through the data synthesis method, the image from the set position and the process data;
and the web page side synthesizes the acquired image and the acquired process data through the data synthesis method to generate a synthesis result.
9. The method of claim 8, wherein after generating the synthesis result, the method further comprises:
the web page side receives an operation instruction for the synthesis result and operates on the synthesis result according to the operation instruction.
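One plausible reading of the synthesis in claims 7-8 is blending the recorded process data (for example an AR overlay) onto the saved image. The RGBA array layout and the "overlay wins where opaque" rule below are assumptions for illustration, not taken from the claims:

```javascript
// Hypothetical synthesis step: merge the saved camera image with recorded
// process data (here an RGBA overlay) into one composite result.
function synthesize(imageRGBA, overlayRGBA) {
  const out = imageRGBA.slice();            // start from the saved image
  for (let i = 0; i < out.length; i += 4) {
    if (overlayRGBA[i + 3] > 0) {           // overlay pixel is visible
      out[i] = overlayRGBA[i];              // copy overlay RGB
      out[i + 1] = overlayRGBA[i + 1];
      out[i + 2] = overlayRGBA[i + 2];
      out[i + 3] = 255;                     // composite stays opaque
    }
  }
  return out;
}

// Two pixels: the overlay covers the first and is transparent on the second.
const saved = [10, 10, 10, 255, 20, 20, 20, 255];
const overlay = [255, 0, 0, 255, 0, 0, 0, 0];
const composite = synthesize(saved, overlay);
// composite: [255, 0, 0, 255, 20, 20, 20, 255]
```

A real client could equally composite on the GPU or mux the data into a video container; the point is only that image and process data are merged into a single result that claim 9 can then operate on.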
10. An image data processing apparatus, arranged at the web page side, the apparatus comprising:
a calling module, configured to call an AR algorithm of the client according to an image drawing instruction and invoke a texture binding method of the client through web page code;
a triggering module, configured to trigger the client, through the texture binding method, to bind texture data of an image captured by an image acquisition device into a pre-activated target texture unit, perform AR algorithm processing on the captured image through the AR algorithm, and store the result of the AR algorithm processing into a set shader;
an acquiring module, configured to acquire the texture data of the image from the target texture unit and the result of the AR algorithm processing from the shader, respectively;
and a processing module, configured to perform image processing on the image according to the texture data and the result of the AR algorithm processing.
11. The apparatus of claim 10, wherein the apparatus further comprises:
a setting module, configured to obtain a rendering context and set a vertex shader, a fragment shader, and a texture unit in the rendering context before the calling module invokes the texture binding method of the client through the execution context via web page code.
12. The apparatus of claim 11, wherein the setting module sets a label for a texture unit in the rendering context.
13. The apparatus of claim 11 or 12, wherein the apparatus further comprises:
an activating module, configured to activate a texture unit in the rendering context after the setting module sets the vertex shader, the fragment shader, and the texture unit in the rendering context, and take the activated texture unit as the target texture unit.
14. The apparatus according to claim 11 or 12, wherein the set shader is the vertex shader.
15. The apparatus of any one of claims 10-12, wherein the apparatus further comprises:
a storage module, configured to store the image captured by the image acquisition device to the set position of the client by calling the image data storage method of the client.
16. The apparatus of claim 15, wherein the storage module is further configured to acquire and save process data of the image processing after the processing module performs the image processing on the image.
17. The apparatus of claim 16, wherein the apparatus further comprises:
a synthesis module, configured to call a data synthesis method of the client after the storage module acquires and saves the process data of the image processing, acquire the image from the set position and the process data through the data synthesis method, and synthesize the acquired image and the acquired process data through the data synthesis method to generate a synthesis result.
18. The apparatus of claim 17, wherein the apparatus further comprises:
an operating module, configured to receive an operation instruction for the synthesis result after the synthesis module generates the synthesis result, and operate on the synthesis result according to the operation instruction.
19. An electronic terminal, comprising: a processor, a memory, a communication interface, and a communication bus, wherein the processor, the memory, and the communication interface communicate with one another via the communication bus;
the memory is configured to store at least one executable instruction that causes the processor to perform the operations corresponding to the image data processing method according to any one of claims 1-9.
CN201810060339.9A 2018-01-22 2018-01-22 Image data processing method and device and electronic terminal Active CN108364324B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810060339.9A CN108364324B (en) 2018-01-22 2018-01-22 Image data processing method and device and electronic terminal

Publications (2)

Publication Number Publication Date
CN108364324A (en) 2018-08-03
CN108364324B (en) 2021-10-08

Family

ID=63006853

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810060339.9A Active CN108364324B (en) 2018-01-22 2018-01-22 Image data processing method and device and electronic terminal

Country Status (1)

Country Link
CN (1) CN108364324B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109194960B (en) * 2018-11-13 2020-12-18 北京奇艺世纪科技有限公司 Image frame rendering method and device and electronic equipment
CN111651079B (en) * 2020-05-18 2023-09-29 广州视源电子科技股份有限公司 Handwriting display method, device, equipment and computer storage medium
CN113095255B (en) * 2021-04-20 2024-01-12 京东科技控股股份有限公司 Image data distribution method, device, multicast server and medium
CN116980680A (en) * 2023-09-22 2023-10-31 浙江华创视讯科技有限公司 Electronic nameplate display method, terminal equipment and computer storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040258308A1 (en) * 2003-06-19 2004-12-23 Microsoft Corporation Automatic analysis and adjustment of digital images upon acquisition
CN106412229A (en) * 2015-07-28 2017-02-15 阿里巴巴集团控股有限公司 Interaction method and device for mobile terminal, and the mobile terminal
CN106846495A (en) * 2017-01-17 2017-06-13 腾讯科技(深圳)有限公司 Realize the method and apparatus of augmented reality
CN107154063A (en) * 2017-04-19 2017-09-12 腾讯科技(深圳)有限公司 The shape method to set up and device in image shows region
CN107247548A (en) * 2017-05-31 2017-10-13 腾讯科技(深圳)有限公司 Method for displaying image, image processing method and device

Similar Documents

Publication Publication Date Title
CN108364324B (en) Image data processing method and device and electronic terminal
US20200364937A1 (en) System-adaptive augmented reality
WO2021008166A1 (en) Method and apparatus for virtual fitting
CN111414225B (en) Three-dimensional model remote display method, first terminal, electronic device and storage medium
CN106846495B (en) Method and device for realizing augmented reality
KR20210151114A (en) Hybrid rendering
CN107223270B (en) Display data processing method and device
US20170178396A1 (en) Generating virtual shadows for displayable elements
CN111950056B (en) BIM display method and related equipment for building informatization model
CN111507352B (en) Image processing method and device, computer equipment and storage medium
CN114494328B (en) Image display method, device, electronic equipment and storage medium
US11662580B2 (en) Image display method, apparatus, and system to reduce display latency
WO2017113729A1 (en) 360-degree image loading method and loading module, and mobile terminal
CN108335342B (en) Method, apparatus and computer program product for multi-person drawing on a web browser
CN112379815A (en) Image capturing method and device, storage medium and electronic equipment
CN114445500A (en) Augmented reality scene construction method and device, terminal equipment and storage medium
CN108363742B (en) Image data processing method and device and electronic terminal
CN112188087B (en) Panoramic video screenshot method and device, storage medium and computer equipment
CN113469883B (en) Rendering method and device of dynamic resolution, electronic equipment and readable storage medium
US10460503B2 (en) Texturing of a three-dimensional (3D) model by UV map in-painting
CN114419226A (en) Panorama rendering method and device, computer equipment and storage medium
CN111242838B (en) Blurred image rendering method and device, storage medium and electronic device
CN116503529A (en) Rendering, 3D picture control method, electronic device, and computer-readable storage medium
CN109857568B (en) Data encapsulation and processing method and device and electronic equipment
CN113837918A (en) Method and device for realizing rendering isolation by multiple processes

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20200522

Address after: 310051 room 508, floor 5, building 4, No. 699, Wangshang Road, Changhe street, Binjiang District, Hangzhou City, Zhejiang Province

Applicant after: Alibaba (China) Co.,Ltd.

Address before: 510627 Guangdong city of Guangzhou province Whampoa Tianhe District Road No. 163 Xiping Yun Lu Yun Ping B radio square 14 storey tower

Applicant before: Guangzhou Dongjing Computer Technology Co.,Ltd.

TA01 Transfer of patent application right

Effective date of registration: 20210817

Address after: 311100 room 407-01, 4th floor, building 2, No. 2699, yuhangtang Road, Cangqian street, Yuhang District, Hangzhou City, Zhejiang Province

Applicant after: Hangzhou orange cloud Technology Innovation Service Co.,Ltd.

Address before: Room 508, 5 / F, building 4, No. 699, Wangshang Road, Changhe street, Binjiang District, Hangzhou City, Zhejiang Province

Applicant before: Alibaba (China) Co.,Ltd.

GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 311100 room 407-01, 4th floor, building 2, No. 2699, yuhangtang Road, Cangqian street, Yuhang District, Hangzhou City, Zhejiang Province

Patentee after: Hangzhou Chengyun Technology Innovation Co.,Ltd.

Address before: 311100 room 407-01, 4th floor, building 2, No. 2699, yuhangtang Road, Cangqian street, Yuhang District, Hangzhou City, Zhejiang Province

Patentee before: Hangzhou orange cloud Technology Innovation Service Co.,Ltd.
