CN115809046A - Data processing method, device, equipment and storage medium - Google Patents

Info

Publication number
CN115809046A
CN115809046A
Authority
CN
China
Prior art keywords
function
layer
application
application program
target application
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211518525.5A
Other languages
Chinese (zh)
Inventor
董晓飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202211518525.5A
Publication of CN115809046A
Legal status: Pending

Abstract

The disclosure provides a data processing method, apparatus, device and storage medium, and relates to the technical field of computers, in particular to the technical fields of application programs and application running. The specific implementation scheme is as follows: a JS capability layer acquires resources required when a first application program runs to implement a target application function, the JS capability layer being embedded in a second application program; the JS capability layer generates function data corresponding to the target application function according to the resources; and a function implementation layer executes the target application function according to the function data, the function implementation layer being embedded in the second application program. In this way, an application program can be embedded in a native application program and run there to implement target application functions (such as game-type interactive functions), which reduces the development cost of the application program.

Description

Data processing method, device, equipment and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, in particular to the fields of application programs and application running technologies, and specifically to a data processing method, apparatus, device, and storage medium.
Background
Currently, for child users, the traditional interaction forms in mobile applications are complex and unengaging. Many mobile applications therefore introduce game-type interactions for child users.
Generally, an application program that introduces game interaction is implemented as an independently developed application, so a separate application program must be developed and adapted to each operating system, which leads to high development cost.
Disclosure of Invention
The present disclosure provides a data processing method, apparatus, device and storage medium, which allow an application program to be embedded in a native application program and run there to implement a target application function (e.g., a game-type interactive function), reducing the development cost of the application program.
According to a first aspect of the present disclosure, there is provided a data processing method comprising: a JS capability layer acquires resources required when a first application program runs to implement a target application function, the JS capability layer being embedded in a second application program; the JS capability layer generates function data corresponding to the target application function according to the resources; and a function implementation layer executes the target application function according to the function data, the function implementation layer being embedded in the second application program.
According to a second aspect of the present disclosure, there is provided a data processing apparatus comprising: a JS capability layer, configured to acquire resources required when a first application program runs to implement a target application function, and to generate function data corresponding to the target application function according to the resources, the JS capability layer being embedded in a second application program; and a function implementation layer, configured to execute the target application function according to the function data, the function implementation layer being embedded in the second application program.
According to a third aspect of the present disclosure, there is provided an electronic device comprising: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method as provided by the first aspect.
According to a fourth aspect of the present disclosure, there is provided a non-transitory computer readable storage medium having stored thereon computer instructions for causing a computer to perform the method provided according to the first aspect.
According to a fifth aspect of the present disclosure, there is provided a computer program product comprising a computer program which, when executed by a processor, implements the method provided according to the first aspect.
The present disclosure runs an application program (e.g., the first application program) that implements a target application function through a JS capability layer and a function implementation layer embedded in a native application program (e.g., the second application program). Therefore, when the application program runs to implement the target application function, it has the characteristics of a web application and is more lightweight. Moreover, because the application program is embedded in the native application, when the native application is developed for both platforms to adapt to different operating systems, the embedded application program adapts to those operating systems as well, without requiring separate adaptation development, thereby reducing the development cost of the application program that implements the target application function.
It should be understood that the statements in this section are not intended to identify key or critical features of the embodiments of the present disclosure, nor are they intended to limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not to be construed as limiting the present disclosure. Wherein:
fig. 1 is a schematic flow chart of a data processing method according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram of a system architecture involved in a data processing method provided by an embodiment of the present disclosure;
fig. 3 is a schematic diagram of a data processing apparatus according to an embodiment of the disclosure;
FIG. 4 shows a schematic block diagram of an example electronic device 400 that may be used to implement embodiments of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below with reference to the accompanying drawings, in which various details of the embodiments of the disclosure are included to assist understanding, and which are to be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
The data processing method and apparatus provided by the present disclosure are suitable for scenarios in which an application program is run to implement a target application function. The data processing method provided by the present disclosure may be executed by a data processing apparatus, and the data processing apparatus may be implemented by software and/or hardware and configured in an electronic device, where the electronic device may be a mobile terminal (e.g., a mobile phone or tablet), a server, a computer, a vehicle-mounted device, a single-chip microcomputer, or another computing device, which is not limited herein.
First, the data processing method provided by the present disclosure will be described in detail below.
Currently, for child users, the traditional interaction forms in mobile applications are complex and unengaging. Many mobile applications therefore introduce game-type interactions for child users.
Generally, an application program that introduces game interaction is implemented as an independently developed application, so a separate application program must be developed and adapted to each operating system, which leads to high development cost.
To this end, the present disclosure provides a data processing method, including: a JS capability layer acquires resources required when a first application program runs to implement a target application function, the JS capability layer being embedded in a second application program; the JS capability layer generates function data corresponding to the target application function according to the resources; and a function implementation layer executes the target application function according to the function data, the function implementation layer being embedded in the second application program.
The present disclosure runs an application program (e.g., the first application program) that implements a target application function through a JS capability layer and a function implementation layer embedded in a native application program (e.g., the second application program). Therefore, when the application program runs to implement the target application function, it has the characteristics of a web application and is more lightweight. Moreover, because the application program is embedded in the native application, when the native application is developed for both platforms to adapt to different operating systems, the embedded application program adapts to those operating systems as well, without requiring separate adaptation development, thereby reducing the development cost of the application program that implements the target application function.
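To make the division of labour between the two layers concrete, the following is a minimal TypeScript sketch of the three steps (acquire resources, generate function data, execute the function). All class and interface names here (JsCapabilityLayer, FunctionImplementationLayer, and so on) are illustrative assumptions rather than identifiers from this disclosure.

```typescript
// Minimal sketch of the three-step flow, under assumed names.
interface Resource { uri: string; kind: "image" | "audio" | "video" | "config"; }
interface FunctionData { type: string; payload: unknown; }

// JS capability layer embedded in the second (native) application.
class JsCapabilityLayer {
  // S101: acquire the resources required to implement the target application function.
  async acquireResources(targetFunction: string): Promise<Resource[]> {
    // In practice these may be read from a local store or captured at run time.
    return [{ uri: `local://resources/${targetFunction}`, kind: "config" }];
  }

  // S102: generate function data corresponding to the target application function.
  generateFunctionData(targetFunction: string, resources: Resource[]): FunctionData {
    return { type: targetFunction, payload: resources };
  }
}

// Function implementation layer, also embedded in the second application.
class FunctionImplementationLayer {
  // S103: execute the target application function according to the function data.
  execute(data: FunctionData): void {
    console.log(`executing ${data.type} with`, data.payload);
  }
}

async function runTargetFunction(targetFunction: string): Promise<void> {
  const jsLayer = new JsCapabilityLayer();
  const implLayer = new FunctionImplementationLayer();
  const resources = await jsLayer.acquireResources(targetFunction);
  const data = jsLayer.generateFunctionData(targetFunction, resources);
  implLayer.execute(data);
}
```

In a real embedding, both layers would be wired into the second application's container rather than instantiated directly as shown here.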
Fig. 1 is a schematic flowchart of a data processing method according to an embodiment of the present disclosure. As shown in fig. 1, the method may include the following S101-S103.
S101, the JS capability layer acquires the resources required when the first application program runs to implement the target application function.
The JS capability layer is embedded in the second application program and is implemented in the JS (JavaScript) language.
As an example, the resources required when the first application program runs to implement the target application function may be preset resources needed to implement that function. These resources may be stored locally, so that the JS capability layer can read them directly from local storage.
Illustratively, when the target application function is a game-type interactive function, the resources may be interactive content resources (such as the images and controls involved in the interaction), and these resources may be stored locally in advance for the JS capability layer to acquire.
Optionally, in practical applications, a resource management module may further be provided in the JS capability layer to manage the above resources. For example, update downloading, version management, and storage management of the resources may be implemented by the resource management module. The resource management module may also perform authorization management on the right to use the resources, so that the resources a user may use when running the application program to implement the target application function are determined according to the user account.
As another example, the resource may be one that needs to be captured when the target application function is implemented (for example, if the target application function is shooting a video or a picture, the corresponding resource is the captured video or picture), in which case the JS capability layer acquires the resource by capturing it at run time.
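The snippet below is a minimal TypeScript sketch of the two acquisition paths just described (preset local resources versus resources captured at run time) together with the optional resource management duties. The ResourceManager class and its method names are illustrative assumptions, not identifiers from this disclosure.

```typescript
// Sketch of resource acquisition in the JS capability layer, assuming a
// hypothetical ResourceManager; method names and storage layout are illustrative.
interface Resource { uri: string; kind: "image" | "audio" | "video" | "config"; }

class ResourceManager {
  // Preset resources stored locally in advance (e.g. game interaction assets).
  private localStore = new Map<string, Resource[]>();

  // Authorization check against the user account before handing out resources.
  isAuthorized(userAccount: string, targetFunction: string): boolean {
    return userAccount.length > 0 && targetFunction.length > 0; // placeholder rule
  }

  // Path 1: read preset resources directly from local storage.
  readLocal(targetFunction: string): Resource[] {
    return this.localStore.get(targetFunction) ?? [];
  }

  // Path 2: capture the resource at run time (e.g. shoot a video or picture).
  async capture(kind: "image" | "video"): Promise<Resource> {
    // A fuller implementation would call into the native camera capability.
    return { uri: `captured://${kind}/${Date.now()}`, kind };
  }
}
```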
S102, the JS capability layer generates function data corresponding to the target application function according to the resources.
When generating the function data according to the resources, the generation may be carried out by modules with the corresponding functions provided in the JS capability layer.
For example, as shown in fig. 2, modules corresponding to different service requirements, such as game control, voice callback, page refresh, file reading, and points synchronization, may be provided in the JS capability layer. For example, the game control module may generate corresponding game interaction function data according to the resources, so that the subsequent function implementation layer can display the corresponding game interaction according to that function data, thereby implementing the corresponding target application function (i.e., a game-type interactive function). For another example, the voice callback module may generate corresponding voice function data according to the resources, or configure the audio settings, so as to implement the corresponding target application function (i.e., a voice interaction function). For another example, the page refresh module may generate refreshed page function data according to the resources, so as to implement the corresponding target application function (i.e., a page refresh function). For another example, the file reading module may read resources and use the read resources (such as images and videos) as function data, so as to implement the corresponding target application function (i.e., playing videos and images). As another example, the points synchronization module may generate points data according to the resources, so as to implement the corresponding target application function (i.e., points display).
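The following TypeScript sketch illustrates one way the JS capability layer could dispatch function-data generation to per-service modules such as those named above. The module registry and its method names are assumptions made for illustration only.

```typescript
// Sketch of dispatching resource processing to per-service modules
// (names such as gameControlModule are illustrative).
interface Resource { uri: string; kind: string; }
interface FunctionData { type: string; payload: unknown; }

interface CapabilityModule {
  handles(targetFunction: string): boolean;
  generate(resources: Resource[]): FunctionData;
}

const gameControlModule: CapabilityModule = {
  handles: (f) => f === "game-interaction",
  generate: (resources) => ({ type: "game-interaction", payload: resources }),
};

const fileReadingModule: CapabilityModule = {
  handles: (f) => f === "play-media",
  // Read resources are used directly as the function data.
  generate: (resources) => ({ type: "play-media", payload: resources }),
};

const modules: CapabilityModule[] = [gameControlModule, fileReadingModule];

function generateFunctionData(targetFunction: string, resources: Resource[]): FunctionData {
  const module = modules.find((m) => m.handles(targetFunction));
  if (!module) throw new Error(`no module for ${targetFunction}`);
  return module.generate(resources);
}
```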
S103, the function implementation layer executes the target application function according to the function data.
Wherein the function implementation layer is embedded in the second application program.
For example, after the JS capability layer generates the function data, it may send the function data to the function implementation layer, so that the function implementation layer executes the target application function according to the function data.
Optionally, in this embodiment of the present disclosure, the function implementation layer may be configured to output the function data of at least one application program, so as to present the corresponding implemented target application function to the user. For example, as shown in fig. 2, the function implementation layer may output the function data corresponding to application programs implementing different target application functions, such as video interactive courses, voice interactive courses, live interactive courses, and read-along interactive courses.
Optionally, when the function implementation layer outputs the function data, it may output the function data to the corresponding first application program. For example, when the function data is display data, it may be displayed in an interface of the first application program. When the function data is audio data, it may be passed through the audio playback interface of the first application program so that the first application program outputs the audio data.
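A minimal sketch of this output routing is shown below, assuming the function data carries a type tag distinguishing display data from audio data; the interface and method names are illustrative.

```typescript
// Sketch of routing function data to the first application program; the
// showInInterface / playThroughAudioInterface channels are assumed names.
interface FunctionData { type: "display" | "audio"; payload: unknown; }

interface FirstApplicationOutput {
  showInInterface(payload: unknown): void;            // display in the first app's interface
  playThroughAudioInterface(payload: unknown): void;  // first app's audio playback interface
}

function outputFunctionData(target: FirstApplicationOutput, data: FunctionData): void {
  if (data.type === "display") {
    target.showInInterface(data.payload);
  } else {
    target.playThroughAudioInterface(data.payload);
  }
}
```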
Optionally, before the JS capability layer obtains the resources required when the first application runs to implement the target application function, the method further includes:
the function implementation layer, in response to an operation input by the user, sends an instruction for implementing the target application function to the JS capability layer, wherein the operation is used for instructing the first application program to run to implement the target application function.
The target application function may be a response made to the user's operation, or may be a specific function.
Optionally, a configuration module may further be provided in the JS capability layer. The configuration module may store interaction rule configuration files corresponding to different user operations, so that the JS capability layer can generate the function data, through the configuration module, according to the resources and the user's operation and in line with the interaction rules in the corresponding configuration file.
In this way, when the user inputs a corresponding operation, the operation can be responded to and the corresponding target application function fed back to the user.
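The sketch below illustrates, under assumed names, how the function implementation layer might forward a user operation to the JS capability layer as an instruction, and how a configuration module could look up the matching interaction rule; neither the Instruction shape nor the rule keys come from this disclosure.

```typescript
// Sketch of reacting to a user operation via an instruction and an
// interaction-rule configuration lookup. All names are illustrative.
interface Instruction { targetFunction: string; operation: string; }
interface InteractionRule { operation: string; animation?: string; feedback: string; }

class ConfigurationModule {
  // Interaction-rule configuration files, keyed by operation.
  private rules = new Map<string, InteractionRule>([
    ["tap-card", { operation: "tap-card", animation: "flip", feedback: "reveal-card" }],
  ]);

  ruleFor(operation: string): InteractionRule | undefined {
    return this.rules.get(operation);
  }
}

class JsCapabilityLayer {
  constructor(private config = new ConfigurationModule()) {}

  onInstruction(instruction: Instruction): void {
    const rule = this.config.ruleFor(instruction.operation);
    // Generate function data according to the resources and the matched rule.
    console.log(`generate data for ${instruction.targetFunction} using`, rule);
  }
}

// The function implementation layer forwards the user's operation as an instruction.
function onUserOperation(jsLayer: JsCapabilityLayer, operation: string): void {
  jsLayer.onInstruction({ targetFunction: "game-interaction", operation });
}
```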
Optionally, the function implementation layer executing the target application function according to the function data includes:
the function implementation layer executes the target application function according to the function data through the game engine framework.
For example, the game engine framework may employ one or more engines and functional modules, such as a conventional H5 game engine.
For example, as shown in fig. 2, the game engine framework may include modules such as a webview reuse pool, offline download, transition animation, window mode, and cocos engine support, so that the function data is processed by the game engine framework and the processed function data is output by the function implementation layer. For example, function data may be loaded through the webview reuse pool to be output by the function implementation layer. For another example, transition animations, window configurations, rendering resources and the like can be downloaded and managed through the offline download module, so that other modules can conveniently call the corresponding resources to process the function data (e.g., render it or add transition animations). For another example, the function data may be processed by the transition animation module to add a corresponding transition animation, so that the function implementation layer outputs a picture with the transition animation. For another example, the function data may be processed by the window mode module to obtain function data adapted to the window mode, so that the function implementation layer can output the corresponding function data in that window mode. For another example, the function data may be processed with cocos engine support, so that the interactive function is implemented and then output through the function implementation layer.
In this way, the function data can be better processed through the game engine and more conveniently output by the function implementation layer. Moreover, better game interaction can be provided through the game engine framework.
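As a rough illustration of how function data might pass through such a framework, the sketch below models a webview reuse pool, a window mode switch, and a transition animation step. The framework API shown is an assumption, not the cocos or H5 engine API.

```typescript
// Sketch of function data passing through a game-engine-style framework with a
// webview reuse pool, window mode, and transition animation; names are illustrative.
interface FunctionData { type: string; payload: unknown; }

class WebviewPool {
  private idle: string[] = ["webview-1", "webview-2"];
  acquire(): string { return this.idle.pop() ?? "webview-new"; }
  release(id: string): void { this.idle.push(id); }
}

class GameEngineFramework {
  constructor(private pool = new WebviewPool()) {}

  run(data: FunctionData, options: { windowMode?: boolean; transition?: string } = {}): void {
    const webview = this.pool.acquire();                 // reuse a webview if one is idle
    if (options.transition) {
      console.log(`playing transition animation: ${options.transition}`);
    }
    const mode = options.windowMode ? "window" : "fullscreen";
    console.log(`loading ${data.type} into ${webview} in ${mode} mode`);
    this.pool.release(webview);
  }
}
```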
Optionally, the second application program includes an end capability layer, and the end capability layer is configured to provide a call interface to the functional modules of the second application program; the JS capability layer acquiring the resources required when the first application program runs to implement the target application function includes:
the JS capability layer calls the end capability layer, and acquires, through a functional module of the second application program, the resources required when the first application program runs to implement the target application function.
Optionally, when the resource to be acquired is one that needs to be captured in real time rather than a pre-stored resource, the corresponding resource may be acquired by calling the end capability layer and using the corresponding functional module of the second application program.
For example, as shown in fig. 2, the end capability layer may include modules such as a camera module, a voice recording module, and a file storage module. Therefore, when a video or image needs to be shot by the camera to obtain a video or image resource, the camera module in the end capability layer may be called, and the video or image is shot through the camera module of the second application program to obtain the video or image resource. When voice needs to be recorded to obtain a voice resource, the voice recording module in the end capability layer may be called, and the voice is recorded through the voice recording module of the second application program to obtain the voice resource. When other resources stored in the second application program or locally on the electronic device need to be acquired, the file storage module in the end capability layer may be called, and those resources are acquired through the file storage module of the second application program.
In this way, when resources such as shot video or recorded voice corresponding to the target application function need to be acquired, the corresponding resources can be obtained through the end capability layer. This saves development work in the JS capability layer and allows complex native capabilities to be called to implement relatively complex functions.
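The sketch below shows one possible shape of the call from the JS capability layer into the end capability layer for resource acquisition; the bridge interface (invoke) and the module and action identifiers are assumptions for illustration, not part of this disclosure.

```typescript
// Sketch of acquiring resources through the second application's native
// modules via the end capability layer; interface and identifiers are assumed.
interface Resource { uri: string; kind: "image" | "video" | "audio" | "file"; }

// Call interface exposed by the end capability layer of the second application.
interface EndCapabilityLayer {
  invoke(module: "camera" | "voiceRecording" | "fileStorage", action: string): Promise<Resource>;
}

class JsCapabilityLayer {
  constructor(private endLayer: EndCapabilityLayer) {}

  // Shoot a video through the second application's camera module.
  captureVideo(): Promise<Resource> {
    return this.endLayer.invoke("camera", "shootVideo");
  }

  // Record voice through the second application's voice recording module.
  recordVoice(): Promise<Resource> {
    return this.endLayer.invoke("voiceRecording", "record");
  }

  // Read a resource stored by the second application or the device.
  readStoredFile(path: string): Promise<Resource> {
    return this.endLayer.invoke("fileStorage", `read:${path}`);
  }
}
```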
Illustratively, the target application function is a shooting function; the JS capability layer calling the end capability layer and acquiring, through the functional module of the second application program, the resources required when the first application program runs to implement the target application function may include: the JS capability layer calls the end capability layer, and acquires, through the camera function module of the second application program, the resources required when the first application program runs to implement the shooting function.
In this way, when a shot video resource corresponding to the shooting function needs to be acquired, the corresponding resource (i.e., a video shot by the camera) can be obtained through the camera function in the end capability layer. This saves development work in the JS capability layer and allows the capabilities of the second application program to be called to implement a relatively complex shooting function.
Optionally, the second application program includes an end capability layer, and the end capability layer is configured to provide a call interface to the functional modules of the second application program; the JS capability layer generating the function data corresponding to the target application function according to the resources includes:
the JS capability layer calls the end capability layer, and generates, through a functional module of the second application program, the function data corresponding to the target application function according to the resources.
Optionally, when the acquired resources need to be processed to generate the function data, or a function of the second application program needs to be used to implement the target application function, the function data may be generated by calling the end capability layer and using the corresponding functional module of the second application program.
For example, as shown in fig. 2, the end capability layer may include modules such as voice recognition, points change, and sharing. Therefore, when voice recognition needs to be performed on an acquired audio resource to output a recognition result, the voice recognition module in the end capability layer may be called, and the voice recognition module of the second application program performs voice recognition on the audio resource, so that the recognition result serves as the function data. When points need to be changed according to an acquired resource (such as account information), the points change module in the end capability layer may be called, and new points data is generated as the function data by the points change module of the second application program according to the acquired resource. When acquired resources (such as the points of a user account or images saved by the user) need to be shared, the sharing module in the end capability layer may be called, and the sharing module of the second application program generates the function data for sharing the acquired resources.
In this way, when the acquired resources need to be processed to generate function data, or a function of the second application program needs to be used to implement the target application function, the processing of the corresponding resources and the generation of the function data can be carried out through the end capability layer. This saves development work in the JS capability layer and allows complex native capabilities to be called to implement relatively complex functions.
Illustratively, the target application function is a voice recognition function, and the resource required when the first application program runs to implement the target application function is a voice resource to be recognized; the JS capability layer calling the end capability layer and generating, through the functional module of the second application program, the function data corresponding to the target application function according to the resources may include: the JS capability layer calls the end capability layer, and generates, through the voice recognition function module of the second application program, the function data corresponding to the voice recognition function according to the voice resource to be recognized.
In this way, when the voice recognition function requires function data generated by performing voice recognition on the acquired resources (such as voice resources), the voice recognition of the corresponding voice resources can be performed through the voice recognition function module in the end capability layer to generate the function data. This saves development work in the JS capability layer and allows complex voice recognition capabilities to be called to implement a relatively complex voice recognition function.
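A minimal sketch of the voice recognition case, assuming a hypothetical recognizeSpeech call exposed by the end capability layer, is given below; the names are illustrative and not the disclosure's identifiers.

```typescript
// Sketch of generating function data for the voice recognition function by
// calling a hypothetical recognizeSpeech capability of the end capability layer.
interface Resource { uri: string; kind: string; }
interface FunctionData { type: string; payload: unknown; }

interface EndCapabilityLayer {
  recognizeSpeech(audio: Resource): Promise<string>;
}

class JsCapabilityLayer {
  constructor(private endLayer: EndCapabilityLayer) {}

  // The recognized text becomes the function data for the voice recognition function.
  async generateVoiceRecognitionData(voiceToRecognize: Resource): Promise<FunctionData> {
    const recognizedText = await this.endLayer.recognizeSpeech(voiceToRecognize);
    return { type: "voice-recognition", payload: recognizedText };
  }
}
```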
Optionally, the second application program includes an end capability layer, and the end capability layer is configured to provide a call interface to the functional modules of the second application program; the function implementation layer executing the target application function according to the function data includes:
the function implementation layer calls the end capability layer, and executes the target application function according to the function data through a functional module of the second application program.
Optionally, when the function data to be output is video data or audio data, the output of the video or audio data may be implemented by calling the end capability layer and using the corresponding functional module of the second application program.
For example, as shown in fig. 2, the end capability layer may include modules such as video playback and audio playback. Therefore, when video data needs to be played, the video playback module in the end capability layer may be called, so that the video playback module of the second application program plays the video data. When audio data needs to be played, the audio playback module in the end capability layer may be called, so that the audio playback module of the second application program plays the audio data.
In this way, when the function data to be output is video data or audio data, the corresponding function data can be output by calling the capabilities of the second application program. This saves development work in the JS capability layer and allows complex native capabilities to be called to implement relatively complex functions.
Illustratively, the target application function is a video playing function, and the function data corresponding to the target application function is video data to be played; the function implementation layer calling the end capability layer and executing the target application function according to the function data through the functional module of the second application program includes:
the function implementation layer calls the end capability layer, and executes the video playing function according to the video data to be played through the video playing function module of the second application program.
In this way, when the function data to be output is video data, the corresponding video playing function can be implemented by calling the video playing function module of the second application program. This saves development work in the JS capability layer and allows the video playback capability of the second application program to be called to implement a relatively complex video playing function.
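The sketch below illustrates the video playback case under assumed names: the function implementation layer hands the video data to the end capability layer, which delegates to the second application's playback modules.

```typescript
// Sketch of executing the video playing function through the second
// application's playback modules via the end capability layer; names are illustrative.
interface FunctionData { kind: "video" | "audio"; uri: string; }

interface EndCapabilityLayer {
  playVideo(uri: string): Promise<void>;  // second app's video playback module
  playAudio(uri: string): Promise<void>;  // second app's audio playback module
}

class FunctionImplementationLayer {
  constructor(private endLayer: EndCapabilityLayer) {}

  async execute(data: FunctionData): Promise<void> {
    if (data.kind === "video") {
      await this.endLayer.playVideo(data.uri);
    } else {
      await this.endLayer.playAudio(data.uri);
    }
  }
}
```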
In an exemplary embodiment, an embodiment of the present disclosure further provides a data processing apparatus, which may be used to implement the data processing method described in the foregoing embodiment.
Fig. 3 is a schematic composition diagram of a data processing apparatus according to an embodiment of the present disclosure.
As shown in fig. 3, the data processing apparatus includes:
a JS capability layer 301, configured to acquire resources required when the first application program runs to implement the target application function, and to generate function data corresponding to the target application function according to the resources, wherein the JS capability layer is embedded in a second application program;
and a function implementation layer 302, configured to execute the target application function according to the function data, wherein the function implementation layer is embedded in the second application program.
In some possible implementations, the function implementation layer 302 is further configured to send, in response to an operation input by the user, an instruction for implementing the target application function to the JS capability layer, where the operation is used to instruct the first application program to run to implement the target application function.
In some possible implementations, the function implementation layer 302 is specifically configured to execute the target application function according to the function data through the game engine framework.
In some possible implementations, the second application includes an end capability layer 303, where the end capability layer 303 is configured to provide a function module call interface of the second application; the JS capability layer 301 is specifically configured to call the end capability layer 303 and acquire, through a functional module of the second application, the resources required when the first application runs to implement the target application function.
In some possible implementations, the target application function is a shooting function; the JS capability layer 301 is specifically configured to invoke the end capability layer 303, and acquire, through the camera function module of the second application, a resource required by the first application when the first application runs to implement the shooting function.
In some possible implementations, the second application includes an end capability layer 303, where the end capability layer 303 is configured to provide a function module call interface of the second application; the JS capability layer 301 is specifically configured to invoke the end capability layer 303, and generate, according to the resource, functional data corresponding to the target application function through the functional module of the second application program.
In some possible implementation manners, the target application function is a voice recognition function, and the resource required when the first application program runs to realize the target application function is a voice resource to be recognized; the JS capability layer 301 is specifically configured to invoke the end capability layer 303, and generate, by using the voice recognition function module of the second application program, function data corresponding to the voice recognition function according to the voice resource to be recognized.
In some possible implementations, the second application includes an end capability layer 303, where the end capability layer 303 is configured to provide a function module call interface of the second application; the function implementation layer 302 is specifically configured to call the end capability layer 303 and execute the target application function according to the function data through a functional module of the second application.
In some possible implementation manners, the target application function is a video playing function, and the functional data corresponding to the target application function is video data to be played; the function implementation layer 302 is specifically configured to invoke the end capability layer 303, and execute a video playing function according to video data to be played through a video playing function module of the second application program.
In the technical solutions of the present disclosure, the collection, storage, and use of users' personal information comply with the relevant laws and regulations and do not violate public order and good morals.
The present disclosure also provides an electronic device, a readable storage medium, and a computer program product according to embodiments of the present disclosure.
In an exemplary embodiment, an electronic device includes: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of the above embodiments.
In an exemplary embodiment, the readable storage medium may be a non-transitory computer readable storage medium storing computer instructions for causing the computer to perform the method according to the above embodiment.
In an exemplary embodiment, the computer program product comprises a computer program which, when being executed by a processor, carries out the method according to the above embodiments.
FIG. 4 shows a schematic block diagram of an example electronic device 400 that may be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular phones, smart phones, wearable devices, in-vehicle devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 4, the apparatus 400 includes a computing unit 401 that can perform various appropriate actions and processes according to a computer program stored in a Read Only Memory (ROM) 402 or a computer program loaded from a storage unit 408 into a Random Access Memory (RAM) 403. In the RAM 403, various programs and data necessary for the operation of the device 400 can also be stored. The computing unit 401, ROM 402, and RAM 403 are connected to each other via a bus 404. An input/output (I/O) interface 405 is also connected to bus 404.
A number of components in device 400 are connected to I/O interface 405, including: an input unit 406 such as a keyboard, a mouse, or the like; an output unit 407 such as various types of displays, speakers, and the like; a storage unit 408 such as a magnetic disk, optical disk, or the like; and a communication unit 409 such as a network card, modem, wireless communication transceiver, etc. The communication unit 409 allows the device 400 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The computing unit 401 may be a variety of general and/or special purpose processing components with processing and computing capabilities. Some examples of the computing unit 401 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various dedicated Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, and so forth. The computing unit 401 executes the respective methods and processes described above, such as a data processing method. For example, in some embodiments, the data processing method may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as storage unit 408. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 400 via the ROM 402 and/or the communication unit 409. When the computer program is loaded into RAM 403 and executed by computing unit 401, one or more steps of the data processing method described above may be performed. Alternatively, in other embodiments, the computing unit 401 may be configured to perform the data processing method by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuitry, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program code, when executed by the processor or controller, causes the functions/acts specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: Local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, a server of a distributed system, or a server combining a blockchain.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present disclosure may be executed in parallel or sequentially or in different orders, and are not limited herein as long as the desired results of the technical solutions disclosed in the present disclosure can be achieved.
The above detailed description should not be construed as limiting the scope of the disclosure. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present disclosure should be included in the protection scope of the present disclosure.

Claims (21)

1. A data processing method, comprising:
a JS capability layer acquires resources required when a first application program runs to implement a target application function, wherein the JS capability layer is embedded in a second application program;
the JS capability layer generates function data corresponding to the target application function according to the resources;
and a function implementation layer executes the target application function according to the function data, wherein the function implementation layer is embedded in the second application program.
2. The method of claim 1, wherein before the JS capability layer acquires the resources required when the first application program runs to implement the target application function, the method further comprises:
the function implementation layer, in response to an operation input by a user, sends an instruction for implementing the target application function to the JS capability layer, wherein the operation is used for instructing the first application program to run to implement the target application function.
3. The method according to claim 1 or 2, wherein the function implementation layer executing the target application function according to the function data comprises:
the function implementation layer executes the target application function according to the function data through a game engine framework.
4. The method according to any one of claims 1 to 3, wherein the second application program comprises an end capability layer for providing a function module call interface of the second application program; the JS capability layer acquiring the resources required when the first application program runs to implement the target application function comprises:
the JS capability layer calls the end capability layer, and acquires, through a functional module of the second application program, the resources required when the first application program runs to implement the target application function.
5. The method of claim 4, wherein the target application function is a shooting function; the JS capability layer calling the end capability layer and acquiring, through the functional module of the second application program, the resources required when the first application program runs to implement the target application function comprises:
the JS capability layer calls the end capability layer, and acquires, through the camera function module of the second application program, the resources required when the first application program runs to implement the shooting function.
6. The method according to any one of claims 1 to 3, wherein the second application program comprises an end capability layer for providing a function module call interface of the second application program; the JS capability layer generating the function data corresponding to the target application function according to the resources comprises:
the JS capability layer calls the end capability layer, and generates, through a functional module of the second application program, the function data corresponding to the target application function according to the resources.
7. The method according to claim 6, wherein the target application function is a voice recognition function, and the resource required when the first application program runs to implement the target application function is a voice resource to be recognized; the JS capability layer calling the end capability layer and generating, through the functional module of the second application program, the function data corresponding to the target application function according to the resources comprises:
the JS capability layer calls the end capability layer, and generates, through the voice recognition function module of the second application program, the function data corresponding to the voice recognition function according to the voice resource to be recognized.
8. The method according to any one of claims 1 to 3, wherein the second application program comprises an end capability layer for providing a function module call interface of the second application program; the function implementation layer executing the target application function according to the function data comprises:
the function implementation layer calls the end capability layer, and executes the target application function according to the function data through a functional module of the second application program.
9. The method according to claim 8, wherein the target application function is a video playing function, and the function data corresponding to the target application function is video data to be played; the function implementation layer calling the end capability layer and executing the target application function according to the function data through the functional module of the second application program comprises:
the function implementation layer calls the end capability layer, and executes the video playing function according to the video data to be played through the video playing function module of the second application program.
10. A data processing apparatus, comprising:
a JS capability layer, configured to acquire resources required when a first application program runs to implement a target application function, and to generate function data corresponding to the target application function according to the resources, wherein the JS capability layer is embedded in a second application program;
and a function implementation layer, configured to execute the target application function according to the function data, wherein the function implementation layer is embedded in the second application program.
11. The apparatus of claim 10, wherein the function implementation layer is further configured to send, in response to an operation input by a user, an instruction for implementing the target application function to the JS capability layer, wherein the operation is used to instruct the first application program to run to implement the target application function.
12. The apparatus according to claim 10 or 11, wherein the function implementation layer is specifically configured to execute the target application function according to the function data through a game engine framework.
13. The apparatus according to any one of claims 10 to 12, wherein the second application program comprises an end capability layer for providing a function module call interface of the second application program; the JS capability layer is specifically configured to call the end capability layer and acquire, through a functional module of the second application program, the resources required when the first application program runs to implement the target application function.
14. The apparatus of claim 13, wherein the target application function is a shooting function; the JS capability layer is specifically configured to call the end capability layer and acquire, through the camera function module of the second application program, the resources required when the first application program runs to implement the shooting function.
15. The apparatus according to any one of claims 10 to 12, wherein the second application program comprises an end capability layer for providing a function module call interface of the second application program; the JS capability layer is specifically configured to call the end capability layer and generate, through a functional module of the second application program, the function data corresponding to the target application function according to the resources.
16. The apparatus according to claim 15, wherein the target application function is a voice recognition function, and the resource required when the first application program runs to implement the target application function is a voice resource to be recognized; the JS capability layer is specifically configured to call the end capability layer and generate, through the voice recognition function module of the second application program, the function data corresponding to the voice recognition function according to the voice resource to be recognized.
17. The apparatus according to any one of claims 10 to 12, wherein the second application comprises an end capability layer, the end capability layer being configured to provide a function module call interface of the second application; the function implementation layer is specifically configured to invoke the end capability layer, and execute the target application function according to the function data through a function module of the second application program.
18. The apparatus according to claim 17, wherein the target application function is a video playing function, and the function data corresponding to the target application function is video data to be played; the function implementation layer is specifically configured to call the end capability layer and execute, through the video playing function module of the second application program, the video playing function according to the video data to be played.
19. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein the content of the first and second substances,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-9.
20. A non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of any one of claims 1-9.
21. A computer program product comprising a computer program which, when executed by a processor, implements the method according to any one of claims 1-9.
CN202211518525.5A 2022-11-29 2022-11-29 Data processing method, device, equipment and storage medium Pending CN115809046A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211518525.5A CN115809046A (en) 2022-11-29 2022-11-29 Data processing method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211518525.5A CN115809046A (en) 2022-11-29 2022-11-29 Data processing method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115809046A true CN115809046A (en) 2023-03-17

Family

ID=85484490

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211518525.5A Pending CN115809046A (en) 2022-11-29 2022-11-29 Data processing method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115809046A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination