CN114371844B - APP development platform, APP development method and electronic equipment - Google Patents

APP development platform, APP development method and electronic equipment

Info

Publication number: CN114371844B (grant of application publication CN114371844A)
Application number: CN202111556609.3A
Authority: CN (China)
Legal status: Active (granted)
Inventor: 胡绍平 (Hu Shaoping)
Current and original assignee: Huawei Technologies Co., Ltd.
Other languages: Chinese (zh)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 8/00: Arrangements for software engineering
    • G06F 8/30: Creation or generation of source code
    • G06F 8/36: Software reuse
    • G06F 8/40: Transformation of program code
    • G06F 8/41: Compilation
    • G06F 8/70: Software maintenance or management
    • G06F 8/76: Adapting program code to run in a different environment; Porting

Abstract

The application discloses an APP development method and an APP development platform. The method comprises the following steps: a user selects a component from a component toolbox of an APP development platform installed in an electronic device. The component orchestration designer can create components and display their composition; a component may consist of a component body and one or more connection points, each connection point supporting one or more data entity types. The user connects a first component and a second component in the interface of the component orchestration designer; in response to this operation, the component orchestration designer connects the first component and the second component. The user then selects, in the component orchestration designer, to compile the connected components; in response to this operation, the code generation engine compiles the connected components displayed in the component orchestration designer into executable code of the APP. In this way, developers can quickly combine multiple existing components into the APP to be developed, realizing the APP's functional logic without writing program code line by line.

Description

APP development platform, APP development method and electronic equipment
Technical Field
The invention relates to the technical field of electronics, in particular to an APP development platform, an APP development method and electronic equipment.
Background
Currently, various electronic devices (e.g., mobile phones, tablets, computers, in-vehicle head units, smartwatches, etc.) can install applications (APPs). Developers need to develop various APPs for these electronic devices. Generally, for the same APP, developers need to write a separate set of program code for electronic devices with different operating systems or different device configurations. For example, for the same video APP, a developer needs to write one set of program code for an Android phone, another for an iOS phone, another for a computer, and so on; code written for one electronic device cannot be directly reused on another electronic device with a different operating system.
In addition, the program code of many APPs is written in different programming languages, so developers cannot reuse the program code of existing APPs to implement a given function when developing a new APP. For example, both an existing camera APP and an existing beauty-camera APP have a photographing function. If a developer needs to develop an APP with a photographing function, the developer cannot directly reuse the photographing-related program code of those existing APPs, because the programming languages or interfaces they use differ. Thus, a developer cannot conveniently develop a new APP by leveraging program code that already implements certain functions in existing APPs, and developing an APP consumes a great deal of time.
Therefore, how a development platform can improve APP development efficiency and enable new APPs to be developed rapidly is an urgent problem to be solved.
Disclosure of Invention
The application provides an APP development platform, an APP development method, and an electronic device. With the APP development platform provided by this application, the user can select components to connect within the platform, and the platform verifies whether the components selected by the user can be connected. After the selected components are successfully connected, the user can choose, within the platform, to compile the successfully connected components into APP executable source code. In this way, the user can develop an APP quickly.
In a first aspect, the application provides an APP development platform applied to an electronic device. The APP development platform comprises a component toolbox, a component orchestration designer, and a code generation engine. The component toolbox is used to provide components; a component is an independent module that implements a specific function and consists of a component body and one or more connection points, each connection point supporting one or more data entity types. The component orchestration designer is used to display components and to connect two or more components according to the user's connection operations. The code generation engine is used to generate the two or more components connected in the component orchestration designer into executable source code of a first APP, where the first APP comprises the two or more components.
A data entity is the data that a connection point can support. The data entity type is the type of the data entity; data entity types may include audio, video, image, text, and so on.
With the APP development platform provided by this application, which is installed in the electronic device, the user selects a component from the component toolbox. The component orchestration designer can create components and display their composition; a component may consist of a component body and one or more connection points. The user connects a first component and a second component in the interface of the component orchestration designer; in response to this operation, the component orchestration designer connects the first component and the second component. The user then selects, in the component orchestration designer, to compile the connected components; in response, the code generation engine compiles the connected components displayed in the component orchestration designer into executable code of the APP. In this way, developers can quickly combine multiple existing components into the APP to be developed, realizing the APP's functional logic without writing program code line by line.
In one possible implementation, the component orchestration designer is further configured to: verify whether the first component and the second component match in response to an operation of the user connecting the first component and the second component; and if the first component matches the second component, connect a first connection point and a second connection point, where the first connection point is a connection point of the first component and the second connection point is a connection point of the second component. In this way, a mismatch between the data formats of the first component and the second component can be avoided.
In one possible implementation, the first component matching the second component includes: the first data entity type is the same as the second data entity type, the first data entity type comprises the second data entity type, or the second data entity type comprises the first data entity type, where the first data entity type is the type of data entity supported by the first connection point and the second data entity type is the type of data entity supported by the second connection point.
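As a minimal illustrative sketch (not part of the patent text), the matching rule above could be checked as follows; the ConnectPoint class and its field are assumptions introduced here for illustration:

```java
import java.util.Set;

// Hypothetical model: a connection point advertises the set of data
// entity types it supports, e.g. "video/MPEG4" or "audio/MP3".
class ConnectPoint {
    final Set<String> supportedEntityTypes;

    ConnectPoint(Set<String> types) { this.supportedEntityTypes = types; }

    // Two connection points match when their supported type sets are
    // identical, or one set includes the other, mirroring the three
    // cases listed above.
    boolean matches(ConnectPoint other) {
        Set<String> a = this.supportedEntityTypes;
        Set<String> b = other.supportedEntityTypes;
        return a.equals(b) || a.containsAll(b) || b.containsAll(a);
    }
}
```

For example, an output connection point supporting {"video/MPEG", "video/MPEG4"} would match an input connection point supporting {"video/MPEG4"}, since the former set includes the latter.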
In one possible implementation, the component toolbox is further configured to: in response to an operation of the user uploading a component to, or downloading a component from, the component market, display the name of the uploaded or downloaded component. In this way, the APP development platform can provide more components to the user, improving user experience.
In one possible implementation, the component orchestration designer is specifically configured to: display a connection line connecting the first connection point and the second connection point. In this way, the user may be prompted that the first connection point and the second connection point are successfully connected.
In one possible implementation, the component orchestration designer is specifically configured to: the first connection point and the second connection point are displayed in an overlapping manner. In this way, the user may be prompted that the first connection point and the second connection point are successfully connected.
In one possible implementation, the component orchestration designer is specifically configured to: the first component is displayed in accordance with an operation of the user selecting the first component from the component toolbox.
In one possible implementation, the component orchestration designer is further configured to: in response to an operation of the user selecting intelligent orchestration for the first component, display a component call tree of the first component. The component call tree is used to show a second component and/or a third component matching the first component, a fourth component and/or a fifth component matching the second component, and so on up to an Nth component matching an Mth component, where the Nth component is a component without an output connection point and M and N are positive integers. In this way, when the user does not know which components match the first component, components that can connect to the first component can be intelligently recommended, saving the user's time and improving user experience. A sketch of building such a tree is given below.
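As an illustrative sketch only, a component call tree of this kind might be built recursively; Component, ComponentRepository, and all method names here are assumptions, not the patent's API:

```java
import java.util.ArrayList;
import java.util.List;

// Assumed view of a component for tree-building purposes.
interface Component {
    String name();
    boolean hasOutputConnectPoint();
}

// Assumed lookup service returning components whose input connection
// points match the given component's output connection points.
interface ComponentRepository {
    List<Component> findMatching(Component component);
}

class CallTreeNode {
    final String componentName;
    final List<CallTreeNode> children = new ArrayList<>();
    CallTreeNode(String name) { this.componentName = name; }
}

class CallTreeBuilder {
    // Recursion stops at components without an output connection point,
    // which form the leaves of the call tree, as described above.
    static CallTreeNode build(Component root, ComponentRepository repo) {
        CallTreeNode node = new CallTreeNode(root.name());
        if (!root.hasOutputConnectPoint()) {
            return node;
        }
        for (Component candidate : repo.findMatching(root)) {
            node.children.add(build(candidate, repo));
        }
        return node;
    }
}
```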
In a possible implementation manner, the displaying of the component call tree of the first component specifically includes: the component call tree of the first component is displayed according to the function of the first component and/or the type of data entity supported by the connection point of the first component. In this way, the component that the first component matches can be recommended more accurately.
In one possible implementation, the component orchestration designer is further configured to: in response to a user operation of deleting a component, delete the component from the component call tree.
In one possible implementation, the component orchestration designer is further configured to: save an orchestration model diagram of the two or more connected components, together with first information in the orchestration model diagram; the first information comprises one or more of the IDs and names of the two or more components and the data entity types supported by the connection points of the two or more components.
In one possible implementation, the code generation engine is specifically configured to: generate executable source code of the first APP according to the orchestration model diagram, the first information, and a component invocation template, where the component invocation template comprises program code in a preset format. The component invocation template is a code template preset according to different types of components and the attributes of their connection points, and encapsulates generic interface invocation code logic.
In a second aspect, the present application provides an APP development method, including: in response to an operation of a user selecting a component from a component toolbox of an electronic device, the electronic device displays the component in a component orchestration designer, the component being an independent module that implements a specific function; the component consists of a component body and one or more connection points, each connection point supporting one or more data entity types. In response to a user operation connecting multiple components, the electronic device connects two or more components in the component orchestration designer. In response to a user operation selecting to compile the two or more components, the electronic device generates, in the code generation engine, executable source code of the first APP from the two or more connected components.
Here, a data entity is data that a connection point can support. The data entity type is the type of the data entity; data entity types may include audio, video, image, text, and so on.
In this way, the user can obtain an APP by connecting existing components, without rewriting the APP's code, which saves the user time in developing the APP.
In one possible implementation, the electronic device connecting two or more components in the component orchestration designer specifically includes: in response to an operation of the user connecting the first component and the second component, the electronic device verifies, through the component orchestration designer, whether the first component and the second component match; if the first component and the second component match, the electronic device connects a first connection point and a second connection point, where the first connection point is a connection point of the first component and the second connection point is a connection point of the second component. In this way, it can be ensured that the two components connected by the user match.
In one possible implementation, the first component matching the second component includes: the first data entity type is the same as the second data entity type, the first data entity type comprises the second data entity type, or the second data entity type comprises the first data entity type, where the first data entity type is the type of data entity supported by the first connection point and the second data entity type is the type of data entity supported by the second connection point.
In one possible implementation, the method further includes: in response to a user operation to view the first connection point properties of the first component, the electronic device displays, in the component orchestration designer, data entity types supported by the first connection point. In this way, the user can know the property of the first connection point of the first component to facilitate the user's subsequent operations, such as searching for a second component matching the first component based on the property of the first connection point.
In one possible implementation, the electronic device connecting the first connection point and the second connection point includes: displaying a connecting line connecting the first connecting point and the second connecting point; alternatively, the first connection point and the second connection point are overlapped. In this way, the user may be prompted that the first connection point and the second connection point have established a connection.
In one possible implementation, the method further includes: in response to an operation of the user selecting intelligent orchestration for the first component, the electronic device displays a component call tree of the first component in the component orchestration designer. The component call tree is used to show a second component and/or a third component matching the first component, a fourth component and/or a fifth component matching the second component, and so on up to an Nth component matching an Mth component, where the Nth component is a component without an output connection point and M and N are positive integers. In this way, when the user does not know which components match the first component, components that can connect to the first component can be intelligently recommended, saving the user's time and improving user experience.
In a possible implementation manner, the electronic device displays a component call tree of a first component in a component orchestration designer, specifically including: the component call tree of the first component is displayed according to the function of the first component and/or the type of data entity supported by the connection point of the first component. In this way, the component that the first component matches can be recommended more accurately.
In one possible implementation, the method further includes: in response to a user operation of deleting a component, the electronic device deletes the component from the component call tree. In this way, the user can delete components in the component call tree that are not related to the first APP.
In one possible implementation, the method further includes: the electronic device stores, in the component orchestration designer, an orchestration model diagram of the two or more connected components and first information in the orchestration model diagram; the first information comprises one or more of the IDs and names of the two or more components and the data entity types supported by the connection points of the two or more components.
In one possible implementation, the method further includes: the electronic device generates, in the code generation engine, executable source code of the first APP according to the orchestration model diagram, the first information, and a component invocation template, where the component invocation template comprises program code in a preset format. The component invocation template is a code template preset according to different types of components and the attributes of their connection points, and encapsulates generic interface invocation code logic.
In a third aspect, the present application provides an electronic device, comprising: one or more processors, one or more memories; the one or more memories are respectively coupled with the one or more processors; the one or more memories for storing computer program code, the computer program code including computer instructions; when the computer instructions are executed on the processor, the electronic device is caused to execute the APP development method in any one of the possible implementations of the above aspects.
In a fourth aspect, an embodiment of the present application provides a computer storage medium comprising computer instructions; when the computer instructions are run on an electronic device, the electronic device is caused to execute the APP development method in any one of the possible implementations of the foregoing aspects.
In a fifth aspect, the present application provides a computer program product, which when run on a computer, causes the computer to execute the APP development method in any one of the possible implementations of any one of the above aspects.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings required to be used in the embodiments of the present application will be described below.
Fig. 1A is a schematic diagram of a video decoding component according to an embodiment of the present application;
Fig. 1B is a schematic diagram of a composite component according to an embodiment of the present application;
Fig. 2 is a schematic architecture diagram of an APP development platform 10 according to an embodiment of the present application;
Fig. 3 is a schematic diagram of a component toolbox 101 provided by an embodiment of the present application;
Fig. 4 is a schematic diagram of a user interface provided by an embodiment of the present application;
Figs. 5A-5G are schematic diagrams of user interfaces provided by embodiments of the present application;
Figs. 6A-6B are schematic diagrams of user interfaces provided by embodiments of the present application;
Figs. 7A-7B are schematic diagrams of user interfaces provided by embodiments of the present application;
Figs. 8A-8D are schematic diagrams of user interfaces provided by embodiments of the present application;
Fig. 9 is a schematic flowchart of an APP development method according to an embodiment of the present application;
Fig. 10 is a schematic diagram of an application scenario provided by an embodiment of the present application;
Fig. 11 is a schematic diagram of the component development ecosystem provided by an embodiment of the present application;
Fig. 12 is a diagram illustrating the mapping relationship between a component's domain description language and its component graphic according to an embodiment of the present application;
Fig. 13 is a schematic diagram of the hardware structure of an electronic device according to an embodiment of the present application;
Fig. 14 is a schematic block diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. In the description of the embodiments herein, "/" means "or" unless otherwise specified; for example, A/B may mean A or B. "And/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A alone, both A and B, or B alone. In addition, in the description of the embodiments of the present application, "a plurality" means two or more.
In the following, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature; in the description of embodiments of the application, unless stated otherwise, "plurality" means two or more. Furthermore, the terms "including" and "having," and any variations thereof, as used in the description of the present application, are intended to cover non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus. It should be noted that in the embodiments of the present application, words such as "exemplary" or "for example" are used to indicate examples, illustrations, or explanations. Any embodiment or design described herein as "exemplary" or "for example" is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the words "exemplary" or "for example" is intended to present relevant concepts in a concrete fashion.
First, some concepts (e.g., Component, connection point) related to the embodiments of the present application will be described.
(1) Component (Component)
A component is an independent module consisting of certain business logic and related data. A component can perform a particular function. For example, a video playing component can perform the function of playing video, and a video decoding component can perform the function of decoding video. An APP may be composed of one or more components. A component may or may not have a user interface. For example, the video decoding component may not have a user interface; that is, the process of the electronic device decoding video through the video decoding component may not be presented in the user interface and is not perceived by the user. An individual component may run independently in a user device; for example, a video playing component may be installed in a TV, and the TV then plays video through that component.
Each component includes a component body and a number of connection points (input connection points or output connection points). One component enables the transfer of data with other components through connection points. Generally, a component includes at least one input connection point or output connection point. It will be appreciated that there may be components that do not have connection points, i.e., there are no input connection points and no output connection points for such components. In the embodiments of the present application, the components that may be used to form the APP have at least one or more connection points.
Illustratively, the composition of one component may be as shown in fig. 1A. Fig. 1A shows a graphical example of a video decoding component. The video decoding component may comprise a component body (e.g., decode video 10a in fig. 1A), one input connection point (e.g., input connection point 10b in fig. 1A), and two output connection points (e.g., output connection point 10c and output connection point 10d in fig. 1A).
(2) Connection point (ConnectPoint)
A connection point is a component's external intelligent interface, that is, an agent that implements input or output functions and is responsible for interface protocol checking, negotiated docking with other connection points, data transmission, and the like. A connection point is the way a component receives input data from another component and also the way a component outputs data to another component. Connection points are either input connection points or output connection points. In general, each connection point may support one or more types of data entity (Entity). The data entities supported by a connection point may be images, audio, video, text, and so on.
(3) Data Entity (Entity)
A data entity is the data that a connection point can support. The data entity type is the type of the data entity, and data entities come in various types, such as audio, video, image, and text. Table 1 illustrates some types of data entities. Video may be classified according to different video coding standards. For example, video coding standards may include the Moving Picture Experts Group (MPEG) compression coding standards and the High Efficiency Video Coding (HEVC) standard. The MPEG compression coding standards may include several coding standards such as MPEG, MPEG2, and MPEG4. HEVC includes the coding standard H.265. Audio files may be in a variety of formats, for example the MPEG Audio Layer III (MP3) format, the Waveform Audio (WAVE) format, and the Free Lossless Audio Codec (FLAC) format. The types of data entities may also include text, pictures, compressed files, documents, streams, and data sets. A stream may be a video stream, an audio stream, or a composite stream; a composite stream may include both a video stream and an audio stream. A data set may include database tables. Each type may also include sub-types, formats, or other attributes. For example, a video-type data entity may have sub-types such as MPEG, MPEG2, and MPEG4, and may further include attributes such as width, height, subtitle, and language. For example, the data entity type supported by input connection point 10b in fig. 1A may be video in MPEG format with a 1920 × 1080 resolution and Chinese language, and so on. For details, refer to Table 1, which is not repeated here.
TABLE 1
(Table 1 is reproduced as images in the original publication and is not available in this text. It lists data entity types such as video, audio, text, picture, compressed file, document, stream, and data set, together with their sub-types, formats, and other attributes.)
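Expanding on the earlier sketch, and as an illustration only (not from the patent), the data entity, connection point, and component concepts above might be modeled as follows; all class and field names are assumptions:

```java
import java.util.List;
import java.util.Map;
import java.util.Set;

// Hypothetical data entity: a type (e.g. "video"), a sub-type
// (e.g. "MPEG4"), and attributes such as width, height, or language.
class Entity {
    String type;
    String subType;
    Map<String, String> attributes;
}

// A connection point is a component's input or output interface and
// supports one or more data entity types.
class ConnectPoint {
    enum Direction { INPUT, OUTPUT }
    Direction direction;
    Set<Entity> supportedEntities;
}

// A component consists of a component body (its business logic and
// data) and one or more connection points.
class Component {
    String id;
    String name;
    List<ConnectPoint> connectPoints;
}
```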
(4) Composite component
A composite component is composed of multiple components and can complete multiple functions. For example, as shown in fig. 1B, composite component 100A may be composed of a video decoding component and a video playing component. The video decoding component comprises a body (decode video 10a), an input connection point (input connection point 10b), and two output connection points (output connection point 10c and output connection point 10d). The video playing component comprises a body (play video 10e) and an input connection point (input connection point 10f). Output connection point 10d of the video decoding component is connected to input connection point 10f of the video playing component. Composite component 100A can implement the video decoding and video playing functions.
In order to address the time cost of developing an APP, in one implementation a developer can combine existing shortcut instructions or custom shortcut instructions into a new APP through a "shortcut instruction" application on the electronic device. That is, the developer can combine multiple operations from multiple applications in the terminal through the "shortcut instruction" application to create an APP. For example, application A in the electronic device has a photographing operation, and application B in the electronic device has an operation of converting a photo into a PDF document with one tap. A developer can serially combine the photographing operation in application A and the photo-to-PDF operation in application B into a new APP through a "shortcut instruction". In this way, a developer can develop a new APP relatively quickly.
However, the method proposed in this implementation can only combine some simple operations in the electronic device. When a developer needs to develop an APP with more functions and more complex business logic, this method can hardly meet the developer's requirements.
In view of the problems in the above implementation, this application provides an APP development platform comprising a component toolbox, a component orchestration designer, and a code generation engine. Based on the APP development platform, the application provides an APP development method, which comprises the following steps: the electronic device, in which the APP development platform is installed, detects an operation of the user selecting a component from the component toolbox. In response to this operation, the component orchestration designer creates the component and displays its composition; a component may consist of a component body and one or more connection points. The electronic device detects an operation of the user connecting a first component and a second component in the interface of the component orchestration designer; in response to this operation, the component orchestration designer determines whether the data entity types supported by the output connection point of the first component match the data entity types supported by the input connection point of the second component. If they match, the component orchestration designer may display an indication that the connection succeeded. The electronic device then detects an operation of the user selecting to compile the connected components in the component orchestration designer; in response, the code generation engine compiles the connected components displayed in the component orchestration designer into executable code of the APP. In this way, developers can quickly combine multiple existing components into the APP to be developed, realizing the APP's functional logic without writing program code line by line.
In this embodiment, a developer may also be referred to as a user, and the user may develop an APP or a component in the electronic device provided in this application.
First, the APP development platform provided by the embodiments of the present application is introduced. As shown in fig. 2, fig. 2 shows an architecture diagram of an APP development platform 10 provided by an embodiment of the present application. The APP development platform 10 comprises: a component toolbox 101, a component orchestration designer 102, and a code generation engine 103.
The component toolbox 101 is used to present components. The components in the component toolbox 101 may be categorized by function. The user may download components from the component market and save them in the component toolbox 101. The user may also upload components that the user has developed and designed to the component market.
Illustratively, the component toolbox 101 may be as shown in fig. 3, which shows the component toolbox 101 provided by an embodiment of the present application. The component toolbox 101 may be displayed in a display area 1000 of a user interface (not shown). Display area 1000 may include a control 1001, a control 1002, a control 1003, and a control 1004. The user may search for components in the component toolbox 101 through control 1001. Controls 1002 and 1003 are used to expand or collapse a certain class of components; for example, control 1002 in fig. 3 is used to expand or collapse the components of the common component class, and control 1003 is used to expand or collapse the components of the video and audio playing class. Fig. 3 exemplarily shows the component toolbox 101 containing components of the common component class and components of the video and audio playing class. It is understood that the components in the component toolbox 101 are not limited to these two classes; other classes of components, such as document processing components and picture processing components, may also be included. It is understood that the embodiments of the present application do not limit the specific user interface of the component toolbox 101, which may have more or fewer controls than in fig. 3.
In some embodiments, the user may classify components according to their own usage habits; for example, the user may add the play video component to the category of commonly used components, or group the flight booking component, the hotel booking component, and the takeout ordering component into a booking-and-payment category, where the specific category name can be defined by the user. It will be appreciated that the component categories in the component list of the component toolbox 101 may differ between users, for example the common component class (e.g., send SMS component, make call component) and the video and audio playing class (e.g., play video component, decode audio component) shown in fig. 3. The embodiments of the present application place no limitation on how components are classified. The embodiments below are described taking classification of components by function as an example.
The component orchestration designer 102 is a core tool for component orchestration and development, and specifically, a developer can select, lay out, and connect a plurality of components, select a connection point data entity type, set business logic, form a new composite component, and the like in the orchestration designer. The component orchestration designer may be specifically configured to:
Displaying created components: for example, in response to the user dragging a component from the component toolbox 101 into the component orchestration designer 102, the component orchestration designer 102 may present the component to the user. Specifically, a developer selects a component in the component toolbox 101, and the component orchestration designer 102 reads the component's file, obtains the component's composition from that file, and draws the component for presentation in the component orchestration designer 102.
Connection point docking verification: verifying whether the data entity types of the connection points of two components are consistent;
Connecting components: two or more components are connected according to user operations. Illustratively, a developer may connect two or more components in the component orchestration designer 102 according to the business logic of the desired APP or the business logic appropriate to the components. As two components are connected, the component orchestration designer 102 also verifies that the data entity types supported by the connection points of the two components are consistent. For example, when the output connection point of the video playing component and the input connection point of the video decoding component support the same data entity type, the connection points of the two components may be connected;
Business logic orchestration: the developer can add logic judgments or loops at a connection point according to the data entity types supported by the component's connection points;
Intelligent orchestration: the component orchestration designer 102 may automatically associate and recommend, based on a component's function and the data entity types supported by its connection points, other components that can dock with the current component, for the developer's selection; or it may automatically generate a docking orchestration model according to an orchestration strategy. Illustratively, the component orchestration designer 102 may, upon a user operation, display all components that can connect to a connection point of the first component, and the user may select the desired component among those displayed.
Support for viewing connection point attributes: the component orchestration designer 102 may display the data types supported by a connection point of a component in response to a user operation to view the data types supported by the connection point.
Saving a component orchestration model diagram: illustratively, the component orchestration designer 102 may also be used to save a model diagram in which the orchestration of multiple components is complete, together with all information in the model diagram, for example the IDs and names of all components in the model diagram, the data entity types of all connection points, and the connection attributes of each connection (including the data entity types supported by the two connected connection points, and the data transmission mode: direct transmission, or conditional transmission that occurs only when a judgment condition is met). A sketch of such saved information is given below.
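As an illustrative sketch only, the saved orchestration model information might take a shape like the following; every name here is an assumption, not the patent's format:

```java
import java.util.List;

// Hypothetical serializable form of an orchestration model diagram:
// the components placed in the designer plus the connections between
// their connection points.
class OrchestrationModel {
    List<ComponentInfo> components;
    List<Connection> connections;

    static class ComponentInfo {
        String id;                            // component ID
        String name;                          // component name
        List<String> connectPointEntityTypes; // entity types per connection point
    }

    static class Connection {
        String fromConnectPointId; // output connection point
        String toConnectPointId;   // input connection point
        String entityType;         // data entity type agreed by both ends
        boolean conditional;       // direct transmission vs. judged transmission
        String condition;          // judgment condition when conditional
    }
}
```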
The code generation engine 103 is used to generate executable source code from the orchestrated composite components and the APP. Specifically, the code generation engine 103 generates executable source code according to the completed orchestration model diagram saved in the component orchestration designer 102, all information in the model diagram (for example, the IDs and names of all components in the model diagram, the data entity types of all connection points, and the connection attributes of the connections), and a component invocation template. The component invocation template is a code template preset according to different types of components and the attributes of their connection points, and encapsulates generic interface invocation code logic. Illustratively, the component invocation template may include the following program code:
(The component invocation template code listing is reproduced as images in the original publication and is not available in this text.)
Everything in the code above is fixed template text except the content within angle brackets (< >). The code generation engine replaces the < > content according to the specific APP being generated. For example, "ComposePlayer" in <ComposePlayer> may be replaced with the name of the specific APP (e.g., "123 player"). "videoplug" in <videoplug> may be replaced with the name of the component actually needed to generate the APP. "ConnectPoint1" in <ConnectPoint1> may be replaced with the actual connection point. "MPEG" and "640_480" in <Entity.MPEG> and <Entity.640_480> may be replaced with actual data entity types.
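Since the template itself is not reproduced here, the following is only a guess at what the engine's output might look like after substitution; the stub classes, method names, and substituted values (Player123, VideoPlugin) are all hypothetical:

```java
// Minimal stubs standing in for the assumed component framework.
class ConnectPoint {
    void setEntityType(String subType, String format) { /* negotiate entity type */ }
}

class VideoPlugin {
    ConnectPoint getConnectPoint(String name) { return new ConnectPoint(); }
    void start() { /* run the component's business logic */ }
}

// Hypothetical generated code after substituting the placeholders:
// <ComposePlayer> -> Player123, <videoplug> -> VideoPlugin,
// <ConnectPoint1> -> "ConnectPoint1",
// <Entity.MPEG>/<Entity.640_480> -> "MPEG"/"640_480".
public class Player123 {
    public static void main(String[] args) {
        VideoPlugin component = new VideoPlugin();
        ConnectPoint point = component.getConnectPoint("ConnectPoint1");
        point.setEntityType("MPEG", "640_480");
        component.start();
    }
}
```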
It can be understood that the APP development platform in the present application may be used as a standalone tool for developing APPs, or may be a module in a larger development tool, which is not limited here.
The following describes, with reference to the accompanying drawings, the process of developing an APP using the APP development platform provided by the embodiments of the present application.
Figs. 4-8D exemplarily show the process of developing a video APP. The embodiments of the present application are described taking a computer as the electronic device.
As shown in fig. 4, fig. 4 illustrates a user interface 400 used for the user to create an APP. User interface 400 may include input box 401, input box 402, input box 403, input box 405, and controls 404, 406, 407, 408, 409, and 410. Input box 401 is used to input the name of the project created by the user, i.e., the APP name, such as "MediaPlayerAPP". Input box 402 is used for the user to input the package name of the created APP, such as "com. The package name is the unique identifier of the APP and is mainly used by the system to identify the APP. Input box 403 is used for the user to select or input the storage location of the created APP; the user may enter the storage location directly in input box 403, such as "D:\test\MediaPlayerAPP", or select it via control 404 in input box 403. Input box 405 is used for the user to select or directly input the lowest API version supported by the created APP; the user may enter the lowest supported API version directly in input box 405, e.g., "xxsdk:1.0.0", or select it via control 406. Control 407 is used to guide the user on how to operate in user interface 400. Control 408 is used to cancel the project creation. Control 409 is used to return to the previous operation. Control 410 is used to save the created project and refresh user interface 400. When the user has filled in input boxes 401, 402, 403, and 405, the user clicks control 410.
Illustratively, in response to the user clicking control 410, user interface 50A is displayed as shown in fig. 5A. User interface 50A may include a display area 501 and a display area 502. Display area 501 is used to display the component toolbox 101. When the user selects a component (e.g., the decompose video component) in the component toolbox 101 and drags it into display area 502, display area 502 may display a graphic of the component. Here, display area 502 may be an interface of the component orchestration designer 102, which may be used to display graphics of components.
In one possible implementation, when the electronic device detects a user operation of dragging the decompose video component from the component toolbox 101, in response to the user operation, the component orchestration designer 102 may create the decompose video component and draw, in display area 502, its component body (e.g., decompose video 503 in user interface 50A) and its connection points (e.g., input connection point 504, output connection point 505, and output connection point 506 in user interface 50A).
The user can view the data entity types supported by a component's connection point. The electronic device may detect an operation of the user viewing the connection point and display the data entity types supported by the component's connection point. The operation may take various forms, such as double-clicking the connection point, hovering the mouse cursor over the connection point for 2 s, or right-clicking the connection point, which is not limited here. As shown in fig. 5B, the user can right-click output connection point 506; the electronic device detects the user operation and displays a user interface 50C such as that shown in fig. 5C, in which a view control 507 and an intelligent orchestration control 508 may be displayed. View control 507 is used to display the data entity types supported by output connection point 506. Intelligent orchestration control 508 may be used for intelligent orchestration, displaying components that can connect with output connection point 506. Illustratively, the user may click view control 507.
Illustratively, in response to the user clicking view control 507, a user interface 50D as shown in fig. 5D is displayed, in which a data entity attribute table 509 for output connection point 506 may be shown. Data entity attribute table 509 may show that the data entity supported by output connection point 506 is video, that the sub-types of the video may be MPEG and MPEG4, that the formats of the video may be 640 × 480 and 1920 × 1080, and so on.
In one possible implementation, the user may also view the data entity types supported by each connection point of the decompose video component by clicking the component body, decompose video 503, in user interface 50D. The operation for viewing the data entity types supported by all connection points of the decompose video component may be a double-click, hovering the mouse cursor for 2 s, a right-click, or the like, which is not limited here. Illustratively, in response to the user clicking the component body decompose video 503, the electronic device may display a user interface 50E as shown in fig. 5E. An attribute table 510 of the decompose video component may be presented in user interface 50E, showing the attributes of all connection points of the component. Connection point 1 in attribute table 510 may be input connection point 504, connection point 2 may be output connection point 505, and connection point 3 may be output connection point 506. Connection point 1 may contain two data entities, namely data entity 0 and data entity 1. Control 511 is used to hide the data entities of the connection point; that is, when the user clicks control 511, data entity 0 and data entity 1 shown in attribute table 510 are hidden. Control 512 is used to hide the attributes of data entity 0, such as type (category), sub-type (sub-category), width, height, bit rate, and so on. Control 513 is used to expand and display the attributes of data entity 1. Control 514 is used to expand and display the data entities supported by connection point 2. Control 515 is used to expand and display the data entities supported by connection point 3.
When a connection point supports multiple types of data entities, the user may select one data entity to be the data entity supported by that connection point. For example, in user interface 50F shown in fig. 5F, output connection point 506 supports multiple types of data entities; specifically, the supported data entities may be video data of sub-type MPEG in a 640 × 480 or 1920 × 1080 format and video data of sub-type MPEG4 in a 640 × 480 or 1920 × 1080 format. The user may set MPEG4 in data entity attribute table 509 as the sub-type that output connection point 506 can support. The electronic device may detect the user operation of setting the supported data entity attribute of the connection point. This operation may take various forms; for example, the user may double-click the sub-type MPEG4 in data entity attribute table 509. The operation is not limited here. In response to the user operation, the electronic device may display a prompt box 516, which prompts the user that the data entity has been set. The prompt content of prompt box 516 may be the text "you have set the subtype of the connection point to MPEG4" shown in fig. 5F; the specific prompt content of prompt box 516 is not limited here.
In another possible implementation, the user may set the data entities supported by a connection point in attribute table 510 shown in fig. 5E. For example, the user may double-click data entity 0 in attribute table 510, and in response the component orchestration designer 102 sets data entity 0 to be the only data entity supported by connection point 1.
The user may in turn set the sub-type and format of the data entities (e.g., video) supported by output connection point 506. For example, the user may set the sub-type of the video to MPEG4 and the format to 1920 × 1080. The user may then click view control 507 again to view the attributes of the data entities supported by output connection point 506. As shown in fig. 5G, user interface 50G displays an updated data entity attribute table 511, in which the sub-type and format are both user-defined, i.e., the sub-type is MPEG4 and the format is 1920 × 1080.
Similarly, the user may add other components from the component toolbox 101 to the component orchestration designer 102 as described above. Illustratively, the user may drag the play video component from the component toolbox 101 into the component orchestration designer 102. The electronic device may detect the operation of the user dragging the play video component, in response to which the component orchestration designer 102 may create the play video component, draw its component body (e.g., play video 601 shown in user interface 60A) and connection point (e.g., input connection point 602 shown in user interface 60A), and display them in the interface of the component orchestration designer 102 (e.g., user interface 60A).
As shown in fig. 6A, the user may hide the component toolbox, and the user may connect two components in the interface of the component orchestration designer 102, as in user interface 60A shown in fig. 6A. The electronic device can detect that the user connects the decompose video component with the play video component. The operation of connecting two components may take various forms; for example, the user may drag output connection point 506 to input connection point 602, or drag input connection point 602 to output connection point 506, which is not limited here. In response to the user operation, the component orchestration designer 102 obtains the data entity types supported by output connection point 506 and input connection point 602 and determines whether they match. Here, matching means that the data entity types supported by output connection point 506 are the same as those supported by input connection point 602, or that at least one data entity type supported by output connection point 506 is the same as a data entity type supported by input connection point 602. If they match, an indication of successful connection may be displayed in the interface of the component orchestration designer 102. If not, the user may be prompted about the mismatch in the interface of the component orchestration designer 102 (not shown). The mismatch of the connection points of the two components (e.g., output connection point 506 and input connection point 602) may be prompted in various ways, for example by displaying the words "no connection", "matching failed", "connection failed", or "inconsistent data entity types" in the user interface, which is not limited here. In such an implementation, even if a component's input or output connection point supports multiple data types, the orchestration designer may connect the input connection point and the output connection point on the basis that both connection points support the same data entity type.
For example, in another possible implementation, when a connection point supports multiple entity types, the user may set the data entity type supported by output connection point 506 and the data entity type supported by input connection point 602 to be the same type and then connect the two. At this point, the component orchestration designer 102 may connect output connection point 506 with input connection point 602, since they support the same type of data entity.
The user interface after the connection of the two components is complete may be as shown in fig. 6A. Fig. 6A illustrates a user interface 60A provided by an embodiment of the present application, in which output connection point 506 of the decompose video component and input connection point 602 of the play video component are successfully connected. The indication of successful connection displayed in the interface of the component orchestration designer 102 may take various forms, such as the connection line 603 between output connection point 506 and input connection point 602 shown in user interface 60A, which is not limited here.
For example, in one possible approach, the user interface after the connection of the two components is complete may also be as shown in fig. 6B. Fig. 6B shows user interface 60B, in which output connection point 506 of the decompose video component is successfully connected to input connection point 602 of the play video component: output connection point 506 and input connection point 602 are folded or overlapped together, indicating that they are successfully connected.
After the user has selected all components required by the APP being created, in response to the user's connection operations, the component orchestration designer 102 may connect the components; for the specific connection process, refer to the above description of connecting the decompose video component and the play video component, which is not repeated here. The user interface after the connections are complete may be as shown in fig. 7A. Fig. 7A illustrates a user interface 700 provided by the present application, showing all components that make up the user-created APP (i.e., the video playback APP). The components of the video playback APP may include a read video file component 702, a decompose video component 705, a speech translation component 710, a high definition enhancement component 711, an audio playback component 716, and a video playback component 717. The read video file component may include an input connection point 701 and an output connection point 703. Decompose video component 705 may include input connection point 704, output connection point 706, and output connection point 707. Speech translation component 710 may include an input connection point 709 and an output connection point 712. High definition enhancement component 711 may include an input connection point 708 and an output connection point 713. Audio playback component 716 may include an input connection point 714. Video playback component 717 may include an input connection point 715. Output connection point 703 may be connected to input connection point 704. Output connection point 707 may be connected to input connection point 709 or input connection point 714. Output connection point 706 may be connected to input connection point 708 or input connection point 715. Output connection point 712 may be connected to input connection point 714. Output connection point 713 may be connected to input connection point 715.
In an embodiment, the user may set properties on the connection lines between connection points. The properties of a connection line may include a logic control property (conditional/branch/loop) or a data transfer property (input/output of an Entity). In fig. 7A, a judgment condition based on the language attribute can be set on output connection point 707 of the decomposed video component 705: if the language attribute is consistent with the local language, the speech translation component 710 can be skipped and output connection point 707 connected directly to input connection point 714 of the audio playback component. That is, the user can set a judgment condition for connection point 707, the judgment condition being whether the language supported by connection point 707 is consistent with the local language of the electronic device. If so, the output of the decomposed video component is input to input connection point 714 of the audio playback component via output connection point 707. If not, the output of the decomposed video component is input to input connection point 709 of the speech translation component 710 via output connection point 707.
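As a hedged Java sketch of this branch logic (Entity, route, and the branch labels are assumptions for illustration, not the platform's code), the routing decision on connection point 707 might look like this:

// Illustrative sketch of the logic-control property on output connection point 707.
final class LanguageBranchRouting {
    record Entity(String media, String language) {}

    // Judgment condition: does the language of connection point 707 include the local language?
    static String route(Entity entity, String localLanguage) {
        if (entity.language().contains(localLanguage)) {
            return "connect-branch1: 707 -> 714 (audio playback component)";
        }
        return "connect-branch2: 707 -> 709 (speech translation component 710)";
    }

    public static void main(String[] args) {
        System.out.println(route(new Entity("audio", "Chinese"), "Chinese"));
        System.out.println(route(new Entity("audio", "English"), "Chinese"));
    }
}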
Illustratively, the user may click on the connection point at the left end of a connection line to set the connection line properties, such as the connection line between connection point 707 and connection point 714 and the connection line between connection point 707 and connection point 709 shown in fig. 7A. The user may click on connection point 707 to set the control properties of the connection lines between connection point 707 and connection point 714 and between connection point 707 and connection point 709. The interface for setting the control properties of the connection lines may be as shown in user interface 70B in fig. 7B. The user interface 70B may include a condition setting box 720, a condition setting box 722, a condition setting box 724, a condition judgment result selection box 725, a control 721, a control 723, a control 726, a control 727, a control 728, and a control 729. The user interface 70B may further include a condition setting box 730, a condition setting box 732, a condition setting box 734, a condition judgment result selection box 735, a control 731, a control 733, a control 736, and a control 737.
The condition setting box 720 is used to set a judgment subject, such as language. The control 721 is used to select the judgment subject. The condition setting box 722 is used to set the relationship between the judgment subject and the comparison subject; for example, the judgment subject may contain the comparison subject. The control 723 is used to select the relationship between the judgment subject and the comparison subject; for example, the relationship may be inclusive or exclusive, which is not limited herein. The condition setting box 724 is used to set a comparison subject, such as the local language. The condition setting box 720, the condition setting box 722, and the condition setting box 724 together complete the judgment condition setting. For example, judgment condition 1 shown in user interface 70B is "the language of connection point 707 contains the local language". The condition judgment result selection box 725 is used to set a condition judgment result. For example, if the condition "the language of connection point 707 contains the local language" is satisfied, the data entity of connection point 707 is transmitted to connection point 714 via the connection line between connection point 707 and connection point 714. The control 726 is used to select the condition judgment result. The control 727 is used to add a condition. The control 728 is used to confirm the condition settings. The control 729 is used to cancel the condition settings. The control 737 is used to return to the previous level of the interface. Similarly, the condition setting box 730 is used to set a judgment subject, such as language. The control 731 is used to select the judgment subject. The condition setting box 732 is used to set the relationship between the judgment subject and the comparison subject. The control 733 is used to select the relationship between the judgment subject and the comparison subject. The condition setting box 734 is used to set a comparison subject, such as the local language. The condition setting box 730, the condition setting box 732, and the condition setting box 734 together complete the judgment condition setting. For example, judgment condition 2 shown in user interface 70B is "the language of connection point 707 does not contain the local language". The condition judgment result selection box 735 is used to set a condition judgment result. For example, if the condition "the language of connection point 707 does not contain the local language" is satisfied, the data entity of connection point 707 is transmitted to connection point 709 via the connection line between connection point 707 and connection point 709. The control 736 is used to select the condition judgment result.
After all the components in fig. 7A are connected, the electronic device may detect an operation by the user to compile the connected components into executable code. This user operation may take many forms, such as the user clicking the compilation control 718 of user interface 700 in fig. 7A, and is not limited herein. In response to the user operation, code generation engine 103 compiles the connected components into executable program code for the APP created by the user.
Specifically, the code generation engine 103 can acquire the IDs and names of the components whose orchestration has been completed in component orchestration designer 102, together with their connection points and the connection attributes between connection points. For example, for output connection point 506 of the decomposed video component and input connection point 602 of the video playback component shown in fig. 6A, the data entity type transmitted between the two connection points may be video data with file type MPEG4 and resolution 1920 × 1080. Data is passed directly between these two connection points, without judgment or loop logic. Then, the code generation engine 103 generates executable code according to the acquired component information, the connection attributes between the connection points, and the corresponding component calling template. The component calling template is a code template preset according to the different types of components and the attributes of their connection points, and encapsulates generic interface-calling code logic. It will be appreciated that the component calling templates for the video class and the audio class may be different. Likewise, two connection points may transmit data directly, or may transmit data only when a judgment condition is met; the component calling templates corresponding to these different data transfer modes can also be different. Reference may be made to the above description of the component calling template; the specific code in the component calling template is not limited herein.
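A minimal sketch of the template idea in Java, assuming an invented ${...} placeholder syntax and invented names; the actual component calling templates of code generation engine 103 are not reproduced here:

import java.util.Map;

// Hypothetical sketch: a component calling template is a preset code skeleton
// whose placeholders are filled from the acquired orchestration information.
final class TemplateCodeGen {
    // Assumed template for two connection points that transfer data directly
    // (no judgment or loop logic on the connection line).
    static final String DIRECT_TRANSFER_TEMPLATE =
            "${src}.connpoint(\"${srcPoint}\").connect(${dst}.connpoint(\"${dstPoint}\")); // entity: ${entity}";

    static String generate(String template, Map<String, String> info) {
        String code = template;
        for (Map.Entry<String, String> e : info.entrySet()) {
            code = code.replace("${" + e.getKey() + "}", e.getValue());
        }
        return code;
    }

    public static void main(String[] args) {
        // Orchestration information for the example of fig. 6A.
        System.out.println(generate(DIRECT_TRANSFER_TEMPLATE, Map.of(
                "src", "decomposeVideo", "srcPoint", "506",
                "dst", "playVideo", "dstPoint", "602",
                "entity", "video, MPEG4, 1920x1080")));
    }
}

A template that wraps the transfer in a generated judgment condition would be selected instead when the connection line carries a logic control property.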
In some examples, the electronic device receives a user operation to display a component call tree of a first component; in response to this operation, component orchestration designer 102 starts the intelligent orchestration function. Based on the intelligent orchestration function, the component call tree of the first component may be displayed in the interface of component orchestration designer 102. The component call tree is used to show all second components and/or third components matching the first component, fourth components and/or fifth components matching the second component, and so on down to the Nth component matching the Mth component, where the Nth component is a component without output connection points, and M and N are positive integers. Specifically, component orchestration designer 102 finds the matching second and/or third components based on the data entity types supported by the output connection points of the first component; that is, the data entity types supported by the input connection points of the second and third components match the data entity types supported by the output connection points of the first component. Then, component orchestration designer 102 searches for a fourth component and/or a fifth component matching the second component according to the data entity types supported by the output connection points of the second component. The data entity types supported by the input connection points of the fourth and fifth components match the data entity types supported by the output connection points of the second component. Component orchestration designer 102 finds the matching components for each component in the component call tree until the last level of components in the component call tree have no output connection points.
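The recursive expansion just described might be sketched in Java as follows, under stated assumptions: Component, Node, and the subset-based matching rule are illustrative stand-ins, and the toolbox is assumed to contain no cyclic matches.

import java.util.ArrayList;
import java.util.List;
import java.util.Set;

// Illustrative sketch of building a component call tree by type matching.
final class CallTreeSketch {
    record Component(String name, Set<String> inputTypes, Set<String> outputTypes) {
        // A candidate matches when its input connection point supports (includes)
        // the types produced by this component's output connection point.
        boolean matches(Component candidate) {
            return candidate.inputTypes().containsAll(outputTypes());
        }
    }

    record Node(Component component, List<Node> children) {}

    // Expand recursively until components without output connection points
    // (the last level of the call tree) are reached.
    static Node buildTree(Component root, List<Component> toolbox) {
        List<Node> children = new ArrayList<>();
        if (!root.outputTypes().isEmpty()) {
            for (Component candidate : toolbox) {
                if (root.matches(candidate)) {
                    children.add(buildTree(candidate, toolbox));
                }
            }
        }
        return new Node(root, children);
    }

    public static void main(String[] args) {
        Component playVideo = new Component("play video", Set.of("video/MPEG4"), Set.of());
        Component decomposeVideo = new Component("decompose video", Set.of("video/MPEG4"), Set.of("video/MPEG4"));
        System.out.println(buildTree(decomposeVideo, List.of(playVideo)));
    }
}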
The operation for displaying the component call tree of the first component may take various forms. For example, the user may click a control for displaying the component call tree in the user interface, or the user may set, in a page, a condition for displaying the component call tree, such as that the components in the component call tree implement one or more functions (e.g., a video decoding function, a video playing function, etc.); this is not limited herein.
In response to the user operation, the electronic device displays a user interface 80A as shown in fig. 8A; the component call tree for component 1 is displayed in user interface 80A. Component 1 can be connected to component 2 and component 3, i.e., component 1 matches component 2 and component 1 matches component 3. Component 2 can be connected to component 4 and component 5, i.e., component 2 matches component 4 and component 2 matches component 5. Component 3 can be connected to component m, i.e., component 3 matches component m. Component 4, component 5, and component m can be connected to component n, i.e., component 4 matches component n, component 5 matches component n, and component m matches component n. The user can select some of the components in the component call tree, for example component 2, component 4, and component n, and then remove or delete the other, unneeded components. For example, the user may double-click a component with the right mouse button to delete it; the manner of removal or deletion is not limited herein.
TABLE 2
(Table 2 appears as an image in the original publication; it lists the data entity types supported by the connection points of component 1, component 2, component 3, and component m, as described below.)
Table 2 exemplarily shows the data entity types supported by the connection points of component 1, component 2, component 3, and component m. In Table 2, "MPEG4, 1920 × 1080, Chinese" indicates that the data entity supported by the connection point is video data in MPEG4 format, the resolution of the video data is 1920 × 1080, and the language supported by the video data is Chinese. "MP3, 112800, Chinese" indicates that the data entity supported by the connection point is audio data in MP3 format, the sampling rate of the audio data is 112800 Hz, and the supported language is Chinese. As can be seen from Table 2, the data entity types supported by the input connection point of component 2 include the data entity types supported by the output connection point of component 1. Thus, the output connection point of component 1 can be connected to the input connection point of component 2, and component 1 and component 2 are matched. The data entity types supported by the input connection point of component 3 include the data entity types supported by the output connection point of component 1. Thus, the output connection point of component 1 can be connected to the input connection point of component 3, and component 1 and component 3 are matched. The data entity types supported by the input connection point of component 4 include the data entity types supported by output connection point 2 of component 2. Thus, the input connection point of component 4 can be connected to output connection point 2 of component 2, and component 2 and component 4 are matched. The data entity types supported by the input connection point of component 5 include the data entity types supported by output connection point 2 of component 2. Thus, the input connection point of component 5 can be connected to output connection point 2 of component 2, and component 2 and component 5 are matched. The data entity types supported by the input connection point of component m include the data entity types supported by the output connection point of component 3. Thus, the output connection point of component 3 can be connected to the input connection point of component m, and component 3 and component m are matched. The data entity types supported by the input connection point of component n include the data entity types supported by the output connection point of component 4, the output connection point of component 5, and the output connection point of component m. Thus, the input connection point of component n can be connected to the output connection point of component 4, the output connection point of component 5, and the output connection point of component m. Component n matches component 4, component 5, and component m.
It is to be understood that the data entity types supported by the connection points of the components shown in Table 2 are merely examples. The connection points of the components shown in fig. 8A may support any of the data entity types shown in Table 1. The embodiment of the present application does not limit the data entity types that the connection points of the components in fig. 8A can support.
Illustratively, as shown in fig. 8B, user interface 80B may display the decomposed video component, which may include a body (decomposed video 503), an input connection point 504, an output connection point 505, and an output connection point 506. Suppose the user does not know which components in the component toolkit 101 can be connected to output connection point 506 of the decomposed video component. The user may click the intelligent orchestration control 508 in user interface 80B.
In response to the user operation, the electronic device displays a user interface 80C as shown in fig. 8C. The component call tree of the decomposed video component is displayed in user interface 80C. As shown in fig. 8C, the component call tree of the decomposed video component can include a video playback component, which includes a body (play video 717) and an input connection point 715. When the user clicks the intelligent orchestration control 508, component orchestration designer 102 may look for a component (e.g., the video playback component) in the component toolkit 101 that matches output connection point 506, based on the data entity type of output connection point 506, and display it in an interface (e.g., user interface 80C) of component orchestration designer 102.
In one possible implementation, in response to the user operation, the electronic device may instead display a user interface 80D as shown in fig. 8D. The component call tree of the decomposed video component is displayed in user interface 80D. As shown in fig. 8D, the component call tree of the decomposed video component may include a high definition enhancement component and a video playback component. The high definition enhancement component includes a component body (high definition enhancement 711), an input connection point 708, and an output connection point 713. The video playback component includes a component body (play video 717) and an input connection point 715. Output connection point 506 may be connected to input connection point 708. Output connection point 713 may be connected to input connection point 715.
In one possible implementation, the user may select input connection point 504 of the decomposed video component for intelligent orchestration. For example, the user may right-click input connection point 504 and select the intelligent orchestration control; in response to this user operation, the components that may be connected to input connection point 504 are displayed in component orchestration designer 102, for example the read video file component 702 shown in fig. 7A.
In this way, when the user does not know which components can be connected to a given component, the matching components can be found quickly, saving the user's time.
An APP development method provided by the embodiment of the present application is described below with reference to the drawings. Fig. 9 is a schematic flow diagram of an APP development method according to an embodiment of the present application. Referring to fig. 9, an APP development method provided in an embodiment of the present application specifically includes:
S101: In response to a user operation to create a first APP, the electronic device creates the first APP.
The electronic device may detect an operation by the user to create the first APP. This operation may take many forms. For example, the user opens a user interface in the electronic device that provides for the creation of an APP, and then creates the APP in that user interface. The user interface for creating an APP can be seen in user interface 400 shown in fig. 4. How a user creates an APP in the user interface is described above with reference to fig. 4 and is not repeated here.
S102: In response to an operation by which the user selects a component from the component toolbox of the electronic device, the electronic device displays the component in the component orchestration designer, where a component is an independent module implementing a specific function; a component is made up of a component body and one or more connection points, and a connection point supports one or more data entity types.
The user may select a component from the component toolbox through a variety of operations, such as dragging a component out of the component toolbox or double-clicking a component in the component toolbox with the mouse. The user operation is not limited here.
The connection points of a component support one or more data entity types. For the data entity types, reference may be made to the above description of Table 1, which is not repeated here. A developer may define the connection points of a component with program code when developing the component, for example defining the data entity types supported by the connection points of the component, as well as the functions contained in the connection points. Here, the developer may define the connection points of all components to support the same set of functions, such as a create function for creating data entities, a connect function for connecting to other connection points, an accept function for receiving data, a pull function for storing received data locally, and a push function for sending data.
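Purely as an illustration of this uniform function set, the connection-point contract might be sketched in Java as below; the signatures are assumptions for this example, not the platform's actual definitions.

// Minimal placeholder for a data entity; the real platform would carry richer
// metadata (media type, format, resolution, language, ...).
record DataEntity(String type, byte[] payload) {}

// Hypothetical sketch of the functions every connection point supports.
interface ConnectionPoint {
    DataEntity create(String type);     // create a data entity of a supported type
    void connect(ConnectionPoint peer); // connect to another connection point
    void accept(DataEntity entity);     // receive a data entity from the peer
    DataEntity pull();                  // store received data locally and hand it over
    void push(DataEntity entity);       // send a data entity to the peer
}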
In one possible implementation, in response to an operation by which the user selects components from the component toolbox of the electronic device, the electronic device displays graphics of the plurality of components. Specifically: in response to the operation, the electronic device acquires the first files of the plurality of components forming the first APP, parses the first files, and then draws and displays the graphics of the components, where a first file is program code describing the functions and attributes of a component. The user's operation of selecting components from the component toolbox is used to select a plurality of components; the graphic of a component is used to present the body and connection points of the component, a component comprising a body and one or more connection points, such as the graphic of the decomposed video component shown in fig. 5A. The operation of selecting a component from the component toolbox may be dragging the component from the component toolkit 101 into the component orchestration designer 102, or a single click or a double click on the selected component; this user operation is not limited in the embodiment of the application.
In one possible example of an embodiment of the present application, the electronic device may receive a user operation of dragging a component from the toolbox. In response to the user operation, the electronic device obtains the file of the component, parses it, and then draws the graphic of the component. Specifically, the electronic device may obtain the file of the component, parse it, and draw the graphic of the component through the component orchestration designer 102 in the electronic device. Reference may be made to the description of fig. 5A above, which is not repeated here.
Here, the user may select components according to the requirements of the APP being developed. For example, the video playback APP shown in fig. 4 to fig. 7A needs to be composed of a component for reading a video file, a component for decomposing a video file, a component for translating speech, a component for high definition enhancement, a component for playing audio, a component for playing video, and so on. After the user selects a component from the component toolkit 101, the electronic device obtains the first file of the component. The electronic device may parse the first file to obtain the number of connection points of the component, the connection point attributes, the data entity attributes that each connection point can transmit, and the like. The electronic device then draws the component as a visual component graphic according to the number of connection points obtained by parsing the first file, and the user can see the component graphic in the user interface. The component graphic of the decomposed video component is shown in fig. 5A.
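As a sketch of this parse-then-draw step, assume for illustration that the first file lists one connection point per line; this simplified text format is an invention of the example, since the real first file is program code.

import java.util.ArrayList;
import java.util.List;

// Illustrative sketch: parse a component's first file to obtain the number of
// connection points and their attributes, one graphic element per point.
final class ComponentFileParser {
    record ConnPoint(String direction, String entityType) {}

    static List<ConnPoint> parse(String firstFile) {
        List<ConnPoint> points = new ArrayList<>();
        for (String line : firstFile.split("\n")) {
            String[] parts = line.split(":", 2); // assumed format "input|output : entity type"
            if (parts.length == 2) {
                points.add(new ConnPoint(parts[0].trim(), parts[1].trim()));
            }
        }
        return points;
    }

    public static void main(String[] args) {
        String decomposeVideo = "input : video, MPEG4, 1920x1080\n"
                + "output : video, MPEG4, 1920x1080\n"
                + "output : audio, MP3, Chinese\n";
        parse(decomposeVideo).forEach(System.out::println);
    }
}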
In one possible implementation, the electronic device receives a user operation for displaying the component call tree of a first component and, in response, displays the component call tree of the first component. The component call tree is used to show all second components and/or third components matching the first component, fourth components and/or fifth components matching the second component, and so on down to the Nth component matching the Mth component, where the Nth component is a component without output connection points. Specifically, the electronic device finds the matching second component and/or third component according to the data entity types supported by the output connection points of the first component; that is, the data entity types supported by the input connection points of the second and third components match the data entity types supported by the output connection points of the first component. Then, the electronic device searches in turn for a fourth component and/or a fifth component matching the second component according to the data entity types supported by the output connection points of the second component. The data entity types supported by the input connection points of the fourth and fifth components match the data entity types supported by the output connection points of the second component. The electronic device finds the matching components for each component in the component call tree until the last layer of components in the component call tree have no output connection points. Specifically, reference may be made to the description of the call tree of component 1 in fig. 8A, which is not repeated here.
The user operation for displaying the component call tree can take many forms; illustratively, as shown in fig. 8C, the user can click the intelligent orchestration control 508. The embodiment of the application does not limit the user operation for displaying the component call tree. When the user does not know which component in the component toolkit 101 matches component 1, the user can select the output connection point of component 1 and then right-click to select intelligent orchestration. In response to the user clicking intelligent orchestration, the electronic device displays the component call tree of component 1. In this way, the user can quickly select, in the component call tree, the components needed to develop the APP, which saves the user's time and improves the user experience.
In a possible implementation, the electronic device displays the component call tree of the first component in the component orchestration designer, specifically including: the electronic device displays the component call tree of the first component in the component orchestration designer based on the functionality of the first component and/or the data entity types supported by the connection points of the first component.
In one possible implementation, in response to a user operation to delete a component, the electronic device deletes the component of the component call tree.
In one possible implementation, in response to an operation of a user uploading or downloading a component from a component market, the electronic device displays, in a component toolkit, a name of the component uploaded or downloaded by the user from the component market.
In one possible implementation, in response to a user operation to view first connection point properties of a first component, the electronic device displays, in the component orchestration designer, data entity types supported by the first connection point.
S103: In response to a user operation connecting a plurality of components, the electronic device connects two or more components in the component orchestration designer.
The electronic device can receive the user operation of connecting a plurality of components, and this operation can take various forms. For example, it may be the user dragging an output connection point of the second component toward an input connection point of the first component, the user sliding the second component in the direction of the first component, or the user inputting an output connection point of the first component and an input connection point of the second component. This user operation is not limited in the embodiment of the application. Reference may be made to the above description of fig. 6A, which is not repeated here.
In one possible implementation, the electronic device connecting two or more components in the component orchestration designer specifically includes: in response to an operation by the user connecting the first component and the second component, the electronic device verifies, through the component orchestration designer, whether the first component and the second component match; if the first component matches the second component, the electronic device connects the first connection point and the second connection point, where the first connection point is a connection point of the first component and the second connection point is a connection point of the second component.
In one possible implementation, in response to the user operation of connecting multiple components, the electronic device may perform the process of connecting the first component and the second component. First, the electronic device obtains the data entity types supported by the output connection point of the first component and the data entity types supported by the input connection point of the second component. When the electronic device determines that the data entity types output by the output connection point of the first component match the data entity types of the input connection point of the second component, the electronic device establishes the connection between the first component and the second component. That is, the first component may establish a connection with the second component when the data entity types supported by the output connection point of the first component are the same as the data entity types supported by the input connection point of the second component, or when the data entity types supported by the output connection point of the first component include the data entity types supported by the input connection point of the second component, or when the data entity types supported by the input connection point of the second component include the data entity types supported by the output connection point of the first component. Reference may be made to the above description of fig. 6A, which is not repeated here.
For example, as shown in Table 3, the output connection point of the first component supports video data with data entity type MPEG format and size 640 × 480. The input connection point of the second component supports video data with data entity type MPEG format and sizes 640 × 480, 1080P, and 2K.
TABLE 3
Connection point | Supported data entity types
Output connection point of the first component | video, MPEG, 640 × 480
Input connection point of the second component | video, MPEG, 640 × 480 / 1080P / 2K
In this case, the data entity types supported by the output connection point of the first component match the data entity types supported by the input connection point of the second component, and the electronic device establishes the connection between the first component and the second component. For the data entity types, reference may be made to Table 1.
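The three matching cases can be captured in a small Java sketch using the Table 3 values; matches and the type strings are illustrative assumptions, not the platform's actual code.

import java.util.Set;

// Illustrative sketch: two connection points match when their supported type
// sets are equal, or either set includes the other.
final class MatchCheck {
    static boolean matches(Set<String> outputTypes, Set<String> inputTypes) {
        return outputTypes.equals(inputTypes)
                || outputTypes.containsAll(inputTypes)
                || inputTypes.containsAll(outputTypes);
    }

    public static void main(String[] args) {
        Set<String> firstOutput = Set.of("video, MPEG, 640x480");
        Set<String> secondInput = Set.of("video, MPEG, 640x480", "video, MPEG, 1080P", "video, MPEG, 2K");
        System.out.println(matches(firstOutput, secondInput)); // true: input includes output
    }
}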
In one possible implementation, the electronic device may present a connection success identifier after determining that the two components can establish a connection. The connection success identifier may be a connection line (Connection) between the two components, for example as in user interface 60A shown in fig. 6A. After the decomposed video component and the video playback component shown in user interface 60A are connected, the electronic device draws a connection line, which connects the output connection point of the decomposed video component (Video Output connection point) and the input connection point of the video playback component (Input connection point 2). The developer selects components from the component toolkit 101 and drags them, one by one, into the component orchestration designer 102 according to the logic flow of the APP. In response to the user operations, the electronic device establishes the connections of all the components forming the APP in sequence.
In one possible implementation, the connection success identifier may be the folding or overlapping together of the two connection points at which the two components are connected. Reference may be made to the description of fig. 6B above, which is not repeated here.
In a possible implementation, if the data entity types output by the output connection point of the first component do not match the data entity types supported by the input connection point of the second component, the electronic device displays a prompt box, where the prompt box is used to prompt the user that the connection of the first component and the second component has failed. The content of the prompt box can take various forms; illustratively, it may be "connection failed", "data entity types do not match", or "the first component and the second component cannot be connected". The specific content of the prompt box is not limited herein.
S104: In response to a user operation selecting two or more connected components for compilation, the electronic device compiles the plurality of connected components into the program code of the first APP.
After the components that make up the first APP have all been connected, the user may click a control for generating program code from the connected components. The electronic device can detect the operation of the user clicking this control and, in response, compiles the plurality of connected components into the program code of the first APP. It will be appreciated that the program code of the first APP describes the logical functions and user interface of the first APP. When the program code of the first APP is installed in the user's electronic device, the electronic device may run the first APP.
Specifically, the electronic device may generate, through code generation engine 103, the program code of the first APP from the components connected in component orchestration designer 102.
In one possible implementation, the electronic device saves, in the component orchestration designer, an orchestration model map of the two or more components that complete the connection, together with first information in the orchestration model map; the first information comprises one or more of the IDs and names of the two or more components and the data entity types supported by the connection points of the two or more components.
In a possible implementation, the electronic device generates, in the code generation engine, the executable source code of the first APP according to the orchestration model map, the first information, and a component calling template, where the component calling template includes program code in a preset format. For details, reference may be made to the description of the component calling template above, which is not repeated here.
According to the APP development method provided by the embodiment of the application, a user can connect a plurality of components from the toolbox to form an APP. During the connection of multiple components, the electronic device needs to determine whether the data entity types supported by the connection points of the two connected components match; if so, the electronic device may display a connection success identifier. Finally, the electronic device generates the program code of the APP from the connected components. In this way, the user can quickly develop an APP from existing components, shortening the time needed to develop the APP.
An application scenario for an APP developed by the APP development method provided by the embodiment of the present application is described below. Fig. 10 shows an application scenario of an APP developed by the present application. As shown in fig. 10, an APP composed of a communication component, a storage component, a playback component, a camera component, a gesture input component, an audio playback component, and a media download acceleration component is taken as an example. User A has electronic devices such as a mobile phone, a TV, a PC, a router, a speaker, a watch, and a car machine, all connected to the same wireless network (e.g., the Wi-Fi network in the home). The user's mobile phone has installed an APP comprising the communication component, the storage component, the playback component, the camera component, the gesture input component, the audio playback component, and the media download acceleration component. The PC has the storage component installed. The router has the media download acceleration component installed. The speaker has the audio playback component installed. The watch has the gesture input component installed. The car machine has the camera component installed. The TV has the playback component installed. Then, when the user runs the APP on the mobile phone, the mobile phone may execute the function of the communication component; the TV may be selected to execute the playback component to play the video in the APP; the PC may be selected to run the storage component to store the data in the APP; the router may be selected to execute the media download acceleration component to accelerate downloading of the media files in the APP; the speaker may be selected to execute the audio playback function to play the audio in the APP; the watch may be selected to execute the gesture input component to input gestures controlling the APP; and the car machine may be selected to execute the camera component to shoot the images or videos required by the APP. Thus, the components constituting the APP can run on different electronic devices, and the user can use different electronic devices to perform the various functions of the APP. Each electronic device can thereby exert its own advantages (for example, the display screen of the TV is larger than that of the mobile phone, and the mobile phone is more convenient for communication), giving the user a better experience when using the APP.
The component toolkit, together with the composite components and APPs developed by developers according to the method provided by the embodiment of the application, can form a distributed component development ecology. Fig. 11 shows a diagram of the component development ecology provided by an embodiment of the present application. As shown in fig. 11, a component developer can query and invoke components from the component marketplace. Component developers can also summarize and refine components from existing applications, and can upload developed components, or components abstracted from applications, to the component marketplace. This forms a component development ecology. Component developers can conveniently develop APPs with the components in the component marketplace. The component toolkit 101 above may download components from the component marketplace, which allows the components in the component toolkit 101 to be expanded and updated.
In one possible implementation, the electronic device invokes and connects components according to a Domain Specific Language (DSL) input by the user. Fig. 12 shows a schematic diagram of the correspondence between the component DSL and the components. As shown in fig. 12, the DSL statement (Comp1; ConnPoint1) means that there is a component 1 comprising a connection point 1; this corresponds to component 1 and connection point 1 in the component diagram. The DSL statement (Comp2; ConnPoint2) means that there is a component 2 comprising a connection point 2; this corresponds to component 2 and connection point 2 in the component diagram. The DSL statement (Link entity1, entity2) indicates that connection point 1 supports data entity 1 and connection point 2 supports data entity 2. If connection point 1 and connection point 2 are to be connected, connection point 1 needs to check whether data entity 1 and data entity 2 are the same; likewise, connection point 2 also needs to check whether data entity 1 and data entity 2 are the same. Thus, when the user is very familiar with the component DSL, the user need not develop an APP according to the APP development method flow shown in fig. 9; the user may instead write DSL code in the electronic device to develop the APP. In this way, the user can develop the APP more efficiently.
For example, the user can directly connect the decomposed video component and the video playback component shown in fig. 6A by using the DSL, i.e., without performing the operations of fig. 4 to fig. 5G. The DSL code that connects the decomposed video component and the video playback component may be as follows:
ComponentDef
  video_splitter: // decomposed video component
    Connpoint1.entity1(video, MPEG4, 1920 × 1080) // connection point 1 of the decomposed video component and data entity 1 supported by this connection point
  video_player: // video playback component
    Connpoint2.entity2(video, MPEG4, 1920 × 1080) // connection point 2 of the video playback component and data entity 2 supported by this connection point
Link entity1, entity2 // establish a connection between connection point 1 and connection point 2
In the DSL code shown above, "video_splitter" may represent the decomposed video component in fig. 6A, and "Connpoint1" corresponds to output connection point 506 in fig. 6A. "entity1(video, MPEG4, 1920 × 1080)" indicates the data entity type supported by output connection point 506. "video_player" may represent the video playback component in fig. 6A, and "Connpoint2" corresponds to input connection point 602 in fig. 6A. "entity2(video, MPEG4, 1920 × 1080)" indicates the data entity type supported by input connection point 602.
In one possible implementation, the user may write the judgment condition in the DSL. For example, connection point 707 shown in fig. 7A may be connected to connection point 709 or to connection point 714. The user may set a judgment condition for connection point 707, the judgment condition being whether the language supported by connection point 707 is consistent with the local language of the electronic device. If so, the output of the decomposed video component is input to input connection point 714 of the audio playback component via output connection point 707. If not, the output of the decomposed video component is input to input connection point 709 of the speech translation component 710 via output connection point 707.
The judgment condition for connection point 707 may be implemented in the following DSL code:
Condition[1]: // condition 1
  Statement: if PropertyName.language contains local.language // if the language attribute supported by the connection point includes the local language
  Result: connect-branch1 // connecting line 1
Condition[2]: // condition 2
  Statement: if PropertyName.language not contains local.language // if the language attribute supported by the connection point does not include the local language
  Result: connect-branch2 // connecting line 2
In the DSL code above, connect-branch1 may represent the connection line between connection point 707 and connection point 714, and connect-branch2 may represent the connection line between connection point 707 and connection point 709. It will be appreciated that, before the DSL code above, the user may define which connection lines connect-branch1 and connect-branch2 specifically represent. The DSL here is only an example, and the application does not limit the DSL implementation.
When using the DSL, the user can write out only the connection points of the two components that are needed to establish the connection; input connection point 504 and output connection point 505 of the decomposed video component in fig. 6A, for example, need not be written out. Therefore, the user can connect components by inputting only a few lines of code, which saves the user's time.
An exemplary electronic device 100 provided in the following embodiments of the present application is next described.
Fig. 13 shows a schematic structural diagram of the electronic device 100.
The following describes an embodiment specifically by taking the electronic device 100 as an example. It should be understood that electronic device 100 may have more or fewer components than shown, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The electronic device 100 may include a processor 101, a memory 102, a transceiver 103, a display screen 104, sensors 105, and the like, wherein:
the processor 101 may be configured to obtain data entity types supported by connection points of the components, determine whether the data entity types supported by the connection points of the two components match, and search for a component matching the data entity type supported by the output connection point of the component according to a user operation.
In some embodiments, processor 101 may include one or more processing units, such as: the processor 101 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller may be, among other things, a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 101 for storing instructions and data. In some embodiments, the memory in the processor 101 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 101. If the processor 101 needs to use the instruction or data again, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 101, thereby increasing the efficiency of the system.
In some embodiments, processor 101 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 101 may include multiple sets of I2C buses. The processor 101 may be coupled to a touch sensor, a charger, a flash, a camera 193, etc. via different I2C bus interfaces, respectively. For example: the processor 101 may be coupled to the touch sensor via an I2C interface, such that the processor 101 and the touch sensor communicate via an I2C bus interface to implement the touch functionality of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 101 may include multiple sets of I2S buses. The processor 101 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 101 and the audio module 170. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module through the I2S interface, so as to receive the call through the bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit the audio signal to the wireless communication module through the PCM interface, so as to implement the function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 101 with the wireless communication module. For example: the processor 101 communicates with a bluetooth module in the wireless communication module through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module through the UART interface, so as to realize the function of playing music through the bluetooth headset.
The MIPI interface may be used to connect the processor 101 with peripheral devices such as the display screen 104, the camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, the processor 101 and the camera 193 communicate through a CSI interface to implement the capture functionality of the electronic device 100. The processor 101 and the display screen 104 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 101 with the camera 193, the display screen 104, the wireless communication module, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. And the earphone can also be used for connecting an earphone and playing audio through the earphone. The interface may also be used to connect other electronic devices, such as AR devices and the like.
It should be understood that the connection relationship between the modules according to the embodiment of the present invention is only illustrative, and is not limited to the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The memory 102 may be used to store computer-executable program code, which includes instructions. The processor 101 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the memory 102. The memory 102 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100, and the like. Further, the memory 102 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The transceiver 103 may be used to communicate with network devices, other electronic devices. The electronic device 100 may upload or download components via the transceiver 103. In some embodiments, the transceiver 103 may include a mobile communication module (not shown in the figures) and a wireless communication module (not shown in the figures), wherein:
the mobile communication module may provide a solution for wireless communication including 2G/3G/4G/5G, etc. applied to the electronic device 100. The mobile communication module may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit the electromagnetic waves to the modem processor for demodulation. The mobile communication module can also amplify the signal modulated by the modulation and demodulation processor and convert the signal into electromagnetic wave to be radiated by the antenna 1. In some embodiments, at least part of the functional modules of the mobile communication module may be provided in the processor 101. In some embodiments, at least part of the functional modules of the mobile communication module may be provided in the same device as at least part of the modules of the processor 101.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to a speaker, a receiver, etc.) or displays images or video through the display screen 104. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module or other functional modules, independent of the processor 101.
The wireless communication module may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module may be one or more devices integrating at least one communication processing module. The wireless communication module receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 101. The wireless communication module may also receive a signal to be transmitted from the processor 101, frequency-modulate and amplify the signal, and convert the signal into electromagnetic waves via the antenna 2 to radiate the electromagnetic waves.
The electronic device 100 implements display functions via the GPU, the display screen 104, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 104 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 101 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 104 may be used to display the graphics of components, as well as the toolbar, the component orchestration designer, and the like. The display screen 104 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini LED, a Micro LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and the like. In some embodiments, the electronic device 100 may include 1 or N display screens 104, N being a positive integer greater than 1.
The sensor 105 may be used to detect user operations, such as a user dragging a component, a user sliding a component, and so forth. The sensors 105 may include pressure sensors and touch sensors, wherein:
the pressure sensor is used for sensing a pressure signal and converting the pressure signal into an electric signal. In some embodiments, the pressure sensor may be disposed on the display screen 104. There are many types of pressure sensors, such as resistive pressure sensors, inductive pressure sensors, capacitive pressure sensors, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 104, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor. The electronic apparatus 100 may also calculate the touched position based on the detection signal of the pressure sensor. In some embodiments, the touch operations that are applied to the same touch position but different touch operation intensities may correspond to different operation instructions. For example: and when the touch operation with the touch operation intensity smaller than the first pressure threshold value acts on the short message application icon, executing an instruction for viewing the short message. And when the touch operation with the touch operation intensity larger than or equal to the first pressure threshold value acts on the short message application icon, executing an instruction of newly building the short message.
The touch sensor is also known as a "touch panel". The touch sensor may be disposed on the display screen 104; the touch sensor and the display screen 104 together form a touchscreen, also called a "touch screen". The touch sensor is used to detect a touch operation acting on or near it, and may pass the detected operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 104. In other embodiments, the touch sensor may be disposed on a surface of the electronic device 100 at a location different from the display screen 104.
Fig. 14 is a schematic block diagram of an electronic device 200 according to an embodiment of the present application. As shown in Fig. 14, the electronic device 200 may include a detection unit 201, a processing unit 202, and a display unit 203, where:
the detection unit 201 is configured to detect a user operation received by the electronic device 200, for example, the user dragging a component from the component toolbox, or dragging an input connection point of a second component onto an output connection point of a first component;
the processing unit 202 is configured to obtain, in response to the user operation detected by the detection unit 201, the data entity types supported by the connection points of the components, and to determine whether the output connection point of the first component matches the input connection point of the second component;
the display unit 203 is configured to display the graphics of the components, the data entity types supported by the components' connection points, and an indication that two components have been successfully connected.
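Read together, the three units form a simple pipeline: detect the user operation, check the connection-point match, and render the result. The following Java sketch reflects that reading only; all type and method names are assumptions.

    // Illustrative sketch of the Fig. 14 unit pipeline; names assumed.
    import java.util.Set;

    record ConnectionPoint(Set<String> supportedDataEntityTypes) {}
    record UserOperation(ConnectionPoint source, ConnectionPoint target) {}

    interface DetectionUnit  { UserOperation detect(); }                      // unit 201
    interface ProcessingUnit { boolean matches(UserOperation op); }           // unit 202
    interface DisplayUnit    { void showConnectionResult(boolean matched); }  // unit 203

    class ElectronicDevice200 {
        private final DetectionUnit detection;
        private final ProcessingUnit processing;
        private final DisplayUnit display;

        ElectronicDevice200(DetectionUnit d, ProcessingUnit p, DisplayUnit u) {
            detection = d; processing = p; display = u;
        }

        // Detect the drag, check the connection-point match, show the result.
        void handleUserOperation() {
            UserOperation op = detection.detect();
            display.showConnectionResult(processing.matches(op));
        }
    }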
Each unit and the other operations or functions of the electronic device 200 in this embodiment of the present application correspond to a process executed by the electronic device in the APP development method, and are not described here again.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in this application, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. For example, the described apparatus embodiments are merely illustrative: the division into units is only a logical division, and other divisions may be used in practice; a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual couplings, direct couplings, or communication connections may be indirect couplings or communication connections through interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as an independent product, they may be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the present application, or the part thereof contributing to the prior art, may be embodied in the form of a software product stored in a storage medium, including instructions that cause a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or some of the steps of the methods in the embodiments of the present application. The aforementioned storage medium includes any medium capable of storing program code, such as a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disc.
The above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced, and such modifications or replacements do not depart from the scope of the technical solutions of the embodiments of the present application.

Claims (24)

1. An application (APP) development platform, applied to an electronic device, wherein the APP development platform comprises:
a component toolbox, configured to provide components, where a component is an independent module implementing a specific function and consists of a component body and one or more connection points, each connection point supporting one or more data entity types;
a component orchestration designer, configured to display the components and to connect two or more components according to a user operation of connecting components; and
a code generation engine, configured to generate executable source code of a first APP from the two or more components connected in the component orchestration designer, the first APP comprising the two or more components;
wherein the component orchestration designer is further configured to:
display, in response to a user operation of selecting smart orchestration on a first component, a component call tree of the first component, where the component call tree shows a second component and/or a third component matching the first component, a fourth component and/or a fifth component matching the second component, and so on, up to an Nth component matching an Mth component, the Nth component being a component without an output connection point, and M and N being positive integers.
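The call tree of claim 1 can be read as a recursive expansion: list every component matching the first component, expand each of those in turn, and stop at components with no output connection point. The Java sketch below follows that reading only; the Component interface and all names are assumptions.

    // Illustrative sketch (names assumed): expand a component call tree
    // recursively, stopping at components with no output connection point.
    import java.util.ArrayList;
    import java.util.List;

    interface Component {
        String name();
        boolean hasOutputConnectionPoint();
        boolean canFeed(Component next);  // some output point of this component
                                          // matches an input point of `next`
    }

    class CallTreeNode {
        final Component component;
        final List<CallTreeNode> children = new ArrayList<>();
        CallTreeNode(Component component) { this.component = component; }
    }

    class CallTreeBuilder {
        private final List<Component> toolbox;  // all components known to the platform

        CallTreeBuilder(List<Component> toolbox) { this.toolbox = toolbox; }

        // Assumes the match relation is acyclic; a visited set would be
        // needed to guard against cyclic component graphs.
        CallTreeNode expand(Component root) {
            CallTreeNode node = new CallTreeNode(root);
            if (!root.hasOutputConnectionPoint()) return node;  // leaf: the "Nth" component
            for (Component candidate : toolbox) {
                if (root.canFeed(candidate)) {
                    node.children.add(expand(candidate));
                }
            }
            return node;
        }
    }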
2. The APP development platform of claim 1, wherein the two or more components comprise a first component and a second component, and the component orchestration designer is further configured to:
verify, in response to a user operation of connecting the first component and the second component, whether the first component and the second component match; and
if the first component matches the second component, connect a first connection point and a second connection point, where the first connection point is a connection point of the first component and the second connection point is a connection point of the second component.
3. The APP development platform of claim 2, wherein the matching of the first component and the second component comprises: the first data entity type being the same as the second data entity type, the first data entity type including the second data entity type, or the second data entity type including the first data entity type, where the first data entity type is the type of data entity supported by the first connection point and the second data entity type is the type of data entity supported by the second connection point.
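This match rule reduces to a set comparison over the data entity types supported by the two connection points: the sets are equal, or one contains the other. A minimal Java sketch of that comparison (all names assumed):

    // Illustrative sketch of the claim-3 match rule over supported
    // data entity types: equal sets, or one set contains the other.
    import java.util.Set;

    final class ConnectionMatcher {
        static boolean matches(Set<String> firstTypes, Set<String> secondTypes) {
            return firstTypes.equals(secondTypes)
                    || firstTypes.containsAll(secondTypes)
                    || secondTypes.containsAll(firstTypes);
        }
    }

For example, matches(Set.of("Image", "Text"), Set.of("Image")) would return true, since the first set contains the second.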
4. The APP development platform of claim 1, wherein the component toolbox is further configured to:
display, in response to a user operation of uploading components to or downloading components from a component marketplace, the names of the components uploaded or downloaded by the user.
5. The APP development platform of claim 2, wherein the component orchestration designer is further configured to:
and responding to the operation of the user for viewing the attribute of the first connecting point, and displaying the data entity types supported by the first connecting point.
6. The APP development platform of claim 5, wherein the component orchestration designer is specifically configured to:
display, in response to a user operation of connecting the first connection point and the second connection point, a connection line connecting the first connection point and the second connection point.
7. The APP development platform of claim 5, wherein the component orchestration designer is specifically configured to:
display, in response to a user operation of connecting the first connection point and the second connection point, the first connection point and the second connection point in an overlapping manner.
8. The APP development platform of claim 1, wherein the component orchestration designer is specifically configured to:
display the first component according to a user operation of selecting the first component from the component toolbox.
9. The APP development platform of claim 1, wherein displaying the component call tree of the first component specifically comprises:
and displaying the component call tree of the first component according to the function of the first component and/or the data entity type supported by the connection point of the first component.
10. The APP development platform of claim 9, wherein the component orchestration designer is further to:
and in response to the operation of deleting the second component by the user, deleting the second component in the component call tree.
11. The APP development platform of any one of claims 1-10, wherein the component orchestration designer is further configured to:
save an orchestration model diagram of the two or more connected components and first information in the orchestration model diagram, where the first information comprises one or more of the IDs and names of the two or more components and the data entity types supported by the connection points of the two or more components.
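The saved "first information" suggests a small persisted record per component: its ID, its name, and the data entity types of its connection points. The Java sketch below is one possible shape for that record; all field and type names are assumptions.

    // Illustrative sketch of the persisted orchestration model (names assumed).
    import java.util.List;
    import java.util.Set;

    record ComponentInfo(String id, String name, Set<String> connectionPointTypes) {}
    record Connection(String fromComponentId, String toComponentId) {}
    record OrchestrationModel(List<ComponentInfo> components, List<Connection> connections) {}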
12. The APP development platform of claim 11, wherein the code generation engine is specifically configured to:
and generating a source code which can be executed by the first APP according to the arrangement model diagram, the first information and a component calling template, wherein the component calling template comprises a program code with a preset format.
13. An APP development method, comprising:
in response to an operation of a user selecting a component from a component toolbox of an electronic device, the electronic device displaying the component in a component orchestration designer, where the component is an independent module implementing a specific function and consists of a component body and one or more connection points, each connection point supporting one or more data entity types;
in response to a user operation of connecting a plurality of components, the electronic device connecting two or more components in the component orchestration designer; and
in response to a user operation of selecting compilation of the two or more components, the electronic device generating, in a code generation engine, executable source code of a first APP from the two or more connected components;
wherein the method further comprises:
in response to a user operation of selecting smart orchestration on a first component, the electronic device displaying a component call tree of the first component in the component orchestration designer, where the component call tree shows a second component and/or a third component matching the first component, a fourth component and/or a fifth component matching the second component, and so on, up to an Nth component matching an Mth component, the Nth component being a component without an output connection point, and M and N being positive integers.
14. The method of claim 13, wherein the two or more components comprise a first component and a second component, and the electronic device connecting the two or more components in the component orchestration designer specifically comprises:
in response to a user operation of connecting the first component and the second component, the electronic device verifying, by the component orchestration designer, whether the first component and the second component match; and
if the first component matches the second component, the electronic device connecting a first connection point and a second connection point, where the first connection point is a connection point of the first component and the second connection point is a connection point of the second component.
15. The method of claim 14, wherein the matching of the first component and the second component comprises: the first data entity type being the same as the second data entity type, the first data entity type including the second data entity type, or the second data entity type including the first data entity type, where the first data entity type is the type of data entity supported by the first connection point and the second data entity type is the type of data entity supported by the second connection point.
16. The method of claim 13, further comprising:
in response to a user operation of uploading components to or downloading components from a component marketplace, the electronic device displaying, in the component toolbox, the names of the components uploaded or downloaded by the user.
17. The method of claim 14, further comprising:
in response to a user operation of viewing the attributes of the first connection point of the first component, the electronic device displaying, in the component orchestration designer, the data entity types supported by the first connection point.
18. The method of claim 14, wherein the electronic device connecting a first connection point and a second connection point comprises:
displaying a connection line connecting the first connection point and the second connection point; or
displaying the first connection point and the second connection point in an overlapping manner.
19. The method of claim 13, wherein the electronic device displaying the component call tree of the first component in the component orchestration designer specifically comprises:
and the electronic equipment displays the component calling tree of the first component in the component arrangement designer according to the function of the first component and/or the data entity type supported by the connection point of the first component.
20. The method of claim 19, further comprising:
and in response to the operation of deleting the second component by the user, deleting the second component in the component calling tree by the electronic equipment.
21. The method of any one of claims 13-20, further comprising:
the electronic device saving, in the component orchestration designer, an orchestration model diagram of the two or more connected components and first information in the orchestration model diagram, where the first information comprises one or more of the IDs and names of the two or more components and the data entity types supported by the connection points of the two or more components.
22. The method of claim 21, further comprising:
and the electronic equipment generates the executable source code of the first APP in the code generation engine according to the arrangement model diagram, the first information and the component calling template, wherein the component calling template comprises a program code with a preset format.
23. An electronic device, comprising: one or more processors and one or more memories, the one or more memories being coupled to the one or more processors and storing computer program code comprising computer instructions which, when executed on the one or more processors, cause the electronic device to perform the APP development method of any one of claims 13-22.
24. A computer-readable storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the APP development method of any one of claims 13-22.
CN202111556609.3A 2020-06-20 2020-06-20 APP development platform, APP development method and electronic equipment Active CN114371844B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111556609.3A CN114371844B (en) 2020-06-20 2020-06-20 APP development platform, APP development method and electronic equipment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111556609.3A CN114371844B (en) 2020-06-20 2020-06-20 APP development platform, APP development method and electronic equipment
CN202010569877.8A CN113821203A (en) 2020-06-20 2020-06-20 APP development platform, APP development method and electronic equipment

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202010569877.8A Division CN113821203A (en) 2020-06-20 2020-06-20 APP development platform, APP development method and electronic equipment

Publications (2)

Publication Number Publication Date
CN114371844A CN114371844A (en) 2022-04-19
CN114371844B true CN114371844B (en) 2022-09-23

Family

ID=78924852

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202010569877.8A Pending CN113821203A (en) 2020-06-20 2020-06-20 APP development platform, APP development method and electronic equipment
CN202111556609.3A Active CN114371844B (en) 2020-06-20 2020-06-20 APP development platform, APP development method and electronic equipment

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202010569877.8A Pending CN113821203A (en) 2020-06-20 2020-06-20 APP development platform, APP development method and electronic equipment

Country Status (2)

Country Link
CN (2) CN113821203A (en)
WO (1) WO2021254167A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113821203A (en) * 2020-06-20 2021-12-21 华为技术有限公司 APP development platform, APP development method and electronic equipment
CN114461208A (en) * 2022-01-06 2022-05-10 深圳安巽科技有限公司 Software automation arrangement method and system

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106557314A (en) * 2016-10-19 2017-04-05 深圳智慧林网络科技有限公司 Applied software development method and device
CN107844299A (en) * 2017-12-01 2018-03-27 浪潮软件股份有限公司 A kind of implementation method of Web application development tools

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7716636B2 (en) * 2005-01-10 2010-05-11 Microsoft Corporation User interface for accessing and loading software components of a development set on a computer while other software components of the set are loading
WO2007088602A1 (en) * 2006-02-01 2007-08-09 Fujitsu Limited Object relation display program and object relation display method
US9619304B2 (en) * 2008-02-05 2017-04-11 Adobe Systems Incorporated Automatic connections between application components
CN101944017B (en) * 2009-07-09 2014-03-12 华为技术有限公司 Method and device for producing Widget
CN102087597B (en) * 2011-02-14 2014-08-20 浪潮通信信息系统有限公司 J2EE and component set-based visualized development platform
CN102799430B (en) * 2012-07-02 2015-07-15 电子科技大学 Mobile internet business-oriented off-line visual business development generator
US20140282371A1 (en) * 2013-03-14 2014-09-18 Media Direct, Inc. Systems and methods for creating or updating an application using a pre-existing application
US9947140B2 (en) * 2015-09-15 2018-04-17 Sartorius Stedim Biotech Gmbh Connection method, visualization system and computer program product
CN105512304B (en) * 2015-12-11 2019-03-26 西安道同信息科技有限公司 It is online to generate internet application method and system integration method and support platform
CN110187875A (en) * 2019-05-28 2019-08-30 深圳市智慧郎数码科技有限公司 A kind of component visual melts forwarding method
CN113821203A (en) * 2020-06-20 2021-12-21 华为技术有限公司 APP development platform, APP development method and electronic equipment

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106557314A (en) * 2016-10-19 2017-04-05 深圳智慧林网络科技有限公司 Applied software development method and device
CN107844299A (en) * 2017-12-01 2018-03-27 浪潮软件股份有限公司 A kind of implementation method of Web application development tools

Also Published As

Publication number Publication date
WO2021254167A1 (en) 2021-12-23
CN114371844A (en) 2022-04-19
CN113821203A (en) 2021-12-21

Similar Documents

Publication Publication Date Title
CN110597512B (en) Method for displaying user interface and electronic equipment
WO2020211709A1 (en) Method and electronic apparatus for adding annotation
WO2020238356A1 (en) Interface display method and apparatus, terminal, and storage medium
WO2021129253A1 (en) Method for displaying multiple windows, and electronic device and system
WO2021082835A1 (en) Method for activating function and electronic device
CN114371844B (en) APP development platform, APP development method and electronic equipment
WO2023130921A1 (en) Method for page layout adapted to multiple devices, and electronic device
WO2022057852A1 (en) Method for interaction between multiple applications
WO2021169466A1 (en) Information collection method, electronic device and computer-readable storage medium
CN112015943A (en) Humming recognition method and related equipment
CN112116690A (en) Video special effect generation method and device and terminal
CN111125602A (en) Page construction method, device, equipment and storage medium
CN115016697A (en) Screen projection method, computer device, readable storage medium, and program product
CN114115870A (en) User interface implementation method and device
CN109597620B (en) Data processing method, device, equipment and storage medium
WO2020181505A1 (en) Input method candidate content recommendation method and electronic device
WO2022057889A1 (en) Method for translating interface of application, and related device
CN115700461A (en) Cross-device handwriting input method and system in screen projection scene and electronic device
CN114595449A (en) Safety scanning method and device
CN115941674B (en) Multi-device application connection method, device and storage medium
WO2023241544A1 (en) Component preview method and electronic device
CN116743908B (en) Wallpaper display method and related device
WO2023179454A1 (en) Service calling method and electronic device
WO2022089276A1 (en) Collection processing method and related apparatus
CN117762537A (en) Card sharing method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant