WO2021254167A1 - APP development platform, APP development method, and electronic device (App开发平台、app开发方法及电子设备) - Google Patents

APP development platform, APP development method, and electronic device (App开发平台、app开发方法及电子设备)

Info

Publication number
WO2021254167A1
WO2021254167A1 (PCT/CN2021/098215; CN2021098215W)
Authority
WO
WIPO (PCT)
Prior art keywords
component
connection point
user
data entity
components
Prior art date
Application number
PCT/CN2021/098215
Other languages
English (en)
French (fr)
Inventor
胡绍平
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2021254167A1 (published in Chinese)

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 8/00 - Arrangements for software engineering
    • G06F 8/30 - Creation or generation of source code
    • G06F 8/36 - Software reuse
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 8/00 - Arrangements for software engineering
    • G06F 8/40 - Transformation of program code
    • G06F 8/41 - Compilation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 8/00 - Arrangements for software engineering
    • G06F 8/70 - Software maintenance or management
    • G06F 8/76 - Adapting program code to run in a different environment; Porting

Definitions

  • the present application relates to the field of electronic technology, and in particular to an APP development platform, an APP development method, and an electronic device.
  • various electronic devices can install applications (APPs), and developers need to develop various APPs for these electronic devices.
  • at present, developers need to write a separate set of program code for electronic devices with different operating systems or different device forms.
  • for example, developers need to write one set of program code each for Android phones, iOS phones, computers, and so on, and code written for one electronic device cannot be directly reused on an electronic device with a different operating system.
  • This application provides an APP development platform, APP development method, and electronic equipment.
  • users can select components in the APP development platform and connect them, and the APP development platform can verify whether the components selected by the user can be connected.
  • the user can then choose to compile the successfully connected components into the executable source code of an APP in the APP development platform. In this way, users can develop an APP quickly.
  • this application provides an APP development platform that is applied to electronic devices.
  • the APP development platform includes a component toolbox, a component layout designer, and a code generation engine.
  • the component toolbox is used to provide components.
  • the components are independent modules that realize specific functions.
  • the components are composed of a component body and one or more connection points.
  • each connection point supports one or more data entity types. The component orchestration designer is used to display components and to connect two or more components according to the user's operation of connecting components. The code generation engine is used to generate the executable source code of a first APP from the two or more components connected in the component orchestration designer.
  • the first APP includes the two or more components.
  • the data entity is the data that the connection point can support.
  • the data entity type is the type of the data entity.
  • Data entity types can include audio, video, image, text, and other types.
  • the component orchestration designer can create components and display the composition of components.
  • a component can consist of a component body and one or more connection points.
  • the user connects a first component and a second component in the interface of the component orchestration designer; in response to this operation, the component orchestration designer connects the first component and the second component. The user then selects, in the component orchestration designer, the operation of compiling the connected components,
  • and the code generation engine compiles the multiple connected components displayed in the component orchestration designer into executable code of the APP.
  • the component orchestration designer is also used to: in response to the user's operation of connecting the first component and the second component, verify whether the first component and the second component match; if they match, connect a first connection point and a second connection point, where the first connection point is a connection point of the first component and the second connection point is a connection point of the second component. In this way, a data format mismatch between the first component and the second component can be avoided.
  • the first component matching the second component includes: the first data entity type is the same as the second data entity type, the first data entity type includes the second data entity type, or the second data entity type includes the first data entity type, where the first data entity type is the type of data entity supported by the first connection point and the second data entity type is the type of data entity supported by the second connection point.
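A minimal sketch of this matching rule; the function name `can_connect` and the entity type strings are illustrative assumptions, not terminology from the application:

```python
def can_connect(first_types: set, second_types: set) -> bool:
    """Two connection points match when their supported data entity type
    sets are identical, or one set includes the other."""
    return (
        first_types == second_types
        or first_types >= second_types   # first includes second
        or second_types >= first_types   # second includes first
    )

# A decoder output supporting two video formats can connect to a player
# input supporting one of them:
print(can_connect({"video/MPEG2", "video/MPEG4"}, {"video/MPEG4"}))  # True
print(can_connect({"audio/MP3"}, {"video/MPEG4"}))                   # False
```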
  • the component toolbox is also used to display the name of the component uploaded or downloaded by the user in response to the user's operation of uploading the component or downloading the component from the component market.
  • the APP platform can provide users with more components and improve user experience.
  • the component layout designer is specifically used to display a connection line connecting the first connection point and the second connection point. In this way, the user can be prompted that the first connection point and the second connection point are successfully connected.
  • the component layout designer is specifically used to display the first connection point and the second connection point in an overlapping manner. In this way, the user can be prompted that the first connection point and the second connection point are successfully connected.
  • the component layout designer is specifically used to display the first component according to the user's operation of selecting the first component from the component toolbox.
  • the component orchestration designer is also used to display a component call tree of the first component in response to the user's operation of selecting intelligent orchestration for the first component. The component call tree is used to display the components matching the first component,
  • namely a second component and/or a third component matching the first component, a fourth component and/or a fifth component matching the second component, and so on up to an Nth component matching an Mth component, where the Nth component has no output connection point and M and N are positive integers.
  • displaying the component call tree of the first component is specifically: displaying the component call tree of the first component according to the function of the first component and/or the data entity types supported by the connection points of the first component. In this way, components matching the first component can be recommended more accurately.
  • the component orchestration designer is also used to delete components in the component call tree in response to the user's operation to delete the components.
  • the component orchestration designer is also used to: save an orchestration model diagram of the two or more connected components, and first information in the orchestration model diagram; the first information includes one or more of the IDs and names of the two or more components, and the data entity types supported by the connection points of the two or more components.
  • the code generation engine is specifically used to generate the executable source code of the first APP according to the layout model diagram, the first information and the component calling template, and the component calling template includes the program code in a preset format.
  • the component call template is a code template preset according to the type of component and the attributes of its connection points, and encapsulates common interface-call code logic.
  • the present application provides an APP development method.
  • the method includes: in response to a user's operation of selecting a component from the component toolbox of the electronic device, the electronic device displays the component in the component orchestration designer, the component being an independent module that realizes a specific function; the component is composed of a component body and one or more connection points, and each connection point supports one or more data entity types. In response to the user's operation of connecting multiple components, the electronic device connects two or more components in the component orchestration designer. In response to the user's operation of selecting the two or more components for compilation, the electronic device generates the executable source code of the first APP from the two or more connected components in the code generation engine.
  • the data entity is the data that the connection point can support.
  • the data entity type is the type of the data entity.
  • Data entity types can include audio, video, image, text, and other types.
  • the user can use the existing components to connect to the APP without rewriting the code of the APP, which can save the user's time to develop the APP.
  • the electronic device connecting two or more components in the component orchestration designer specifically includes: in response to the user's operation of connecting the first component and the second component, the electronic device verifies, through the component orchestration designer, whether the first component and the second component match; if they match, the electronic device connects the first connection point and the second connection point, where the first connection point is a connection point of the first component and the second connection point is a connection point of the second component. In this way, it can be ensured that the two components connected by the user match.
  • the first component matching the second component includes: the first data entity type is the same as the second data entity type, the first data entity type includes the second data entity type, or the second data entity type includes the first data entity type, where the first data entity type is the type of data entity supported by the first connection point and the second data entity type is the type of data entity supported by the second connection point.
  • the method further includes: in response to the user's operation of viewing the attributes of the first connection point of the first component, the electronic device displays the data entity type supported by the first connection point in the component layout designer. In this way, the user can know the attribute of the first connection point of the first component to facilitate subsequent operations of the user, for example, searching for a second component that matches the first component according to the attribute of the first connection point.
  • the electronic device connecting the first connection point and the second connection point includes: displaying a connection line connecting the first connection point and the second connection point; or displaying the first connection point and the second connection point overlapping. In this way, the user can be prompted that the connection between the first connection point and the second connection point has been established.
  • the method further includes: in response to the user choosing to perform an intelligent orchestration operation on the first component, the electronic device displays the component call tree of the first component in the component orchestration designer. The component call tree is used to show the second component and/or third component matching the first component, the fourth component and/or fifth component matching the second component, and so on up to the Nth component matching the Mth component, where the Nth component is a component without output connection points and M and N are positive integers.
  • the electronic device displaying the component call tree of the first component in the component orchestration designer specifically includes: showing the component call tree of the first component according to the function of the first component and/or the data entity types supported by the connection points of the first component. In this way, components matching the first component can be recommended more accurately.
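The call-tree construction described above can be sketched as follows; the toy component registry, the overlap-based matching rule, and all names are simplifying assumptions for illustration only:

```python
# Toy registry of components with the data entity types their input and
# output connection points support (illustrative values only).
REGISTRY = {
    "decode video": {"in": {"video/MPEG"}, "out": {"video/raw"}},
    "play video":   {"in": {"video/raw"}, "out": set()},
    "play audio":   {"in": {"audio/MP3"}, "out": set()},
}

def matches(out_types, in_types):
    # Simplified rule: the points match if they share any entity type.
    return bool(out_types & in_types)

def call_tree(name):
    """Recursively attach every component whose input connection point
    matches this component's output; recursion stops at components
    that have no output connection points."""
    out_types = REGISTRY[name]["out"]
    return {
        other: call_tree(other)
        for other, spec in REGISTRY.items()
        if other != name and matches(out_types, spec["in"])
    }

print(call_tree("decode video"))  # {'play video': {}}
```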
  • the method further includes: in response to the user's operation of deleting a component, the electronic device deletes the component from the component call tree. In this way, the user can delete components in the component call tree that are not related to the first APP.
  • the method further includes: the electronic device saves, in the component orchestration designer, the orchestration model diagram of the two or more connected components, and the first information in the orchestration model diagram; the first information includes one or more of the IDs and names of the two or more components, and the data entity types supported by the connection points of the two or more components.
  • the method further includes: in the code generation engine, the electronic device generates the executable source code of the first APP according to the orchestration model diagram, the first information, and the component call template; the component call template includes program code in a preset format.
  • the component call template is a preset code template according to different types of components and the attributes of the connection point, which encapsulates the common interface call code logic.
  • this application provides an electronic device, including one or more processors and one or more memories; the one or more memories are respectively coupled with the one or more processors and are used to store computer program code, the computer program code including computer instructions; when the computer instructions run on the processors, the electronic device is caused to execute the APP development method in any one of the possible implementations of any of the foregoing aspects.
  • an embodiment of the present application provides a computer storage medium, including computer instructions, which, when run on an electronic device, cause the electronic device to execute the APP development method in any one of the possible implementations of any of the above aspects.
  • the embodiments of the present application provide a computer program product, which when the computer program product runs on a computer, causes the computer to execute the APP development method in any one of the possible implementations of any of the above aspects.
  • FIG. 1A is a schematic diagram of a video decoding component provided by an embodiment of this application.
  • FIG. 1B is a schematic diagram of a composite component provided by an embodiment of this application.
  • FIG. 2 is a schematic diagram of the architecture of an APP development platform 10 provided by an embodiment of this application.
  • FIG. 3 is a schematic diagram of a component toolbox 101 provided by an embodiment of this application.
  • FIG. 4 is a schematic diagram of a user interface provided by an embodiment of this application.
  • FIGS. 5A-5G are schematic diagrams of user interfaces provided by embodiments of this application.
  • FIGS. 6A-6B are schematic diagrams of user interfaces provided by embodiments of this application.
  • FIGS. 7A-7B are schematic diagrams of user interfaces provided by an embodiment of this application.
  • FIGS. 8A-8D are schematic diagrams of user interfaces provided by an embodiment of this application.
  • FIG. 9 is a schematic flowchart of an APP development method provided by an embodiment of this application.
  • FIG. 10 is a schematic diagram of an application scenario provided by an embodiment of this application.
  • FIG. 11 is a schematic diagram of a component development ecosystem provided by an embodiment of this application.
  • FIG. 12 is a schematic diagram of the correspondence between the domain description language of a component and the component graph, provided by an embodiment of this application.
  • FIG. 13 is a schematic diagram of the hardware structure of an electronic device provided by an embodiment of this application.
  • FIG. 14 is a schematic block diagram of an electronic device provided by an embodiment of this application.
  • the terms “first” and “second” are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly specifying the number of indicated technical features. Therefore, a feature defined with “first” or “second” may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present application, unless otherwise specified, “multiple” means two or more. In addition, the terms “including” and “having” and any variations thereof mentioned in the description of the present application are intended to cover non-exclusive inclusions.
  • a process, method, system, product, or device that includes a series of steps or units is not limited to the listed steps or units, but optionally includes other steps or units that are not listed, or optionally also includes other steps or units inherent to the process, method, product, or device.
  • words such as “exemplary” or “for example” are used as examples, illustrations, or explanations. Any embodiment or design solution described as “exemplary” or “for example” in the embodiments of the present application should not be construed as being preferred over or more advantageous than other embodiments or design solutions.
  • words such as “exemplary” or “for example” are used to present related concepts in a specific manner.
  • a component is an independent module composed of certain business logic and related data. Components can perform specific functions. For example, the video playback component can complete the function of playing the video, and the video decoding component can complete the function of decoding the video.
  • An APP can be composed of one or more components. Components can have a user interface or no user interface. For example, the video decoding component may not have a user interface, that is, the process of decoding the video by the electronic device through the video decoding component may not be displayed in the user interface, and the user may not perceive it.
  • a single component can run independently in the user equipment. For example, a video playing component can be installed in the TV, and then the TV can play the video through the playing video component.
  • Each component includes a component body and several connection points (input connection points or output connection points).
  • a component communicates data with other components through connection points.
  • the component includes at least one input connection point or output connection point. It is understandable that some components may also have no connection points, that is, such components have no input connection points and output connection points.
  • the components that can be used to form an APP have at least one connection point.
  • the composition of a component may be as shown in FIG. 1A.
  • Figure 1A shows a graphical example of a video decoding component.
  • the video decoding component can include a component body (decode video 10a in FIG. 1A), an input connection point (Input ConnectPoint 10b in FIG. 1A), and two output connection points (Output ConnectPoint 10c and output connection point 10d in FIG. 1A).
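The composition just described, a component body plus input and output connection points, can be modeled as a small data structure; the class and field names below are illustrative assumptions, not terminology from the application:

```python
from dataclasses import dataclass, field

@dataclass
class ConnectPoint:
    name: str
    direction: str       # "input" or "output"
    entity_types: set    # data entity types this point supports

@dataclass
class Component:
    body: str                                   # e.g. "decode video"
    points: list = field(default_factory=list)  # its connection points

# The video decoding component of FIG. 1A: one body, one input
# connection point, and two output connection points.
decode_video = Component(
    body="decode video",
    points=[
        ConnectPoint("10b", "input", {"video/MPEG"}),
        ConnectPoint("10c", "output", {"audio/raw"}),
        ConnectPoint("10d", "output", {"video/raw"}),
    ],
)
```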
  • a connection point is the external interface of the component, that is, an agent that realizes the input or output function and is responsible for interface protocol checking, negotiation and docking, data transmission, and so on with other connection points.
  • a connection point is a way for a component to receive input data from another component, and it is also a way for a component to output data to another component.
  • Connection points include input connection points or output connection points.
  • each connection point can support one or more types of data entities Entity.
  • the data entities supported by the connection point can be images, audio, video, text, and so on.
  • a data entity is the data that the connection point can support.
  • the data entity type is the type of the data entity.
  • Table 1 exemplarily shows some types of data entities.
  • videos can be classified according to different video coding standards.
  • a video coding standard may include a moving pictures experts group (moving pictures experts group, MPEG) compression coding standard and a high efficiency video coding (high efficiency video coding, HEVC) standard.
  • MPEG compression coding standards can include several coding standards such as MPEG, MPEG2 and MPEG4.
  • HEVC may include the coding standard H265. Audio files can have multiple formats.
  • for example, audio files in the MPEG audio layer III (MP3) format, audio files in the waveform audio (WAVE) format, and audio files in the Free Lossless Audio Codec (FLAC) format.
  • the types of data entities can also include text, pictures, compressed files, documents, streams, data sets, and so on.
  • Streams can include video streams, audio streams, or composite streams.
  • the composite stream can contain video and audio streams.
  • the data set can include database tables. Among them, each type can also include some subtypes, formats or some other attributes.
  • the video data entity type can have subtypes such as MPEG, MPEG2, and MPEG4, and can also include attributes such as width, height, subtitles, language, and so on.
  • the data entity type supported by the input connection point 10b in FIG. 1A may be video in MPEG format, with a resolution of 1920*1080, the language Chinese, and so on.
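That kind of typed, attributed data entity can be sketched as follows; the class name and attribute keys are illustrative assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class DataEntityType:
    kind: str                   # e.g. "video", "audio", "text"
    subtype: str = ""           # e.g. "MPEG", "MPEG2", "MP3"
    attributes: dict = field(default_factory=dict)

# The example above: connection point 10b supports MPEG video at
# 1920*1080 with Chinese as the language.
point_10b_type = DataEntityType(
    kind="video",
    subtype="MPEG",
    attributes={"width": 1920, "height": 1080, "language": "Chinese"},
)
```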
  • the composite component is composed of multiple components and can perform multiple functions.
  • the composite component 100A may be composed of a video decoding component and a video playback component.
  • the video decoding component includes a main body (decoded video 10a), an input connection point (input connection point 10b) and two output connection points (output connection point 10c and output connection point 10d).
  • the video playback component includes a main body (play video 10e) and an input connection point (input connection point 10f).
  • the output connection point 10d of the video decoding component is connected to the input connection point 10f of the video playback component.
  • the composite component 100A can realize the functions of video decoding and video playback.
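The composite component 100A described above (the decoder's output connection point 10d wired to the player's input connection point 10f) can be sketched like this; the classes and method names are illustrative assumptions:

```python
class Component:
    def __init__(self, body, inputs=(), outputs=()):
        self.body = body             # component body, e.g. "decode video"
        self.inputs = list(inputs)   # input connection point names
        self.outputs = list(outputs) # output connection point names

class CompositeComponent:
    """A component assembled from other components, plus the links
    between their connection points."""
    def __init__(self, name):
        self.name = name
        self.components = []
        self.links = []              # (output point, input point) pairs

    def add(self, component):
        self.components.append(component)

    def connect(self, out_point, in_point):
        self.links.append((out_point, in_point))

decoder = Component("decode video", inputs=["10b"], outputs=["10c", "10d"])
player = Component("play video", inputs=["10f"])

composite = CompositeComponent("100A")
composite.add(decoder)
composite.add(player)
composite.connect("10d", "10f")   # decode, then play, as in FIG. 1B
```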
  • developers can use the "shortcut instruction" application on the electronic device to form a new APP from existing shortcut instructions or custom shortcut instructions. That is, developers can combine multiple operations between multiple applications in the terminal through the "shortcut instruction" application to create an APP.
  • application A in the electronic device has the operation of taking pictures.
  • Application B in the electronic device has the operation of converting a photo into a PDF document with one click. Developers can use "shortcut instructions" to combine the camera operation in Application A and the one-click conversion of photos into PDF documents in Application B into a new APP. In this way, developers can develop a new APP more quickly.
  • however, the method proposed in this implementation can only combine some simple operations in the electronic device.
  • when developers need to develop an APP with more functions and more complex business logic, the method proposed in the above implementation can hardly meet the developers' needs.
  • this application provides an APP development platform.
  • the APP development platform includes component toolbox, component orchestration designer and code generation engine.
  • this application proposes an APP development method.
  • the method includes: the electronic device detects a user's operation of selecting a component from the component toolbox, and an APP development platform is installed in the electronic device.
  • the component orchestration designer can create a component and display the composition of the component.
  • the component can consist of a component body and one or more connection point components.
  • the electronic device detects the user's operation of connecting the first component and the second component in the interface of the component orchestration designer; in response to this operation, the component orchestration designer determines whether the data entity type supported by the output connection point of the first component matches the data entity type supported by the input connection point of the second component. If they match, the component orchestration designer can display an indicator of successful connection.
  • the electronic device detects that the user selects the operation of compiling the connected components in the component layout designer, and in response to this operation, the code generation engine compiles the multiple connected components displayed in the component layout designer into executable code of the APP. In this way, developers can quickly combine multiple existing components to form an APP that users need to develop, without writing program codes one by one to realize the functional logic of the APP.
  • the developer may also be referred to as a user, and the user may develop an APP or component in the electronic device provided in the present application.
  • FIG. 2 shows a schematic diagram of the architecture of the APP development platform 10 provided by an embodiment of the present application.
  • the APP development platform 10 provided by the embodiment of the present application includes: a component toolbox 101, a component orchestration designer 102, and a code generation engine 103.
  • the component toolbox 101 is used to present components.
  • the components in the component toolbox 101 can be classified according to their functions. Users can download components from the component market and save them in the component toolbox 101. Users can also upload components developed and designed by themselves to the component market.
  • the component toolbox 101 may be as shown in FIG. 3, and FIG. 3 shows the component toolbox 101 provided by an embodiment of the present application.
  • the component toolbox 101 may be displayed in a display area 1000 of a user interface (not shown).
  • the display area 1000 may include a control 1001, a control 1002, a control 1003, and a control 1004.
  • the user can search for components in the component toolbox 101 through the control 1001.
  • the control 1002 and the control 1003 are used to expand or collapse a certain type of component.
  • the control 1002 in FIG. 3 is used to expand or collapse components of common component classes.
  • the control 1003 is used to expand or collapse the components of the audio-visual playback category.
  • the component toolbox 101 includes components of common component types and components of audiovisual playback type. It can be understood that the components in the component toolbox 101 are not limited to commonly used components and audiovisual playback components.
  • the component toolbox 101 may also contain other types of components, for example, document processing components, image processing components, and so on. It is understandable that the embodiment of the present application does not limit the specific user interface of the component toolbox 101.
  • the user interface of the component toolbox 101 may have more or fewer controls than in FIG. 3.
  • users can classify components according to their own usage habits. For example, a user can add the play-video component to the category of commonly used components, or group the ticket reservation component, the hotel reservation component, and the food delivery ordering component into a reservation-and-payment category, with the specific category name defined by the user. It is understandable that the component classification in the component toolbox 101 may differ from user to user. For example, FIG. 3 shows components of the commonly used class (such as the send-short-message component and the dialing component) and the audiovisual playback class (such as the play-video component, the decode-video component, and the decode-audio component). The embodiment of the present application does not limit how components are classified; the description below takes classification of components by function as an example.
  • the component orchestration designer 102 is the core tool for component orchestration and development. Specifically, the developer can select, lay out, and connect multiple components in the orchestration designer, select connection point data entity types, set business logic, form new composite components, and so on.
  • the component orchestration designer 102 can specifically be used for:
  • the component orchestration designer 102 responds to the user dragging a component from the component toolbox 101 to the component orchestration designer 102, and the component orchestration designer 102 can present the component to the user.
  • the developer can select a component in the component toolbox 101, and then the component layout designer 102 reads the file of the component.
  • the component layout designer 102 obtains the composition of the component according to the file of the component, draws the component, and presents it in the component layout designer 102.
  • Connection point verification: verify that the data entity types of the connection points of the two components are the same.
  • Component connection: connect two or more components according to user operations.
  • the developer can connect two or more components in the component orchestration designer 102 according to the business logic of the required APP or the business logic that conforms to the component.
  • the component orchestration designer 102 can also verify whether the data types supported by the connection points of the two components are consistent. For example, when the output connection point of the video playback component and the input connection point of the video decoding component support the same data entity type, the connection point between the two components can be connected.
  • the component orchestration designer 102 can automatically associate and recommend other components that can be docked with the current component, according to the function of the component and the data entity types supported by the component's connection points, for developers to choose from; or it can automatically generate a docking orchestration model according to the orchestration strategy.
  • the component layout designer 102 may display all components that can be connected to the connection point of the first component according to user operations. The user can select the required component among the components displayed by the component layout designer 102.
  • the component layout designer 102 can display the data types supported by the connection point in response to the user's operation of viewing the data types supported by the connection point.
  • the component orchestration designer 102 can also be used to save a plurality of component orchestration model diagrams and all the information in the model diagrams, for example, the IDs and names of all components in the model diagram, the data entity types of all connection points, and the connection attributes of the connection points (including the data entity types supported by the two connected connection points, and the data transmission method: either direct transmission, or conditional transmission in which data is transmitted only when a judgment condition is met).
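The stored model information described above (component IDs and names, connection point data entity types, and connection attributes including the data transmission method) could be represented as in the following sketch. The JSON schema and all field names are assumptions for illustration, not the platform's actual storage format.

```python
import json

# Hypothetical persisted form of one orchestration model diagram: the
# components, and one connection with its connection point attributes.
model_diagram = {
    "components": [
        {"id": "c1", "name": "disassemble_video"},
        {"id": "c2", "name": "play_video"},
    ],
    "connections": [
        {
            "from": {"component": "c1", "point": "output_506",
                     "entity": {"type": "video", "subtype": "MPEG4", "format": "1920*1080"}},
            "to":   {"component": "c2", "point": "input_602",
                     "entity": {"type": "video", "subtype": "MPEG4", "format": "1920*1080"}},
            # "direct": transmit unconditionally; "conditional": transmit only
            # when a judgment condition is met.
            "transmission": "direct",
        }
    ],
}

saved = json.dumps(model_diagram)   # persisted alongside the model diagram
restored = json.loads(saved)        # later reloaded by the code generation engine
```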
  • the code generation engine 103 is used to generate executable source code from the orchestrated composite components and APPs. Specifically, the code generation engine 103 uses the completed orchestration model diagrams of multiple components stored in the component orchestration designer 102 and all the information in the model diagrams, for example, the IDs and names of all components in the model diagram, the data entity types of all connection points, and the connection attributes of the connection points, together with the component call templates, to generate executable source code.
  • the component calling template is a code template preset according to different types of components and the attributes of the connection point, encapsulating the general interface calling code logic.
  • the component calling template may include the following program code:
  • the code generation engine can replace the content of <> in the above code according to the specific generated APP.
  • "ComposePlayer" in <ComposePlayer> can be replaced with the name of a specific APP (such as 123player).
  • "VideoSplit" in <VideoSplit> can be replaced with the component name required to actually generate the APP.
  • "ConnectPoint1" in <ConnectPoint1> can be replaced with the actual connection point.
  • "MPEG" and "640_480" in <entity.MPEG> and <entity.640_480> can be replaced with actual data entity types.
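As a sketch of the substitution step described above, the placeholders in angle brackets could be filled in as follows. The template text and the concrete values (123player, DisassembleVideo, and so on) are illustrative assumptions; only the placeholder names come from the examples above.

```python
import re

# Illustrative template text containing the placeholders named above.
TEMPLATE = (
    'app = <ComposePlayer>()\n'
    'comp = app.create("<VideoSplit>")\n'
    'comp.connect("<ConnectPoint1>", subtype="<entity.MPEG>", fmt="<entity.640_480>")\n'
)

def render_template(template: str, values: dict) -> str:
    """Replace every <placeholder> with its concrete value for the generated APP."""
    return re.sub(r"<([^<>]+)>", lambda m: values[m.group(1)], template)

code = render_template(TEMPLATE, {
    "ComposePlayer": "123player",          # name of the specific APP
    "VideoSplit": "DisassembleVideo",      # actual component name
    "ConnectPoint1": "output_506",         # actual connection point
    "entity.MPEG": "MPEG4",                # actual data entity subtype
    "entity.640_480": "1920_1080",         # actual data entity format
})
```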
  • the APP development platform in this application can be used as a tool for developing an APP alone, or it can be a module in a development tool. There is no limitation here.
  • Figures 4 to 8D exemplarily show the process of developing a video APP.
  • the embodiments of this application are described by taking the electronic device as a computer as an example.
  • FIG. 4 shows a user interface 400.
  • the user interface 400 is used for the user to create an APP.
  • the user interface 400 may include an input box 401, an input box 402, an input box 403, an input box 405, and a control 404, a control 406, a control 407, a control 408, a control 409, and a control 410.
  • the input box 401 is used to input the name of the project created by the user, that is, the name of the APP, such as "MediaPlayerAPP".
  • the input box 402 is used for the user to input the package name (Package Name) of the APP created by the user, for example, "com.ex.example.mediaplayerapp".
  • the package name is the unique identifier of the APP and is mainly used for the system to identify the APP.
  • the input box 403 is used for the user to select or input the storage location of the created APP.
  • the user can directly input the storage location of the created APP in the input box 403, for example, "D: ⁇ test ⁇ mediaplayerapp".
  • the user can also select the storage location of the created APP through the control 404 in the input box 403.
  • the input box 405 is used for the user to select or directly input the minimum API version supported by the created APP.
  • the user can directly input the minimum API version supported by the created APP in the input box 405, for example, "xxsdk:1.0.0".
  • the user can also select the minimum API version supported by the created APP through the control 406.
  • the control 407 is used to guide the user how to operate in the user interface 400.
  • the control 408 is used to cancel the item created by the user.
  • the control 409 is used to return to the previous step of operation.
  • the control 410 is used to save items created by the user and refresh the user interface 400. After the user fills in the content in the input box 401, the input box 402, the input box 403, and the input box 405, the user clicks the control 410.
  • in response to the user's operation of clicking the control 410, the user interface 50A as shown in FIG. 5A is displayed.
  • the user interface 50A may include a display area 501 and a display area 502.
  • the display area 501 is used to display the component toolbox 101.
  • the display area 502 can display the graphics of the component.
  • the display area 502 may be the interface of the component orchestration designer 102.
  • the interface of the component orchestration designer 102 can be used to display component graphics.
  • the component layout designer 102 can create the disassembled video component, and draw the main body of the disassembled video component (for example, disassembled video 503 in user interface 50A) and its connection points (for example, input connection point 504, output connection point 505, and output connection point 506 in user interface 50A) in the display area 502.
  • the user can view the data entity types supported by the component connection point.
  • the electronic device can detect the user's operation of viewing the connection point, and display the data entity type supported by the component connection for the user. There can be many kinds of operations, such as double-clicking the connection point, hovering the mouse cursor over the connection point for 2 seconds, or right-clicking the connection point with the mouse, etc., which are not limited here.
  • the user can right-click the output connection point 506, and the electronic device detects the user operation, and displays the user interface 50C in FIG. 5C.
  • the user interface 50C can display the viewing control 507 and the smart layout control 508.
  • the view control 507 is used to display the data entity types supported by the output connection point 506.
  • the smart layout control 508 can be used for smart layout, displaying components that can be connected to the output connection point 506. Exemplarily, the user can click on the view control 507.
  • in response, the user interface 50D as shown in FIG. 5D is displayed, and the data entity attribute table 509 supported by the output connection point 506 may be displayed in the user interface 50D.
  • the data entity attribute table 509 can be used to display that the data type of the data entity supported by the output connection point 506 is video, the subtype of the video can be MPEG and MPEG4, and the format of the video can be 640*480, 1920*1080, and so on.
  • the user can also click the component body decomposition video 503 of the decomposition video component in the user interface 50D to view the data entity types supported by all the connection points of the decomposition video component.
  • the operation for viewing the data entity types supported by all the connection points of the disassembled video component may be double-clicking the connection point, hovering the mouse cursor over the connection point for 2 seconds, or right-clicking the connection point, etc. There is no restriction here.
  • in response to the user's operation of clicking the component main body (disassembled video 503) of the disassembled video component, the electronic device may display a user interface 50E as shown in FIG. 5E.
  • the user interface 50E may display the attribute table 510 of the disassembled video component.
  • the attribute table 510 is used to display the attributes of all connection points of the disassembled video component.
  • the connection point 1 in the attribute table 510 may be the input connection point 504, the connection point 2 may be the output connection point 505, and the connection point 3 may be the output connection point 506.
  • Connection point 1 can contain two types of data entities, namely data entity 0 and data entity 1.
  • the control 511 is used to hide the data entities of the connection point. That is, when the user clicks the control 511, the data entity 0 and data entity 1 displayed in the attribute table 510 are hidden.
  • the control 512 is used to hide the attributes of the data entity 0, such as category, subcategory, width, height, bitrate, and so on.
  • the control 513 is used to expand and display the attributes of the data entity 1.
  • the control 514 is used to expand and display the data entities supported by the connection point 2.
  • the control 515 is used to expand and display the data entities supported by the connection point 3.
  • the user can select one data entity and set it as a data entity that the connection point can support.
  • the output connection point 506 supports multiple types of data entities.
  • the data entities supported by the output connection point 506 may be video data with a subtype of MPEG and a format of 640*480, video data with a subtype of MPEG and a format of 1920*1080, video data with a subtype of MPEG4 and a format of 640*480, and video data with a subtype of MPEG4 and a format of 1920*1080.
  • the user can set MPEG4 in the data entity attribute table 509 as a subtype that the output connection point 506 can support.
  • the electronic device can detect the user's operation of setting the attributes of the data entity supported by the connection point. There may be multiple such operations; for example, the user can double-click the subtype MPEG4 in the data entity attribute table 509. The operation of setting the attributes of the data entity supported by the connection point is not limited here.
  • the electronic device may display a prompt box 516.
  • the prompt box 516 is used to prompt the user that the data entity has been set.
  • the prompt content of the prompt box 516 may be the text content shown in FIG. 5E "You have set the subtype of the connection point to MPEG4", and the specific prompt content of the prompt box 516 is not limited here.
  • the user can set the data entity supported by the connection point in the attribute table 510 shown in FIG. 5E.
  • the user can double-click the data entity 0 in the attribute table 510, and in response to a user operation, the component layout designer 102 sets the data entity 0 as the only data entity supported by the connection point 1.
  • the user can sequentially set the subtype and format of the data entity (for example, video) supported by the output connection point 506. For example, the user can set the subtype of the video to MPEG4 and the format to 1920*1080.
  • the user can click the view control 507 again to view the attributes of the data entities supported by the output connection point 506.
  • the user interface 50G displays the updated data entity attribute table 511.
  • the subtype and format in the data entity attribute table 511 are all set by the user, that is, the subtype is MPEG4, and the format is set to 1920*1080.
  • the user can add other components in the component toolbox 101 to the component layout designer 102 according to the above method.
  • the user can drag the play video component in the component toolbox 101 to the component layout designer 102.
  • the electronic device can detect the operation of the user dragging the play video component, and in response to this operation, the component layout designer 102 can create the video playback component, draw the component body of the video playback component (for example, play video 601 shown in the user interface 60A) and its connection point (for example, the input connection point 602 shown in the user interface 60A), and display them in the interface of the component layout designer 102 (for example, the user interface 60A).
  • the user can hide the component toolbox, and connect the two components in the interface of the component layout designer 102, as shown in the user interface 60A in FIG. 6A.
  • the electronic device can detect the user's operation of connecting the disassembled video component and the play video component. There are many operations for the user to connect the two components. For example, the user can drag the output connection point 506 to the input connection point 602, or drag the input connection point 602 to the output connection point 506. This is not limited here.
  • the component layout designer 102 obtains the data entity types supported by the output connection point 506 and the input connection point 602.
  • the component layout designer 102 determines whether the data entity type supported by the output connection point 506 and the data entity type supported by the input connection point 602 match.
  • the data entity type supported by the output connection point 506 matching the data entity type supported by the input connection point 602 means that the data entity type supported by the output connection point 506 is the same as the data entity type supported by the input connection point 602, or that at least one of the multiple data entity types supported by the output connection point 506 is the same as a data entity type supported by the input connection point 602. If they match, the interface of the component layout designer 102 may display a successful connection identifier. If they do not match, the interface of the component layout designer 102 may prompt the user that they do not match (not shown in the figure).
  • if the connection points of the two components do not match, prompt text such as "Unable to connect", "Matching failed", "Connection failed", or "Data entity types are inconsistent" may be displayed in the user interface, which is not limited here.
  • when the two connection points support the same data entity type, the orchestrator can connect the input connection point and the output connection point.
  • when a connection point supports multiple entity types, the user can first set the data entity type supported by the output connection point 506 and the data entity type supported by the input connection point 602 to the same type, and then connect the output connection point 506 and the input connection point 602. At this time, since the data entity types supported by the output connection point 506 and the input connection point 602 are the same, the component layout designer 102 can connect the output connection point 506 and the input connection point 602.
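The matching rule described above, in which two connection points can be connected when at least one supported data entity type is the same on both sides, can be sketched as a set intersection. The data model below is an assumption for illustration, with each entity type written as a (type, subtype, format) tuple.

```python
def can_connect(output_point: set, input_point: set) -> bool:
    """Return True if the two connection points share at least one data entity type."""
    return bool(output_point & input_point)

# Entity types supported by the output connection point 506 and the input
# connection point 602 of the video example (illustrative values).
out_506 = {("video", "MPEG", "640*480"), ("video", "MPEG4", "1920*1080")}
in_602  = {("video", "MPEG4", "1920*1080")}

# One common type exists, so the connection point verification succeeds.
matched = can_connect(out_506, in_602)
```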
  • FIG. 6A shows a user interface 60A provided by an embodiment of the present application.
  • the output connection point 506 of the disassembled video component in the user interface 60A and the input connection point 602 of the playing video are successfully connected.
  • the user interface after the connection of the two components is completed may also be as shown in FIG. 6B.
  • the output connection point 506 of the disassembled video component in the user interface 60B shown in FIG. 6B and the input connection point 602 for playing the video are successfully connected.
  • the output connection point 506 and the input connection point 602 are folded or overlapped together, which means that the output connection point 506 and the input connection point 602 are successfully connected.
  • FIG. 7A shows a user interface 700 provided by this application.
  • the user interface 700 displays all the components that make up the APP created by the user (that is, the video playback APP).
  • All components of the video playback APP may include a read video file component 702, a disassembled video component 705, a voice translation component 710, a high-definition enhancement component 711, an audio playback component 716, and a video playback component 717.
  • the component for reading a video file may include an input connection point 701 and an output connection point 703.
  • the disassembled video component 705 may include an input connection point 704, an output connection point 706, and an output connection point 707.
  • the speech translation component 710 may include an input connection point 709 and an output connection point 712.
  • the HD enhancement component 711 may include an input connection point 708 and an output connection point 713.
  • the audio playback component 716 may include an input connection point 714.
  • the video playback component 717 may include an input connection point 715.
  • the output connection point 703 and the input connection point 704 are connected.
  • the output connection point 707 is connected to the input connection point 709 or the input connection point 714.
  • the output connection point 706 may be connected to the input connection point 708 or the input connection point 715.
  • the output connection point 712 may be connected to the input connection point 714.
  • the output connection point 713 may be connected to the input connection point 715.
  • the user can set attributes for the connection lines between the connection points.
  • the attributes of the connecting line can include logic control attributes (conditional judgment/branch/loop) or data transfer attributes (entity input and output).
  • as shown in FIG. 7A, for the output connection point 707 of the disassembled video component 705, the language attribute can be judged. If it is consistent with the local language, the speech translation component 710 can be skipped and the connection can be made directly to the input connection point 714 of the sound playback component. That is, the user can set a judgment condition for the connection point 707, and the condition judges whether the language supported by the connection point 707 is consistent with the local language of the electronic device.
  • if so, the output of the video decomposition component is input to the input connection point 714 of the audio playback component through the output connection point 707. If not, the output of the video decomposition component is input to the input connection point 709 of the language translation component 710 through the output connection point 707.
  • the user can click the connection point on the left end of the connection line to set the connection line attributes.
  • the user can click the connection point 707 to set the control attributes of the connection point 707 to the connection point 714 and the connection line between the connection point 707 and the connection point 709.
  • the interface for setting the control attribute of the connecting line may be as shown in the user interface 70B in FIG. 7B.
  • the user interface 70B may include a condition setting box 720, a condition setting box 722, a condition setting box 724, a condition judgment result selection box 725, a control 721, a control 723, a control 726, a control 727, a control 728, and a control 729.
  • the user interface 70B may further include a condition setting box 730, a condition setting box 732, a condition setting box 734, a condition judgment result selection box 735, a control 731, a control 733, a control 736, and a control 737.
  • the condition setting box 720 is used to set the judgment subject, such as language.
  • the control 721 is used to select the judgment condition.
  • the condition setting box 722 is used to set the relationship between the judgment subject and the comparison subject.
  • the judgment subject may include the comparison subject.
  • the control 723 is used to select the relationship between the judgment subject and the comparison subject.
  • the relationship may be inclusive or not inclusive, which is not limited here.
  • the condition setting box 724 is used to set the comparison subject, such as a local language.
  • the condition setting box 720, the condition setting box 722, and the condition setting box 724 jointly complete the judgment condition setting.
  • the judgment condition 1 shown in the user interface 70B is "the language of the connection point 707 includes the local language".
  • the condition judgment result selection box 725 is used to set the condition judgment result. For example, if the condition “the language of the connection point 707 includes the local language” is satisfied, the data entity of the connection point 707 is transmitted to the connection point 714 through the connection line between the connection point 707 and the connection point 714.
  • the control 726 is used to select the condition judgment result.
  • the control 727 is used to add conditions.
  • the control 728 is used to complete the condition setting.
  • the control 729 is used to cancel the condition setting.
  • the control 737 is used to return to the previous interface.
  • the condition setting box 730 is used to set the judgment subject, such as language.
  • the control 731 is used to select the judgment condition.
  • the condition setting box 732 is used to set the relationship between the judgment subject and the comparison subject.
  • the control 733 is used to select the relationship between the judgment subject and the comparison subject.
  • the condition setting box 734 is used to set the comparison subject, such as a local language.
  • the condition setting box 730, the condition setting box 732, and the condition setting box 734 jointly complete the judgment condition setting.
  • the judgment condition 2 shown in the user interface 70B is "the language of the connection point 707 does not include the local language”.
  • the condition judgment result selection box 735 is used to set the condition judgment result. For example, if the condition “the language of the connection point 707 does not include a local language” is satisfied, the data entity of the connection point 707 is transmitted to the connection point 709 through the connection line between the connection point 707 and the connection point 709.
  • the control 736 is used to select the condition judgment result.
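The two judgment conditions set in the user interface 70B amount to routing the data entity of the connection point 707 by language. A minimal sketch follows; the function name, field names, and connection point labels are hypothetical.

```python
LOCAL_LANGUAGE = "Chinese"  # assumed local language of the electronic device

def route_from_707(data_entity: dict) -> str:
    """Pick the destination connection point for the data entity of point 707."""
    # Judgment condition 1: "the language of the connection point 707 includes
    # the local language" -> transmit directly to the audio playback component.
    if LOCAL_LANGUAGE in data_entity["languages"]:
        return "connection_point_714"
    # Judgment condition 2: otherwise transmit to the speech translation
    # component first.
    return "connection_point_709"
```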
  • the electronic device can detect the user's operation of compiling the connected components into executable code. For example, the user clicks the compile control 718 of the user interface 700, which is not limited here. In response to this operation, the code generation engine 103 compiles the connected components into the executable program code of the APP created by the user.
  • the code generation engine 103 may obtain the IDs and names of the multiple components that have been orchestrated in the component orchestration designer 102, as well as connection points and connection attributes between the connection points.
  • for the output connection point 506 of the decomposition component and the input connection point 602 of the playback video component shown in FIG. 6A, the entity type of the data transmitted between the two connection points can be video data with a file type of MPEG4 and a format of 1920*1080. Data is directly transferred between these two connection points, without judgment or loop logic.
  • the code generation engine 103 generates executable code according to the obtained component information, the connection attributes between the connection points and the corresponding component call template.
  • the component calling template is a code template preset according to different types of components and the attributes of the connection point, which encapsulates the general interface calling code logic. It is understandable that the calling templates of the video component and the audio component can be different. Data may be transmitted directly between two connection points, or a judgment condition may be required so that the data is transmitted only when the condition is satisfied; the component call templates corresponding to these different data transmission methods can also be different. For details, refer to the introduction to the component calling template above. The specific code in the component calling template is not limited here.
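A sketch of how the code generation engine 103 might select between call templates for the two data transmission methods mentioned above. The template strings and the function name are illustrative assumptions, not the platform's actual templates.

```python
# Illustrative placeholder templates for the two transmission methods.
DIRECT_TEMPLATE = "src.<OutPoint>.send(dst.<InPoint>)"
CONDITIONAL_TEMPLATE = "if <Condition>: src.<OutPoint>.send(dst.<InPoint>)"

def pick_template(connection: dict) -> str:
    """Choose a component call template based on the connection's transmission method."""
    if connection.get("condition") is None:
        return DIRECT_TEMPLATE       # data flows unconditionally
    return CONDITIONAL_TEMPLATE      # data flows only when the judgment condition holds
```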
  • the electronic device receives a user operation for displaying the component call tree of the first component, and in response to the user operation, the component orchestration designer 102 starts the intelligent orchestration function. Based on the intelligent orchestration function, the interface of the component orchestration designer 102 can display the component call tree of the first component. The component call tree is used to display all the second and/or third components that match the first component, the fourth and/or fifth components that match the second component, and so on, up to the Nth component that matches the Mth component, where the Nth component is a component with no output connection point, and M and N are positive integers.
  • the component layout designer 102 finds a matching second component and/or third component according to the data entity type supported by the output connection point of the first component.
  • the data entity type supported by the input connection point of the second component and the third component matches the data entity type supported by the output connection point of the first component.
  • the component layout designer 102 searches for the fourth component and/or the fifth component that matches the second component according to the data entity type supported by the output connection point of the second component in turn.
  • the data entity type supported by the input connection point of the fourth component and the fifth component matches the data entity type supported by the output connection point of the second component.
  • the component orchestration designer 102 finds components that match each component in the component call tree until the last-level components of the component call tree have no output connection points.
  • the components in the component call tree can implement one or more functions (for example, video decoding function, video playback function, etc.), which is not limited here.
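The call tree construction described above can be sketched as a recursive search over the component toolbox: for each output connection point, find the components whose input connection point supports a common data entity type, and stop at components that have no output connection point. The data structures and names below are assumptions for illustration.

```python
def build_call_tree(component, toolbox, seen=()):
    """Return a nested dict {component_name: subtree} of matching components."""
    tree = {}
    seen = set(seen) | {component["name"]}   # avoid revisiting components (no cycles)
    for out_types in component["outputs"]:   # each output connection point's entity types
        for candidate in toolbox:
            if candidate["name"] in seen:
                continue
            # Match: the candidate's input connection point supports at least
            # one data entity type that this output connection point supports.
            if out_types & candidate["input"]:
                tree[candidate["name"]] = build_call_tree(candidate, toolbox, seen)
    return tree

video = ("video", "MPEG4", "1920*1080")
toolbox = [
    {"name": "hd_enhance", "input": {video}, "outputs": [{video}]},
    {"name": "play_video", "input": {video}, "outputs": []},  # leaf: no output point
]
root = {"name": "disassemble_video", "input": set(), "outputs": [{video}]}
tree = build_call_tree(root, toolbox)
```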
  • the electronic device displays the user interface 80A as shown in FIG. 8A, and the component call tree of component 1 is displayed in the user interface 80A.
  • Component 1 can be connected to Component 2 and Component 3. That is, component 1 and component 2 match, and component 1 and component 3 match.
  • the component 2 can be connected to the component 4 and the component 5. That is, component 2 and component 4 are matched, and component 2 and component 5 are matched.
  • the component 3 can be connected to the component m, that is, the component 3 matches the component m.
  • Component 4, component 5, and component m can be connected to component n. That is, the component 4 matches the component n, the component 5 matches the component n, and the component m matches the component n.
  • the user can select some components in the component call tree, for example, component 2, component 4, and component n, and then remove or delete the other unnecessary components. For example, the user can right-click or double-click a component to delete it; the method of removal or deletion is not limited here.
  • Table 2 exemplarily shows the data entity types supported by the connection points of component 1, component 2, component 3, and component m.
  • "MPEG4, 1920*1080, Chinese" in Table 2 indicates that the data entity supported by the connection point is video data in the MPEG4 format, the video data resolution is 1920*1080, and the language supported in the video data is Chinese.
  • "MP3, 112800, Chinese" means that the data entity supported by the connection point is audio data in the MP3 format, the audio data sampling rate is 112800 Hz, and the supported language is Chinese. It can be seen from Table 2 that the data entity types supported by the input connection point of component 2 include the data entity types supported by the output connection point of component 1.
  • the output connection point of component 1 can be connected to the input connection point of component 2, and component 1 and component 2 are matched.
  • the data entity types supported by the input connection point of component 3 include the data entity types supported by the output connection point of component 1. Therefore, the output connection point of the component 1 can be connected to the input connection point of the component 3, and the component 1 and the component 3 are matched.
  • the data entity type supported by the input connection point of the component 4 includes the data entity type supported by the output connection point 2 of the component 2. Therefore, the input connection point of the component 4 can be connected to the output connection point 2 of the component 2, and the component 2 and the component 4 are matched.
  • the data entity type supported by the input connection point of the component 5 includes the data entity type supported by the output connection point 2 of the component 2.
  • therefore, the input connection point of the component 5 can be connected to the output connection point 2 of the component 2, and the component 5 and the component 2 are matched.
  • the data entity type supported by the input connection point of the component m includes the data entity type supported by the output connection point of the component 3. Therefore, the output connection point of the component 3 can be connected to the input connection point of the component m, and the component 3 and the component m are matched.
  • the data entity types supported by the input connection point of component n include the data entity types supported by the output connection point of component 4, the output connection point of component 5, and the output connection point of component m. Therefore, the input connection point of component n can be connected to the output connection point of component 4, the output connection point of component 5, and the output connection point of component m.
  • the component n matches the component 4, the component 5, and the component m.
  • the data entity types supported by the connection points of the various components shown in Table 2 are only examples.
  • the connection points of the various components shown in FIG. 8A can support any of the data entity types shown in Table 1.
  • the embodiment of the present application does not limit the data entity type that can be supported by the connection point of each component in FIG. 8A.
  • the user interface 80B may display an exploded video component, which may include a main body (exploded video 503) and an input connection point 504, an output connection point 505, and an output connection point 506.
  • suppose the user does not know which components in the component toolbox 101 can be connected to the output connection point 506 of the disassembled video component.
  • the user can click on the smart layout control 508 in the user interface 80B.
  • the electronic device displays a user interface 80C as shown in FIG. 8C.
  • the user interface 80C displays the component call tree of the disassembled video component.
  • the component call tree of the disassembled video component may include a video playback component, and the video playback component includes a main body (play video 717) and an input connection point 715.
  • the component layout designer 102 can search the component toolbox 101, according to the data entity types of the output connection point 506, for components (such as the video playback component) that match the output connection point, and display them in the interface of the component layout designer 102 (for example, the user interface 80C).
  • the electronic device may also display a user interface 80D as shown in FIG. 8D.
  • the user interface 80D displays the component call tree of the disassembled video component.
  • the component call tree of the disassembled video component may include a high-definition enhancement component and a video playback component.
  • the HD enhancement component includes a component body (HD enhancement 711), an input connection point 708 and an output connection point 713.
  • the play video component includes a component main body (play video 717) and an input connection point 715.
  • the output connection point 506 may be connected to the input connection point 708.
  • the output connection point 713 may be connected to the input connection point 715.
  • the user can select the input connection point 504 of the disassembled video component for intelligent arrangement, for example, by right-clicking the input connection point 504 and selecting the smart layout control.
  • the component layout designer 102 can display components that can be connected to the input connection point 504. For example, the read video component 702 shown in FIG. 7A.
  • FIG. 9 is a schematic flowchart of an APP development method provided by an embodiment of the application.
  • an APP development method provided by an embodiment of the present application specifically includes:
  • the electronic device creates the first APP in response to the user's operation of creating the first APP.
  • the electronic device can detect the user's operation of creating the first APP.
  • the user opens a user interface for creating an APP provided in an electronic device, and then creates an APP in the user interface.
  • For the user interface for creating an APP, refer to the user interface 400 shown in FIG. 4.
  • For how the user creates the APP in the user interface, please refer to the description of FIG. 4 above, which will not be repeated here.
  • In response to the user's operation of selecting a component from the component toolbox of the electronic device, the electronic device displays the component in the component layout designer.
  • the component is an independent module that implements a specific function; it is composed of a component body and one or more connection points, and each connection point supports one or more data entity types.
  • the user can select a component from the component toolbox through various operations; for example, the user drags the component in the component toolbox, or the user double-clicks the component in the component toolbox. The user operation is not limited here.
  • the connection point of the component supports one or more data entity types.
  • For the data entity types, please refer to the introduction to Table 1 above, which will not be repeated here.
  • when developers are developing components, they can use program code to define the connection points of the components, for example, to define the data entity types supported by each connection point and the functions contained in the connection point.
  • developers can define the connection points of all components to support the same multiple functions, for example, the create function used to create data entities, the connect function used to connect to other connection points, the accept function used to receive data, the pull function used to store the received data locally, the push function used to send data, and so on.
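The common connection point functions named above can be sketched as a small class. This is an illustrative sketch only; the class name, fields, and the dictionary shape of a data entity are assumptions, not the platform's actual API.

```python
# Hypothetical sketch of a connection point exposing the functions
# described above: create, connect, accept, pull, and push.

class ConnectPoint:
    def __init__(self, supported_types):
        self.supported_types = set(supported_types)
        self.peer = None      # the connection point on the other side
        self.buffer = []      # locally stored received data entities

    def create(self, payload, entity_type):
        """Create a data entity tagged with its data entity type."""
        return {"type": entity_type, "payload": payload}

    def connect(self, other):
        """Connect to another connection point if the types overlap."""
        if self.supported_types & other.supported_types:
            self.peer, other.peer = other, self
            return True
        return False

    def push(self, entity):
        """Send a data entity to the connected peer."""
        if self.peer is not None:
            self.peer.accept(entity)

    def accept(self, entity):
        """Receive a data entity from the peer."""
        self.pull(entity)

    def pull(self, entity):
        """Store the received data locally."""
        self.buffer.append(entity)
```

For instance, two points that both support "video" can be connected, after which pushing an entity on one side lands in the other side's local buffer.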
  • the electronic device displaying graphics of multiple components specifically includes: in response to the user selecting a component from the component toolbox of the electronic device, the electronic device obtains the first file of the multiple components that make up the first APP, parses the first file, draws the graphics of the components, and displays the graphics of the multiple components; the first file is the program code describing the function and properties of the component.
  • the user's operation of selecting components from the component toolbox of the electronic device is used to select multiple components.
  • the graphics of the component are used to display the main body and connection points of the component.
  • the component includes a main body and one or more connection points, as shown in Figure 5A.
  • the user's operation of selecting a component from the component toolbox of the electronic device may be dragging the component in the component toolbox 101 to the component layout designer 102.
  • the user's operation of selecting a component from the component toolbox of the electronic device may also be clicking or double-clicking the selected component, etc.
  • the embodiment of the present application does not limit the second user operation.
  • the electronic device may receive a user operation of a user dragging a component from the toolbox.
  • the electronic device obtains the file of the component and parses the file of the component.
  • the electronic device draws the graphics of the component.
  • the electronic device can obtain the file of the component through the component layout designer 102 in the electronic device, parse the file of the component, and draw the graphic of the component.
  • For example, consider a video playback APP that can support multiple languages.
  • this APP can be composed of a read video file component, a disassembled video component, a voice translation component, a high-definition enhancement component, an audio playback component, a video playback component, etc., as shown in FIG. 7A.
  • after the user selects a component that makes up the video APP from the component toolbox 101, the electronic device obtains the first file of the component.
  • the electronic device can parse from the first file the number of connection points contained in the component, the attributes of the connection points, the attributes of the data entities that the connection points can transmit, and so on.
  • the electronic device draws the component into a visual component graph according to the number of connection points contained in the component parsed from the first file.
  • the user can see the component graphics in the user interface.
  • for example, the component graphic of the disassembled video component is shown in FIG. 5A.
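The parse-then-draw step above can be sketched as follows. The JSON layout of the first file is an assumption made for illustration; the source only states that the first file is program code describing the component's functions and properties.

```python
import json

# Illustrative sketch of parsing a component's first file to recover the
# number of connection points and their attributes, which the designer
# then uses to draw the component graphic (a main body plus one marker
# per connection point). Field names and types are hypothetical.

first_file = json.dumps({
    "name": "disassembled video",
    "connect_points": [
        {"id": 504, "direction": "input", "types": ["video/MPEG4"]},
        {"id": 505, "direction": "output", "types": ["audio/MP3"]},
        {"id": 506, "direction": "output", "types": ["video/MPEG4"]},
    ],
})

component = json.loads(first_file)
print(component["name"], len(component["connect_points"]))
```

Here the designer would draw one main body and three connection point markers, matching the disassembled video component of FIG. 5A.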
  • the electronic device receives a user operation for displaying the component call tree of the first component, and in response to the user operation, the electronic device displays the component call tree of the first component; the component call tree is used to display all second and/or third components matching the first component, the fourth and/or fifth components matching the second component, and so on, up to the Nth component matching the Mth component, where the Nth component is a component without output connection points.
  • the electronic device finds a matching second component and/or third component according to the data entity type supported by the output connection point of the first component. That is, the data entity type supported by the input connection point of the second component and the third component matches the data entity type supported by the output connection point of the first component.
  • the electronic device searches for a fourth component and/or a fifth component that matches the second component according to the data entity type supported by the output connection point of the second component in turn.
  • the data entity type supported by the input connection point of the fourth component and the fifth component matches the data entity type supported by the output connection point of the second component.
  • the electronic device finds components that match each component in the component call tree until the components at the last level of the component call tree have no output connection points. For details, reference may be made to the description of the call tree of component 1 in FIG. 8A, which will not be repeated here.
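The recursive search just described can be sketched as follows. The component dictionaries and type sets are invented for illustration, assuming acyclic matches; the real platform presumably also guards against cyclic component graphs.

```python
# Illustrative sketch of building a component call tree: starting from a
# first component, recursively attach components whose input connection
# points accept each output connection point's data entity types,
# stopping at components without output connection points.

def build_call_tree(component, toolbox):
    tree = {"name": component["name"], "children": []}
    for out_types in component["outputs"]:
        for candidate in toolbox:
            if any(out_types <= in_types for in_types in candidate["inputs"]):
                tree["children"].append(build_call_tree(candidate, toolbox))
    return tree

# Hypothetical toolbox: an HD enhancement component and a play video
# component (a leaf, since it has no output connection points).
toolbox = [
    {"name": "HD enhance", "inputs": [{"video"}], "outputs": [{"hd-video"}]},
    {"name": "play video", "inputs": [{"video", "hd-video"}], "outputs": []},
]
root = {"name": "disassemble video", "inputs": [], "outputs": [{"video"}]}
tree = build_call_tree(root, toolbox)
print([child["name"] for child in tree["children"]])
```

This mirrors FIG. 8C/8D: the disassembled video component's output can feed either the HD enhancement component (which in turn feeds play video) or the play video component directly.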
  • the user can click the smart layout control 508.
  • the embodiment of the present application does not limit the user operation of displaying the component call tree.
  • the user can select the output connection point of the component 1, and then right-click to select smart layout.
  • the electronic device displays the component call tree of component 1 in response to the user's operation of clicking the smart arrangement control. In this way, the user can quickly select the components needed to develop the APP from the component call tree. Therefore, the user's time can be saved and the user experience can be improved.
  • the electronic device displays the component call tree of the first component in the component orchestration designer, which specifically includes: the electronic device displays the component call tree of the first component in the component layout designer according to the function of the first component and/or the data entity types supported by the connection points of the first component.
  • in response to the user's operation of deleting a component, the electronic device deletes the component from the component call tree.
  • in response to an operation of the user uploading a component or downloading a component from the component market, the electronic device displays the name of the component uploaded by the user or downloaded from the component market in the component toolbox.
  • in response to the user's operation of viewing the attributes of the first connection point of the first component, the electronic device displays the data entity types supported by the first connection point in the component layout designer.
  • In response to the user's operation of connecting multiple components, the electronic device connects two or more components in the component layout designer.
  • the electronic device can receive the user's operation of connecting multiple components, and the user's operation of connecting multiple components can have many kinds.
  • the operation of the user to connect multiple components may be that the user drags the output connection point of the second component to the input connection point of the first component.
  • the operation of connecting multiple components by the user may also be that the user slides the second component in the direction of the first component.
  • the user's operation of connecting multiple components may also be the user inputting the output connection point of the first component and the input connection point of the second component.
  • the embodiment of the present application does not limit the third user operation. Here, reference may be made to the above description of FIG. 6A, which will not be repeated here.
  • the electronic device connects two or more components in the component orchestration designer, which specifically includes: in response to the user's operation of connecting the first component and the second component, the electronic device verifies, through the component orchestration designer, whether the first component and the second component match; if the first component and the second component match, the electronic device connects the first connection point and the second connection point, where the first connection point is a connection point of the first component, and the second connection point is a connection point of the second component.
  • in response to the user's operation of connecting multiple components, the electronic device performs a process of establishing a connection between the first component and the second component. First, the electronic device obtains the data entity types supported by the output connection point of the first component and the data entity types supported by the input connection point of the second component. When the electronic device determines that the data entity types output by the output connection point of the first component match the data entity types received by the input connection point of the second component, the electronic device establishes a connection between the first component and the second component.
  • only when the data entity types supported by the output connection point of the first component and the input connection point of the second component are the same, or the data entity types supported by the output connection point of the first component include the data entity types supported by the input connection point of the second component, or the data entity types supported by the input connection point of the second component include the data entity types supported by the output connection point of the first component, can the first component establish a connection with the second component.
  • For details, refer to the description of FIG. 6A above, which will not be repeated here.
  • the output connection point of the first component supports video data with a data entity type of MPEG format and a size of 640*480.
  • the input connection point of the second component supports video data whose data entity type is MPEG format and whose size is 640*480, 1080P, and 2K.
  • the data type supported by the output connection point of the first component matches the data type supported by the input connection point of the second component.
  • the electronic device establishes a connection between the first component and the second component.
  • the types of data entities here can refer to Table 1.
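The verification rule and the MPEG example above can be sketched together. The function and variable names are illustrative; the type tuples stand in for the data entity attributes of Table 1.

```python
# Sketch of the verification rule stated above: two connection points
# match when their supported data entity type sets are the same, or one
# set includes the other (subset-or-equal in either direction).

def types_match(out_types: set, in_types: set) -> bool:
    return out_types <= in_types or in_types <= out_types

# The MPEG example: the output point supports 640*480 MPEG video, and
# the input point supports 640*480, 1080P, and 2K MPEG video.
out_point = {("video", "MPEG", "640*480")}
in_point = {("video", "MPEG", "640*480"),
            ("video", "MPEG", "1080P"),
            ("video", "MPEG", "2K")}
print(types_match(out_point, in_point))  # True: connection established
```

If the check returned False, the device would instead show the connection failure prompt box described below.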
  • connection success indicator may be a connection line (Connection) between two components.
  • the electronic device draws a connection line, which is used to connect the disassembled video component output connection point (Video Output ConnectPoint) and the playback video input connection point (Input ConnectPoint2).
  • the developer selects the components from the component toolbox 101 one by one and drags them to the component layout designer 102 according to the logic flow of the APP.
  • the electronic device sequentially establishes connections to all components constituting the APP in response to user operations.
  • connection success indicator may be that the two connection points of the two components are folded or overlapped. Reference may be made to the description of FIG. 6B above, and details are not repeated here.
  • the electronic device displays a prompt box, which is used to prompt the user that the connection between the first component and the second component failed.
  • the prompt content of the prompt box can have many kinds. Exemplarily, the content of the prompt box may be "connection failed", "the type of data entity does not match", or "the first component and the second component cannot be connected", etc.
  • the specific content of the prompt box is not limited here.
  • the electronic device compiles the multiple connected components into the program code of the first APP.
  • the user can click the control used to generate the program code from the connected components.
  • the electronic device can detect that the user clicks on the control. In response to this operation, the electronic device compiles the connected components into the program code of the first APP.
  • the program code of the first APP is used to describe the logic function and user interface of the first APP.
  • the electronic device may generate the program code of the first APP from the components connected in the assembly orchestration designer 102 through the code generation engine 103.
  • the electronic device saves the orchestration model diagram of the two or more components that have been connected in the component layout designer, and first information in the orchestration model diagram; the first information includes one or more of the IDs and names of the two or more components, and the data entity types supported by the connection points of the two or more components.
  • the electronic device generates the executable source code of the first APP in the code generation engine according to the layout model diagram, the first information, and the component calling template.
  • the component calling template includes program code in a preset format. For details, please refer to the description of the component calling template above, which will not be repeated here.
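The code generation step can be sketched as filling a calling template from the saved orchestration model. The template string and field names below are invented for illustration; the patent does not show the real template format.

```python
# Hypothetical sketch of the code generation engine: combine the saved
# orchestration model (connections plus the first information) with a
# component calling template, i.e. program code in a preset format.

CALL_TEMPLATE = "{out_comp}.{out_point}.connect({in_comp}.{in_point})"

def generate_source(connections):
    # emit one call per connection recorded in the orchestration model
    return "\n".join(CALL_TEMPLATE.format(**conn) for conn in connections)

# One connection from FIG. 6A: disassembled video -> play video.
model = [{"out_comp": "videosplit", "out_point": "connpoint1",
          "in_comp": "videoplayer", "in_point": "connpoint2"}]
print(generate_source(model))
```

A real engine would also emit component instantiation and lifecycle code from the first information; this sketch only shows the connection calls.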
  • the embodiment of the application provides an APP development method, by which a user can connect multiple components in the toolbox to form an APP.
  • the electronic device needs to determine whether the data entity types supported by the connection points of the two connected components match. If they match, the electronic device can display a connection success indicator. Finally, the electronic device generates the APP program code from the multiple connected components. In this way, users can quickly develop APPs from existing components and shorten the time needed to develop an APP.
  • Figure 10 shows the application scenario of the APP developed by this application.
  • an APP composed of communication components, storage components, playback components, camera components, gesture input components, audio playback components, and media download acceleration components is used as an example to illustrate.
  • User A has mobile phones, TVs, PCs, routers, speakers, watches, car machines and other electronic equipment. All electronic devices of the user are connected to the same wireless network (for example, a Wi-Fi network in the home).
  • the user's mobile phone is installed with an APP that includes a communication component, a storage component, a playback component, a camera component, a gesture input component, an audio playback component, and a media download acceleration component.
  • a storage component is installed in the PC.
  • the media download acceleration component is installed in the router.
  • An audio playback component is installed in the speaker.
  • a gesture input component is installed in the watch.
  • a camera component is installed in the car machine.
  • the TV has a playback component installed.
  • the mobile phone can perform the function of the communication component, select the TV to execute the function of the playback component to play the video in the APP, select the PC to run the function of the storage component to store the APP data, select the router to execute the media download acceleration component to accelerate the download of the media files in the APP, select the speaker to execute the audio playback function to play the audio in the APP, select the watch to execute the gesture input component to input gestures to control the APP, and select the car machine to execute the function of the camera component to shoot the image or video required by the APP.
  • each electronic device can take advantage of its own strengths (for example, the display screen of a TV is larger than the display screen of a mobile phone, mobile phone communication is more convenient, etc.), so that the user has a better experience when using the APP.
  • FIG. 11 shows a schematic diagram of a component development ecology provided by an embodiment of the present application.
  • component developers can query and call components from the component market.
  • Component developers can also summarize and refine components from existing applications.
  • the component developer can upload the developed component or the component extracted from the application to the component market. In this way, a component development ecology is formed.
  • Component developers can use the components in the component market to develop apps extremely conveniently.
  • the above component toolbox 101 can download components from the component market. In this way, the components in the expansion component toolbox 101 can be updated.
  • the electronic device calls various components and connects the components according to the Domain Specific Language (DSL) input by the user.
  • Figure 12 shows a schematic diagram of the correspondence between the component DSL language and the component.
  • the DSL language (Comp1; ConnPoint1) in the figure means that there is a component 1, which includes a connection point 1. This corresponds to component 1 and connection point 1 in the component graph.
  • DSL language (Comp2; ConnPoint2) means that there is a component 2, which includes a connection point 2. This corresponds to component 2 and connection point 2 in the component graphic.
  • the DSL language (link entity1, entity2) indicates that connection point 1 supports data entity 1, and connection point 2 supports data entity 2.
  • when connection point 1 and connection point 2 are connected, connection point 1 needs to check whether data entity 1 and data entity 2 are the same. Similarly, connection point 2 also needs to check whether data entity 1 and data entity 2 are the same. In this way, when the user is very familiar with the DSL language of the components, the user can develop the APP without following the APP development method flow shown in FIG. 9. The user can write the DSL in the electronic device to develop the APP. In this way, the user can develop the APP more efficiently.
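The declare-then-link correspondence above can be sketched as a toy parser. The grammar is a guess based only on the fragments shown in FIG. 12, not the platform's actual DSL, and the function name is invented.

```python
import re

# Toy parser for the component DSL sketched in FIG. 12:
#   "(Comp1; ConnPoint1)"      declares a component and its connection point
#   "(link entity1, entity2)"  links them, recording the equality check
# described above (both sides must refer to the same data entity).

def parse_dsl(text):
    components, link_checks = {}, []
    for raw in text.splitlines():
        line = raw.strip()
        decl = re.fullmatch(r"\((\w+);\s*(\w+)\)", line)
        link = re.fullmatch(r"\(link\s+(\w+),\s*(\w+)\)", line)
        if decl:
            components[decl.group(1)] = decl.group(2)
        elif link:
            link_checks.append(link.group(1) == link.group(2))
    return components, link_checks

dsl = "(Comp1; ConnPoint1)\n(Comp2; ConnPoint2)\n(link entity1, entity1)"
comps, checks = parse_dsl(dsl)
print(comps, checks)
```

Here both link operands name the same data entity, so the single recorded check passes, mirroring the mutual check performed by connection points 1 and 2.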
  • the user can directly use the DSL language to connect the disassembled video component and the play video component shown in FIG. 6A. That is, there is no need to perform the operations in Figures 4 to 5G.
  • the DSL language used to connect the disassembled video component and the play video component can be as follows:
  • Videosplit can represent the split video component in FIG. 6A
  • Connpoint1 corresponds to the output connection point 506 in FIG. 6A
  • Entity1 (vedio, MPEG4, 1920*1080) represents the data entity type supported by the output connection point 506.
  • Videoplayer can represent the video playback component in FIG. 6A
  • Connpoint2 corresponds to the input connection point 602 in FIG. 6A
  • Entity2 (vedio, MPEG4, 1920*1080) represents the data entity type supported by the input connection point 602.
  • the user can write the judgment condition in the DSL language.
  • the connection point 707 shown in FIG. 7A can be connected to the connection point 709.
  • the connection point 707 may also be connected to the connection point 714.
  • the user can set a judgment condition for the connection point 707, which judges whether the language supported by the connection point 707 is consistent with the local language of the electronic device. If so, the output of the disassembled video component is input to the input connection point 714 of the audio playback component through the output connection point 707. If not, the output of the disassembled video component is input to the input connection point 709 of the language translation component 710 through the output connection point 707.
  • the judgment condition for the connection point 707 can be realized by the following DSL language:
  • connect-branch1 can represent the connection line between the connection point 707 and the connection point 714.
  • connect-branch2 may represent the connection line between the connection point 707 and the connection point 709. It is understandable that before the above DSL language, users can define which two connection points connect-branch1 and connect-branch2 specifically represent.
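The branching rule described for connection point 707 can be sketched as a simple selector. The function name and language codes are illustrative assumptions; only the branch labels come from the source.

```python
# Sketch of the judgment condition on connection point 707: if the audio
# language matches the device's local language, take connect-branch1
# (straight to the audio playback input point 714); otherwise take
# connect-branch2 (through the language translation input point 709).

def choose_branch(audio_language: str, local_language: str) -> str:
    if audio_language == local_language:
        return "connect-branch1"  # connection point 707 -> 714
    return "connect-branch2"      # connection point 707 -> 709

print(choose_branch("zh-CN", "zh-CN"))  # connect-branch1
print(choose_branch("en-US", "zh-CN"))  # connect-branch2
```

In the DSL this selector would be written as the condition guarding the two connect-branch lines, rather than as a function.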
  • the DSL language is only an example, and this application does not limit the implementation of the DSL language.
  • the user may write out only the connection points of the two components that need to be connected.
  • the input connection point 504 and the output connection point 505 of the disassembled video component as shown in FIG. 6A may not be written out. In this way, the user only needs to enter a few lines of code to realize the connection between the components, saving the user's time.
  • FIG. 13 shows a schematic diagram of the structure of the electronic device 100.
  • the electronic device 100 may have more or fewer components than shown in the figure, may combine two or more components, or may have different component configurations.
  • the various components shown in the figure may be implemented in hardware, software, or a combination of hardware and software including one or more signal processing and/or application specific integrated circuits.
  • the electronic device 100 may include a processor 101, a memory 102, a transceiver 103, a display screen 104, a sensor 105, etc., among which:
  • the processor 101 can be used to obtain the data entity types supported by the connection points of the components, to determine whether the data entity types supported by the connection points of two components match, and to find, according to user operations, components that match the data entity types supported by the output connection point of a component.
  • the processor 101 may include one or more processing units.
  • the processor 101 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the controller may be the nerve center and command center of the electronic device 100.
  • the controller can generate operation control signals according to the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 101 to store instructions and data.
  • the memory in the processor 101 is a cache memory.
  • the memory can store instructions or data that have just been used or recycled by the processor 101. If the processor 101 needs to use the instruction or data again, it can be directly called from the memory. Repeated accesses are avoided, the waiting time of the processor 101 is reduced, and the efficiency of the system is improved.
  • the processor 101 may include one or more interfaces.
  • the interface can include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus, which includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 101 may include multiple sets of I2C buses.
  • the processor 101 may be respectively coupled with a touch sensor, a charger, a flash, a camera 193, etc. through different I2C bus interfaces.
  • the processor 101 may couple the touch sensor through an I2C interface, so that the processor 101 and the touch sensor communicate through the I2C bus interface to realize the touch function of the electronic device 100.
  • the I2S interface can be used for audio communication.
  • the processor 101 may include multiple sets of I2S buses.
  • the processor 101 can be coupled with the audio module 170 through an I2S bus to implement communication between the processor 101 and the audio module 170.
  • the audio module 170 may transmit audio signals to the wireless communication module through the I2S interface, so as to realize the function of answering calls through the Bluetooth headset.
  • the PCM interface can also be used for audio communication to sample, quantize and encode analog signals.
  • the audio module 170 and the wireless communication module may be coupled through a PCM bus interface.
  • the audio module 170 may also transmit audio signals to the wireless communication module through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a two-way communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • the UART interface is usually used to connect the processor 101 and the wireless communication module.
  • the processor 101 communicates with the Bluetooth module in the wireless communication module through the UART interface to realize the Bluetooth function.
  • the audio module 170 may transmit audio signals to the wireless communication module through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 101 with the display screen 104, the camera 193 and other peripheral devices.
  • the MIPI interface includes camera serial interface (camera serial interface, CSI), display serial interface (display serial interface, DSI), etc.
  • the processor 101 and the camera 193 communicate through a CSI interface to implement the shooting function of the electronic device 100.
  • the processor 101 and the display screen 104 communicate through a DSI interface to realize the display function of the electronic device 100.
  • the GPIO interface can be configured through software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 101 with the camera 193, the display 104, the wireless communication module, the audio module 170, the sensor module 180, and so on.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that complies with the USB standard specification, and specifically may be a Mini USB interface, a Micro USB interface, a USB Type C interface, and so on.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transfer data between the electronic device 100 and peripheral devices. It can also be used to connect earphones and play audio through earphones.
  • the interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiment of the present invention is merely a schematic description, and does not constitute a structural limitation of the electronic device 100.
  • the electronic device 100 may also adopt different interface connection modes in the foregoing embodiments, or a combination of multiple interface connection modes.
  • the memory 102 may be used to store computer executable program code, where the executable program code includes instructions.
  • the processor 101 executes various functional applications and data processing of the electronic device 100 by running instructions stored in the memory 102.
  • the memory 102 may include a program storage area and a data storage area.
  • the storage program area can store an operating system, at least one application program (such as a sound playback function, an image playback function, etc.) required by at least one function.
  • the data storage area can store data (such as audio data, phone book, etc.) created during the use of the electronic device 100.
  • the memory 102 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
  • the transceiver 103 can be used to communicate with network devices and other electronic devices.
  • the electronic device 100 can upload or download components through the transceiver 103.
  • the transceiver 103 may include a mobile communication module (not shown in the figure) and a wireless communication module (not shown in the figure), wherein:
  • the mobile communication module can provide a wireless communication solution including 2G/3G/4G/5G, etc. applied to the electronic device 100.
  • the mobile communication module may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
  • the mobile communication module can receive electromagnetic waves by the antenna 1, and perform processing such as filtering, amplifying and transmitting the received electromagnetic waves to the modem processor for demodulation.
  • the mobile communication module can also amplify the signal modulated by the modem processor, and convert it into electromagnetic waves to radiate through the antenna 1.
  • at least part of the functional modules of the mobile communication module may be provided in the processor 101.
  • at least part of the functional modules of the mobile communication module and at least part of the modules of the processor 101 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is processed by the baseband processor and then passed to the application processor.
  • the application processor outputs sound signals through audio equipment (not limited to speakers, receivers, etc.), or displays images or videos through the display screen 104.
  • the modem processor may be an independent device.
  • the modem processor may be independent of the processor 101 and be provided in the same device as the mobile communication module or other functional modules.
  • the wireless communication module can provide wireless communication solutions applied to the electronic device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and so on.
  • the wireless communication module may be one or more devices integrating at least one communication processing module.
  • the wireless communication module receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 101.
  • the wireless communication module can also receive the signal to be sent from the processor 101, perform frequency modulation, amplify it, and convert it into electromagnetic waves to radiate through the antenna 2.
  • the electronic device 100 implements a display function through a GPU, a display screen 104, an application processor, and the like.
  • the GPU is an image processing microprocessor, which is connected to the display screen 104 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations and is used for graphics rendering.
  • the processor 101 may include one or more GPUs, which execute program instructions to generate or change display information.
  • the display screen 104 can be used to display component graphics, toolboxes, component layout designers, and the like.
  • the display screen 104 includes a display panel.
  • the display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and so on.
  • the electronic device 100 may include one or N display screens 104, and N is a positive integer greater than one.
  • the sensor 105 may be used to detect user operations, for example, the user's operation of dragging a component, the user's operation of sliding a component, and so on.
  • the sensor 105 may include a pressure sensor and a touch sensor, where:
  • the pressure sensor is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
  • the pressure sensor may be provided on the display screen 104.
  • the capacitive pressure sensor may include at least two parallel plates with conductive materials. When a force is applied to the pressure sensor, the capacitance between the electrodes changes.
  • the electronic device 100 determines the intensity of the pressure according to the change in capacitance.
  • the electronic device 100 detects the intensity of the touch operation according to the pressure sensor.
  • the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor.
  • touch operations that act on the same touch position but have different touch operation intensities can correspond to different operation instructions. For example: when a touch operation whose intensity is less than the first pressure threshold is applied to the short message application icon, an instruction to view the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
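The two-threshold dispatch described above (a light press views a short message, a firmer press creates one) can be sketched as a small function. The threshold value and action names here are illustrative assumptions, not values from the patent:

```python
# Illustrative sketch of intensity-based dispatch; the threshold is invented.
FIRST_PRESSURE_THRESHOLD = 0.5


def message_icon_action(touch_intensity: float) -> str:
    """Map the intensity of a touch on the message icon to an instruction."""
    if touch_intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_message"      # light press: view the short message
    return "create_message"        # firm press: create a new short message
```

A light touch (intensity 0.2) would return "view_message", while an intensity at or above the threshold returns "create_message".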
  • the touch sensor is also called a "touch panel".
  • the touch sensor may be disposed on the display screen 104, and the touch screen is composed of the touch sensor and the display screen 104, which is also called a “touch screen”.
  • the touch sensor is used to detect touch operations acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • the visual output related to the touch operation can be provided through the display screen 104.
  • the touch sensor may also be disposed on the surface of the electronic device 100, which is different from the position of the display screen 104.
  • FIG. 14 is a schematic block diagram of an electronic device 200 according to an embodiment of the application.
  • the electronic device 200 may include a detection unit 201, a processing unit 202, and a display unit 203, wherein:
  • the detection unit 201 is used to detect user operations received by the electronic device 200, for example, the user drags a component from the component toolbox, the user drags the input connection point of the second component to the output connection point of the first component, and so on.
  • the processing unit 202 is configured to obtain the data entity type supported by the connection point of the component in response to the user operation detected by the detection unit 201, and determine that the output connection point of the first component matches the input connection point of the second component.
  • the display unit 203 is used to display the graphic of the component, the data entity type supported by the connection point of the component, and the indicator that the two components are successfully connected.
  • the units in the electronic device 200 in the embodiments of the present application, and the other operations or functions described above, respectively correspond to the processes executed by the electronic device in the APP development method, and are not repeated here.
  • the disclosed system, device, and method can be implemented in other ways.
  • the device embodiments described above are merely illustrative. For example, the division of the units is only a logical function division, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place, or they may be distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • the functional units in the various embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • if the function is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • the technical solution of this application — in essence, the part contributing to the existing technology, or a part of the technical solution — can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions used to cause a computer device (which may be a personal computer, a server, a network device, etc.) or a processor to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage media include: a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, an optical disk, or other media that can store program code.


Abstract

An APP development method and an APP development platform. The method includes: a user selects a component from a component toolbox, the APP development platform being installed on an electronic device. A component orchestration designer can create components and display their composition; a component can consist of one component body and one or more connection points, and a connection point supports one or more data entity types. The user connects a first component and a second component in the component orchestration designer's interface; in response to this operation, the component orchestration designer connects the first component and the second component. The user selects, in the component orchestration designer, to compile the connected components; in response to this operation, a code generation engine compiles the multiple connected components displayed in the component orchestration designer into executable code of the APP. In this way, developers quickly assemble the APP the user needs from multiple existing components, without writing program code piece by piece to implement the APP's functional logic.

Description

APP development platform, APP development method, and electronic device

This application claims priority to the Chinese patent application No. 202010569877.8, filed with the Chinese Patent Office on June 20, 2020 and entitled "APP development platform, APP development method and electronic device", which is incorporated herein by reference in its entirety.
Technical Field

The present invention relates to the field of electronic technology, and in particular to an APP development platform, an APP development method, and an electronic device.
Background

At present, various electronic devices (for example, mobile phones, tablets, computers, in-vehicle systems, smart watches, and so on) can install applications (APPs), and developers need to develop various APPs for these electronic devices. Generally, for the same APP, a developer has to write a separate set of program code for electronic devices with different operating systems or different device forms. For example, for the same video APP, the developer needs to write one set of code for Android phones, another for iOS phones, another for computers, and so on, and the code the developer writes for one kind of electronic device cannot be directly reused on an electronic device with a different operating system.

In addition, the program code of many APPs is written in different programming languages, so a developer cannot reuse the program code that implements a given function in an existing APP. For example, both existing camera APPs and beauty-camera APPs have a photo-taking function. If a developer needs to develop an APP with a photo-taking function, the different programming languages or interfaces used by the existing APPs prevent the developer from directly reusing the code that implements the photo-taking function. As a result, developers cannot conveniently build a new APP on top of the code that implements certain functions in existing APPs, and developing an APP takes considerable time.

Therefore, how to improve the efficiency of APP development on a development platform and develop new APPs quickly is a problem to be solved urgently.
Summary of the Invention

This application provides an APP development platform, an APP development method, and an electronic device. With the APP development platform provided by this application, a user can select components on the platform and connect them, and the platform can verify whether the selected components can be connected. After the selected components are connected successfully, the user can choose to have the platform compile the connected components into source code executable as an APP. In this way, the user can develop an APP quickly.

According to a first aspect, this application provides an APP development platform applied to an electronic device. The APP development platform includes a component toolbox, a component orchestration designer, and a code generation engine. The component toolbox is used to provide components, where a component is an independent module implementing a specific function, consists of one component body and one or more connection points, and a connection point supports one or more data entity types. The component orchestration designer is used to display components and to connect two or more components according to the user's connection operations. The code generation engine is used to generate, from the two or more components connected in the component orchestration designer, source code executable by a first APP, where the first APP includes the two or more components.

A data entity is the data that a connection point can carry. A data entity type is the type of a data entity, and may include audio, video, image, text, and other types.

With the APP development platform provided by this application, the user selects a component from the component toolbox on an electronic device on which the platform is installed. The component orchestration designer can create the component and display its composition; a component can consist of one component body and one or more connection points. The user connects a first component and a second component in the component orchestration designer's interface; in response to this operation, the component orchestration designer connects the first component and the second component. The user then selects, in the component orchestration designer, to compile the connected components; in response to this operation, the code generation engine compiles the multiple connected components displayed in the component orchestration designer into the APP's executable code. In this way, developers quickly assemble the APP the user needs from multiple existing components, without writing program code piece by piece to implement the APP's functional logic.
In a possible implementation, the component orchestration designer is further used to: in response to the user's operation of connecting a first component and a second component, verify whether the first component and the second component match; if they match, connect a first connection point and a second connection point, where the first connection point is a connection point of the first component and the second connection point is a connection point of the second component. In this way, a data format mismatch between the first component and the second component can be avoided.

In a possible implementation, the first component matching the second component includes: a first data entity type being the same as a second data entity type, the first data entity type including the second data entity type, or the second data entity type including the first data entity type, where the first data entity type is the type of data entity supported by the first connection point and the second data entity type is the type of data entity supported by the second connection point.

In a possible implementation, the component toolbox is further used to: in response to the user's operation of uploading a component or downloading a component from the component marketplace, display the name of the component uploaded or downloaded from the component marketplace. In this way, the APP platform can offer the user more components and improve the user experience.

In a possible implementation, the component orchestration designer is specifically used to display a connection line connecting the first connection point and the second connection point. In this way, the user can be prompted that the first connection point and the second connection point are successfully connected.

In a possible implementation, the component orchestration designer is specifically used to display the first connection point and the second connection point overlapped. In this way, the user can be prompted that the first connection point and the second connection point are successfully connected.

In a possible implementation, the component orchestration designer is specifically used to display the first component according to the user's operation of selecting the first component from the component toolbox.

In a possible implementation, the component orchestration designer is further used to: in response to the user's operation of choosing intelligent orchestration for the first component, display the component call tree of the first component. The component call tree shows the second component and/or third component matching the first component, the fourth component and/or fifth component matching the second component, and so on up to an Nth component matching an Mth component, where the Nth component is a component with no output connection point, and M and N are positive integers. In this way, when the user does not know which components match the first component, components that can connect to the first component can be recommended intelligently, saving the user's time and improving the user experience.

In a possible implementation, displaying the component call tree of the first component specifically means: displaying the component call tree of the first component according to the function of the first component and/or the data entity types supported by the connection points of the first component. In this way, components matching the first component can be recommended more accurately.

In a possible implementation, the component orchestration designer is further used to: in response to the user's operation of deleting a component, delete the component from the component call tree.

In a possible implementation, the component orchestration designer is further used to: save an orchestration model diagram of the connected two or more components, together with first information in the orchestration model diagram; the first information includes one or more of the IDs and names of the two or more components and the data entity types supported by the connection points of the two or more components.

In a possible implementation, the code generation engine is specifically used to: generate the first APP's executable source code according to the orchestration model diagram, the first information, and a component invocation template, where the component invocation template includes program code in a preset format. The component invocation template is a code template preset according to the different component types and the attributes of connection points, and it encapsulates the generic interface-invocation code logic.
According to a second aspect, this application provides an APP development method, including: in response to the user's operation of selecting a component from the electronic device's component toolbox, the electronic device displays the component in the component orchestration designer, where the component is an independent module implementing a specific function, consists of one component body and one or more connection points, and a connection point supports one or more data entity types; in response to the user's operation of connecting multiple components, the electronic device connects two or more components in the component orchestration designer; in response to the user's operation of selecting the connected two or more components for compilation, the electronic device generates, in the code generation engine, executable source code of a first APP from the connected two or more components.

A data entity is the data that a connection point can carry. A data entity type is the type of a data entity, and may include audio, video, image, text, and other types.

In this way, the user can obtain an APP by connecting existing components without rewriting the APP's code, which saves the user's APP development time.
In a possible implementation, the electronic device connecting two or more components in the component orchestration designer specifically includes: in response to the user's operation of connecting a first component and a second component, the electronic device verifies through the component orchestration designer whether the first component and the second component match; if they match, the electronic device connects a first connection point and a second connection point, where the first connection point is a connection point of the first component and the second connection point is a connection point of the second component. In this way, it can be ensured that the two components the user connects match.

In a possible implementation, the first component matching the second component includes: a first data entity type being the same as a second data entity type, the first data entity type including the second data entity type, or the second data entity type including the first data entity type, where the first data entity type is the type of data entity supported by the first connection point and the second data entity type is the type of data entity supported by the second connection point.

In a possible implementation, the method further includes: in response to the user's operation of viewing the attributes of the first connection point of the first component, the electronic device displays in the component orchestration designer the data entity types supported by the first connection point. In this way, the user learns the attributes of the first connection point of the first component, which facilitates subsequent operations, for example finding a second component matching the first component according to the attributes of the first connection point.

In a possible implementation, the electronic device connecting the first connection point and the second connection point includes: displaying a connection line connecting the first connection point and the second connection point; or overlapping the first connection point and the second connection point. In this way, the user can be prompted that the first connection point and the second connection point have been connected.

In a possible implementation, the method further includes: in response to the user's operation of choosing intelligent orchestration for the first component, the electronic device displays in the component orchestration designer the component call tree of the first component. The component call tree shows the second component and/or third component matching the first component, the fourth component and/or fifth component matching the second component, and so on up to an Nth component matching an Mth component, where the Nth component is a component with no output connection point, and M and N are positive integers. In this way, when the user does not know which components match the first component, components that can connect to the first component can be recommended intelligently, saving the user's time and improving the user experience.

In a possible implementation, the electronic device displaying in the component orchestration designer the component call tree of the first component specifically includes: displaying the component call tree of the first component according to the function of the first component and/or the data entity types supported by the connection points of the first component. In this way, components matching the first component can be recommended more accurately.

In a possible implementation, the method further includes: in response to the user's operation of deleting a component, the electronic device deletes the component from the component call tree. In this way, the user can delete components in the component call tree that are irrelevant to the first APP.

In a possible implementation, the method further includes: the electronic device saves, in the component orchestration designer, an orchestration model diagram of the connected two or more components, together with first information in the orchestration model diagram; the first information includes one or more of the IDs and names of the two or more components and the data entity types supported by the connection points of the two or more components.

In a possible implementation, the method further includes: the electronic device generates, in the code generation engine, the first APP's executable source code according to the orchestration model diagram, the first information, and a component invocation template, where the component invocation template includes program code in a preset format. The component invocation template is a code template preset according to the different component types and the attributes of connection points, and it encapsulates the generic interface-invocation code logic.
According to a third aspect, this application provides an electronic device, including one or more processors and one or more memories; the one or more memories are respectively coupled to the one or more processors and are used to store computer program code, the computer program code including computer instructions; when the computer instructions run on the processors, the electronic device is caused to perform the APP development method in any possible implementation of any of the above aspects.

According to a fourth aspect, an embodiment of this application provides a computer storage medium including computer instructions; when the computer instructions run on an electronic device, the electronic device is caused to perform the APP development method in any possible implementation of any of the above aspects.

According to a fifth aspect, an embodiment of this application provides a computer program product; when the computer program product runs on a computer, the computer is caused to perform the APP development method in any possible implementation of any of the above aspects.
Brief Description of the Drawings

To describe the technical solutions in the embodiments of this application more clearly, the accompanying drawings used in the embodiments of this application are described below.
FIG. 1A is a schematic diagram of a video decoding component according to an embodiment of this application;

FIG. 1B is a schematic diagram of a composite component according to an embodiment of this application;

FIG. 2 is a schematic architectural diagram of an APP development platform 10 according to an embodiment of this application;

FIG. 3 is a schematic diagram of a component toolbox 101 according to an embodiment of this application;

FIG. 4 is a schematic diagram of a user interface according to an embodiment of this application;

FIGS. 5A-5G are schematic diagrams of user interfaces according to an embodiment of this application;

FIGS. 6A-6B are schematic diagrams of user interfaces according to an embodiment of this application;

FIGS. 7A-7B are schematic diagrams of user interfaces according to an embodiment of this application;

FIGS. 8A-8D are schematic diagrams of user interfaces according to an embodiment of this application;

FIG. 9 is a schematic flowchart of an APP development method according to an embodiment of this application;

FIG. 10 is a schematic diagram of an application scenario according to an embodiment of this application;

FIG. 11 is a schematic diagram of a component development ecosystem according to an embodiment of this application;

FIG. 12 is a schematic diagram of the correspondence between a component's domain description language and the component graphic according to an embodiment of this application;

FIG. 13 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of this application;

FIG. 14 is a schematic block diagram of an electronic device according to an embodiment of this application.
Detailed Description of the Embodiments

The technical solutions in the embodiments of this application are described clearly and thoroughly below with reference to the accompanying drawings. In the descriptions of the embodiments of this application, unless otherwise stated, "/" means "or"; for example, A/B can mean A or B. "And/or" in the text merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B can mean: A alone, both A and B, or B alone. In addition, in the descriptions of the embodiments of this application, "multiple" means two or more than two.

The terms "first" and "second" below are used for description only and shall not be understood as indicating or implying relative importance or implicitly indicating the number of the indicated technical features. Thus, a feature qualified by "first" or "second" can explicitly or implicitly include one or more of that feature. In the descriptions of the embodiments of this application, unless otherwise stated, "multiple" means two or more. In addition, the terms "include" and "have", and any variants thereof, mentioned in the description of this application are intended to cover non-exclusive inclusion. For example, a process, method, system, product, or device comprising a series of steps or units is not limited to the listed steps or units, but optionally further includes steps or units that are not listed, or optionally further includes other steps or units inherent to the process, method, product, or device. It should be noted that, in the embodiments of this application, words such as "exemplary" or "for example" are used to represent an example, illustration, or explanation. Any embodiment or design described as "exemplary" or "for example" in the embodiments of this application should not be construed as being preferred over or more advantageous than other embodiments or designs. Rather, the use of words such as "exemplary" or "for example" is intended to present the related concept in a concrete manner.
First, some concepts involved in the embodiments of this application (such as Component and ConnectPoint) are introduced.

(1) Component

A component is an independent module composed of certain business logic and related data. A component can implement a specific function; for example, a video playback component can play videos and a video decoding component can decode videos. An APP can be composed of one or more components. A component may or may not have a user interface; for example, a video decoding component may have no user interface, i.e. the process of the electronic device decoding a video through the component is not presented in the user interface and is not perceived by the user. A single component can run independently on a user device; for example, a TV can have a video playback component installed and play videos through that component.

Each component includes one component body and several connection points (input connection points or output connection points). A component transfers data with other components through connection points. Generally, a component includes at least one input connection point or output connection point. Understandably, some components may also have no connection points, i.e. neither input connection points nor output connection points. In the embodiments of this application, a component that can be used to compose an APP has at least one connection point.

Exemplarily, the composition of a component can be as shown in FIG. 1A, which shows a graphic example of a video decoding component. The video decoding component can include one component body (decode video 10a in FIG. 1A), one input connection point (Input ConnectPoint 10b in FIG. 1A), and two output connection points (Output ConnectPoint 10c and output connection point 10d in FIG. 1A).

(2) ConnectPoint

A connection point is a component's intelligent external interface, i.e. the agent implementing the input or output function; it is responsible for interface protocol checking, negotiated docking, data transfer, and so on with other connection points. A connection point is the way a component receives data input from another component, and also the way a component outputs data to another component. Connection points include input connection points and output connection points. Usually, each connection point can support one or more types of data entity (Entity); the data entities a connection point supports can be images, audio, video, text, and so on.

(3) Entity

A data entity is the data that a connection point can carry. A data entity type is the type of a data entity; there are many types, such as audio, video, image, and text. Table 1 exemplarily shows some data entity types. Videos can be classified according to different video coding standards; for example, the standards can include the Moving Pictures Experts Group (MPEG) compression coding standards and the High Efficiency Video Coding (HEVC) standard. The MPEG compression coding standards can include MPEG, MPEG2, MPEG4, and other coding standards; HEVC can include the H.265 coding standard. Audio files can have many formats, for example MPEG Audio Layer III (MP3) files, waveform audio (WAVE) files, and Free Lossless Audio Codec (FLAC) files. Data entity types can also include text, picture, compressed file, document, stream, data set, and so on. A stream can include a video stream, an audio stream, or a composite stream containing both a video stream and an audio stream. A data set can include database tables. Each type can further include sub-types, formats, or some other attributes; for example, the video data entity type can have sub-types such as MPEG, MPEG2, and MPEG4, and attributes such as width, height, subtitles, and language. For example, the data entity type supported by input connection point 10b in FIG. 1A can be video in MPEG format, with a 1920*1080 format and Chinese as the language. For details, refer to Table 1, which is not repeated here.
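The entity structure described above (a top-level category such as video or audio, a sub-type such as MPEG4 or MP3, plus extra attributes like format and language) can be sketched as a small data model. All class and field names here are illustrative assumptions, not the platform's actual API:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Entity:
    """A data entity a connection point can carry (illustrative model).

    attrs holds extra key-value attributes, e.g. (("format", "1920*1080"),).
    """
    category: str          # e.g. "video", "audio", "text", "image"
    sub_category: str = "" # e.g. "MPEG4", "MP3", "H265"
    attrs: tuple = ()      # additional attributes as (key, value) pairs


# Entities matching the examples in the text
video_entity = Entity("video", "MPEG4",
                      (("format", "1920*1080"), ("language", "Chinese")))
audio_entity = Entity("audio", "MP3", (("sample_rate", 112800),))
```

Frozen dataclasses are used here so entities can be placed in sets when checking whether two connection points share a supported type.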
Table 1

(The contents of Table 1 are provided as images in the original publication: Figure PCTCN2021098215-appb-000001 and Figure PCTCN2021098215-appb-000002.)

(4) Composite component

A composite component is composed of multiple components and can implement multiple functions. For example, as shown in FIG. 1B, composite component 100A can be composed of a video decoding component and a video playback component. The video decoding component includes one body (decode video 10a), one input connection point (input connection point 10b), and two output connection points (output connection point 10c and output connection point 10d). The video playback component includes one body (play video 10e) and one input connection point (input connection point 10f). Output connection point 10d of the video decoding component is connected to input connection point 10f of the video playback component. Composite component 100A can implement the functions of video decoding and video playback.

To address the problem that developing an APP is time-consuming, in one implementation a developer can use the "Shortcuts" application on an electronic device to combine existing shortcuts or custom shortcuts into a new APP. That is, the developer can merge, through the "Shortcuts" application, multiple operations across multiple applications on the terminal to create an APP. For example, application A on the electronic device has a photo-taking operation, and application B has a one-tap operation for converting photos into PDF documents. The developer can chain the photo-taking operation of application A and the photo-to-PDF operation of application B into a new APP through "Shortcuts". In this way, the developer can develop a new APP fairly quickly.

However, the method in this implementation can only simply combine some operations on the electronic device. When the developer needs to develop an APP with many functions and complex business logic, the method can hardly meet the developer's needs.

To address the problems in the above implementation, this application provides an APP development platform including a component toolbox, a component orchestration designer, and a code generation engine. Based on this platform, this application proposes an APP development method. The method includes: the electronic device, on which the APP development platform is installed, detects the user's operation of selecting a component from the component toolbox. In response to this operation, the component orchestration designer can create the component and display its composition; a component can consist of one component body and one or more connection points. The electronic device detects the user's operation of connecting a first component and a second component in the component orchestration designer's interface; in response, the component orchestration designer judges whether the data entity types supported by the output connection point of the first component and by the input connection point of the second component match. If they match, the component orchestration designer can display an indicator of successful connection. The electronic device detects the user's operation of choosing, in the component orchestration designer, to compile the connected components; in response, the code generation engine compiles the multiple connected components displayed in the component orchestration designer into the APP's executable code. In this way, developers quickly assemble the APP the user needs from multiple existing components, without writing program code piece by piece to implement the APP's functional logic.

In the embodiments of this application, a developer can also be called a user, and that user can develop APPs or components on the electronic device provided by this application.
The APP development platform provided by an embodiment of this application is introduced first. FIG. 2 shows a schematic architectural diagram of APP development platform 10 provided by an embodiment of this application. APP development platform 10 includes: component toolbox 101, component orchestration designer 102, and code generation engine 103.

Component toolbox 101 is used to present components. The components in component toolbox 101 can be categorized by their function. The user can download components from the component marketplace and save them in component toolbox 101; the user can also upload self-developed components to the component marketplace.

Exemplarily, component toolbox 101 can be as shown in FIG. 3, which shows component toolbox 101 provided by an embodiment of this application. Component toolbox 101 can be displayed in display area 1000 of a user interface (not shown). Display area 1000 can include controls 1001, 1002, 1003, and 1004. The user can search for components in component toolbox 101 through control 1001. Controls 1002 and 1003 are used to expand or collapse a category of components; for example, in FIG. 3, control 1002 expands or collapses the frequently used components and control 1003 expands or collapses the audio/video playback components. FIG. 3 exemplarily shows that component toolbox 101 contains frequently used components and audio/video playback components. Understandably, the components in component toolbox 101 are not limited to these two categories; the toolbox can also contain other kinds of components, for example document processing components and picture processing components. Understandably, the embodiments of this application place no limitation on the specific user interface of component toolbox 101; its user interface can have more or fewer controls than in FIG. 3.

In some embodiments, the user can categorize the components according to personal usage habits; for example, the user can add the video playback component to the frequently used category, or group the flight booking, hotel booking, and food delivery components into a booking-and-payment category whose specific name the user defines. Understandably, the component categorization in the component list of component toolbox 101 can differ between users. FIG. 3 shows, for example, frequently used components (such as the send SMS component and the make call component) and audio/video playback components (such as the play video, decode video, and decode audio components). How components are categorized is not limited by the embodiments of this application; the following description takes categorization by component function as an example.
Component orchestration designer 102 is the core tool for component orchestration and development. Specifically, the developer can, in the orchestration designer, select, lay out, and connect multiple components, select connection point data entity types, set business logic, and compose new composite components, among other things. The component orchestration designer can specifically be used for:

Displaying and generating components: Exemplarily, in response to the user dragging a component from component toolbox 101 into component orchestration designer 102, the designer can present the component to the user. Specifically, the developer can select a component in component toolbox 101; component orchestration designer 102 then reads the component's file, obtains the component's composition from the file, draws the component, and presents it in the designer.

Connection point docking verification: verifying whether the data entity types of two components' connection points are consistent.

Component connection: connecting two or more components according to user operations. Exemplarily, the developer can connect two or more components in component orchestration designer 102 according to the business logic of the desired APP or logic compatible with the components. When two components are connected, the designer can also verify whether the data types supported by the two components' connection points are consistent. For example, when the output connection point of the video playback component and the input connection point of the video decoding component support the same data entity type, the connection points of the two components can be connected.

Business logic orchestration: the developer can add logical judgments or loops to connection points according to the data entity types the connection points support.

Intelligent orchestration: component orchestration designer 102 can, according to a component's function and the data entity types its connection points support, automatically suggest and recommend other components that can dock with the current component for the developer to choose from; or it can automatically generate a docking orchestration model according to an orchestration strategy. Exemplarily, the designer can display, according to a user operation, all components able to connect to the first component's connection points, and the user can select the needed components among those displayed.

Viewing connection point attributes: component orchestration designer 102 can, in response to the user's operation of viewing the data types supported by a component's connection point, display the data types that connection point supports.

Saving the component orchestration model diagram: Exemplarily, component orchestration designer 102 can also be used to save the model diagram of the completed orchestration of multiple components, together with all information in the model diagram, for example the IDs and names of all components in the diagram, the data entity types of all connection points, and the connection attributes of the connection points (including the data entity types supported by two connected connection points, and the data transfer mode (direct transfer, or conditional transfer that occurs only when a judgment condition is met)).
Code generation engine 103 is used to generate executable source code from the orchestrated composite components and APPs. Specifically, code generation engine 103 uses the completed orchestration model diagram of multiple components saved in component orchestration designer 102, together with all information in the diagram — for example, the IDs and names of all components, the data entity types of all connection points, and the connection attributes of the connection points — and the component invocation template, to generate executable source code. The component invocation template is a code template preset according to the different component types and the attributes of connection points, and it encapsulates the generic interface-invocation code logic. Exemplarily, the component invocation template can include the program code listed in the original publication:

(The template code listing is provided as images in the original publication: Figure PCTCN2021098215-appb-000003 and Figure PCTCN2021098215-appb-000004.)

In the code above, everything outside the angle brackets <> is template code; the code generation engine replaces the contents of the <> placeholders according to the APP actually being generated. For example, "ComposePlayer" in <ComposePlayer> can be replaced with the specific APP name (such as 123player); "VideoSplit" in <VideoSplit> can be replaced with the name of a component the generated APP actually needs; "ConnectPoint1" in <ConnectPoint1> can be replaced with the actual connection point; and "MPEG" and "640_480" in <entity.MPEG> and <entity.640_480> can be replaced with the actual data entity types.
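The placeholder substitution described above can be sketched as a small rendering step. The template string and key names here are illustrative assumptions modeled on the `<...>` placeholders named in the text; the engine's real template code survives only as an image in the publication:

```python
import re


def render_template(template: str, values: dict) -> str:
    """Replace each <key> placeholder with its concrete value.

    Keys without a supplied value are left untouched (illustrative sketch,
    not the engine's actual logic).
    """
    return re.sub(
        r"<([\w.]+)>",
        lambda m: str(values.get(m.group(1), m.group(0))),
        template,
    )


# Hypothetical template fragment using the placeholder names from the text
template = "app = <ComposePlayer>; component = <VideoSplit>; point = <ConnectPoint1>;"
code = render_template(template, {
    "ComposePlayer": "123player",
    "VideoSplit": "VideoSplitComponent",
    "ConnectPoint1": "output_point_1",
})
```

With the values above, every placeholder is filled in and no `<...>` markers remain in the generated fragment.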
Understandably, the APP development platform in this application can serve as a standalone APP development tool, or as a module within a development tool; this is not limited here.

The implementation process of developing an APP with the APP development platform provided by the embodiments of this application is described below with reference to the drawings.
FIGS. 4-8D exemplarily show the process of developing a video APP. The embodiments of this application are described with a computer as the electronic device.

As shown in FIG. 4, user interface 400 is used for creating an APP. User interface 400 can include input box 401, input box 402, input box 403, input box 405, and controls 404, 406, 407, 408, 409, and 410. Input box 401 is used to enter the name of the project the user creates, i.e. the APP name, for example "MediaPlayerAPP". Input box 402 is used to enter the package name of the created APP, for example "com.ex.example.mediaplayerapp"; the package name is the unique identifier of the APP and is mainly used by the system to identify the APP. Input box 403 is used to select or enter the storage location of the created APP; the user can type the location directly into input box 403, for example "D:\test\mediaplayerapp", or select it through control 404 in input box 403. Input box 405 is used to select or directly enter the minimum API version the created APP supports; the user can type it directly into input box 405, for example "xxsdk:1.0.0", or select it through control 406. Control 407 is used to guide the user through operating user interface 400. Control 408 cancels the project the user is creating. Control 409 returns to the previous step. Control 410 saves the created project and refreshes user interface 400. After filling in input boxes 401, 402, 403, and 405, the user clicks control 410.
Exemplarily, in response to the user's operation of clicking control 410, user interface 50A shown in FIG. 5A is displayed. User interface 50A can include display area 501 and display area 502. Display area 501 is used to display component toolbox 101. When the user selects a component in component toolbox 101 (for example, the video decomposition component) and drags it into display area 502, display area 502 can display the component's graphic. Here, display area 502 can be the interface of component orchestration designer 102, which can be used to display component graphics.

In a possible implementation, when the electronic device detects the user's operation of dragging the video decomposition component in component toolbox 101, in response to this user operation, component orchestration designer 102 can create the video decomposition component and draw, in display area 502, the component's body (for example, decompose video 503 in user interface 50A) and its connection points (for example, input connection point 504, output connection point 505, and output connection point 506 in user interface 50A).

The user can view the data entity types supported by a component's connection points. The electronic device can detect the user's operation of viewing a connection point and display for the user the data entity types the connection point supports. This operation can take many forms, for example double-clicking the connection point, hovering the mouse cursor over it for 2s, or right-clicking it; this is not limited here. As shown in FIG. 5B, the user can right-click output connection point 506; the electronic device detects the user operation and displays user interface 50C in FIG. 5C, which can display view control 507 and intelligent orchestration control 508. View control 507 is used to display the data entity types supported by output connection point 506. Intelligent orchestration control 508 can be used for intelligent orchestration, displaying components able to connect to output connection point 506. Exemplarily, the user can click view control 507.

Exemplarily, in response to the user's operation of clicking view control 507, user interface 50D shown in FIG. 5D is displayed, which can show data entity attribute table 509 of output connection point 506. Table 509 can be used to show that the data type of the entities supported by output connection point 506 is video, that the video sub-types can be MPEG and MPEG4, and that the video formats can be 640*480 and 1920*1080, and so on.

In a possible implementation, the user can also click the component body (decompose video 503) of the video decomposition component in user interface 50D to view the data entity types supported by all of its connection points. The viewing operation can be a double click, hovering the cursor over the connection point for 2s, a right click, and so on; this is not limited here. Exemplarily, in response to the user's operation of clicking decompose video 503, the electronic device can display user interface 50E shown in FIG. 5E, which can present attribute table 510 of the video decomposition component. Table 510 shows the attributes of all of the component's connection points: connection point 1 can be input connection point 504, connection point 2 can be output connection point 505, and connection point 3 can be output connection point 506. Connection point 1 can contain two data entities, data entity 0 and data entity 1. Control 511 hides the connection point's data entities; that is, when the user clicks control 511, data entity 0 and data entity 1 shown in table 510 are hidden. Control 512 hides the attributes of data entity 0, for example category, sub category, width, height, bitrate, and so on. Control 513 expands the attributes of data entity 1. Control 514 expands the data entities supported by connection point 2. Control 515 expands the data entities supported by connection point 3.

When a connection point supports multiple types of data entities, the user can select one data entity and set it as the entity the connection point supports. For example, in user interface 50F shown in FIG. 5F, output connection point 506 supports multiple types of data entities. Specifically, the entities supported by output connection point 506 can be video data of sub-type MPEG in 640*480 format, video data of sub-type MPEG in 1920*1080 format, video data of sub-type MPEG4 in 640*480 format, and video data of sub-type MPEG4 in 1920*1080 format. The user can set MPEG4 in data entity attribute table 509 as the sub-type output connection point 506 supports. The electronic device can detect the user's operation of setting the supported entity attributes of the connection point; this operation can take many forms, for example double-clicking the sub-type MPEG4 in data entity attribute table 509, and is not limited here. In response to this user operation, the electronic device can display prompt box 516, which prompts the user that the data entity has been set. The prompt content can be the text shown in FIG. 5F, "You have set the connection point's sub-type to MPEG4"; the specific content of prompt box 516 is not limited here.

In another possible implementation, the user can set the data entity supported by a connection point in attribute table 510 shown in FIG. 5E. For example, the user can double-click data entity 0 in table 510, and in response to the user operation, component orchestration designer 102 sets data entity 0 as the only entity supported by connection point 1.

The user can set in turn the sub-type and format of the data entity (for example, video) supported by output connection point 506; for example, the user can set the video sub-type to MPEG4 and the format to 1920*1080. The user can click view control 507 again to view the attributes of the entity supported by output connection point 506. As shown in user interface 50G in FIG. 5G, the updated data entity attribute table 511 is displayed; both the sub-type and the format in table 511 are user-set, i.e. sub-type MPEG4 and format 1920*1080.
Similarly, the user can add other components from component toolbox 101 into component orchestration designer 102 following the method above. Exemplarily, the user can drag the video playback component from component toolbox 101 into component orchestration designer 102. The electronic device can detect the user's operation of dragging the video playback component; in response, the designer can create the video playback component, draw its component body (for example, play video 601 shown in user interface 60A) and its connection point (for example, input connection point 602 shown in user interface 60A), and display them in the designer's interface (for example, user interface 60A).

As shown in FIG. 6A, the user can hide the component toolbox, and the user can connect the two components in the interface of component orchestration designer 102, for example in user interface 60A shown in FIG. 6A. The electronic device can detect the user's operation of connecting the video decomposition component with the video playback component. The user's operation of connecting two components can take many forms: the user can drag output connection point 506 onto input connection point 602, or drag input connection point 602 onto output connection point 506; this is not limited here. In response to this user operation, component orchestration designer 102 obtains the data entity types supported by output connection point 506 and input connection point 602, and judges whether they match. A match between the data entity types supported by output connection point 506 and those supported by input connection point 602 means that the types are the same, or that at least one of the multiple entity types supported by output connection point 506 is the same as an entity type supported by input connection point 602. If they match, the designer's interface can display an indicator of successful connection; if not, the designer's interface can prompt the user about the mismatch (not shown in the figure). The mismatch prompt for two components' connection points (for example, output connection point 506 and input connection point 602) can take many forms, for example displayed text such as "cannot connect", "match failed", "connection failed", or "data entity types are inconsistent"; this is not limited here. In this implementation, even if a component's input or output connection point supports multiple data types, the orchestrator can connect the input connection point and the output connection point based on the two points supporting the same data entity type.
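The matching rule just described (two connection points can be connected when they share at least one supported data entity type) can be sketched as a set intersection. The point names and the tuple encoding of entity types are illustrative assumptions:

```python
def points_match(output_types: set, input_types: set) -> bool:
    """Two connection points match when they share at least one
    supported data entity type (sketch of the rule described above)."""
    return bool(output_types & input_types)


# Entity types encoded as (category, sub_type, format) tuples (assumption)
out_506 = {("video", "MPEG4", "1920*1080"), ("video", "MPEG", "640*480")}
in_602 = {("video", "MPEG4", "1920*1080")}
in_audio = {("audio", "MP3", "112800")}
```

Here `points_match(out_506, in_602)` holds because the points share the MPEG4/1920*1080 video type, while `points_match(out_506, in_audio)` fails, which would trigger the mismatch prompt.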
Exemplarily, in another possible implementation, when the connection points support multiple entity types, the user can first set the data entity types supported by output connection point 506 and by input connection point 602 to the same type, and then connect output connection point 506 and input connection point 602. At this time, since the two points support the same data entity type, component orchestration designer 102 can connect output connection point 506 and input connection point 602.

The user interface after the two components are connected can be as shown in FIG. 6A, which shows user interface 60A provided by an embodiment of this application. In user interface 60A, output connection point 506 of the video decomposition component and input connection point 602 of the video playback component are successfully connected. The success indicator displayed in the designer's interface can take many forms, for example connection line 603 between output connection point 506 and input connection point 602 shown in user interface 60A; the success indicator is not limited here.

Exemplarily, in a possible manner, the user interface after the two components are connected can also be as shown in FIG. 6B. In user interface 60B shown in FIG. 6B, output connection point 506 of the video decomposition component and input connection point 602 of the video playback component are successfully connected: the two points are folded or overlapped together, which indicates that they are successfully connected.
After the user has selected all components needed for the created APP, component orchestration designer 102 can connect the components in response to the user's connection operations; for the specific connection process, refer to the description above of connecting the video decomposition component with the video playback component, which is not repeated here. The user interface after connection can be as shown in FIG. 7A, which shows user interface 700 provided by this application. User interface 700 presents all components composing the APP the user created (the video playback APP). The components of the video playback APP can include read video file component 702, video decomposition component 705, speech translation component 710, HD enhancement component 711, audio playback component 716, and video playback component 717. The read video file component can include input connection point 701 and output connection point 703. Video decomposition component 705 can include input connection point 704 and output connection points 706 and 707. Speech translation component 710 can include input connection point 709 and output connection point 712. HD enhancement component 711 can include input connection point 708 and output connection point 713. Audio playback component 716 can include input connection point 714, and video playback component 717 can include input connection point 715. Output connection point 703 can connect to input connection point 704. Output connection point 707 can connect to input connection point 709 or input connection point 714. Output connection point 706 can connect to input connection point 708 or input connection point 715. Output connection point 712 can connect to input connection point 714, and output connection point 713 can connect to input connection point 715.

In some embodiments, the user can set attributes on the connection lines between connection points. Connection line attributes can include logic control attributes (condition judgment/branch/loop) or data transfer attributes (Entity input and output). In FIG. 7A, for output connection point 707 of video decomposition component 705, the language attribute can be judged: if it is consistent with the local language, speech translation component 710 can be skipped and the point connected directly to input connection point 714 of the audio playback component. That is, the user can set a judgment condition on connection point 707, the condition being whether the language supported by connection point 707 is consistent with the electronic device's local language. If yes, the output of the video decomposition component is fed through output connection point 707 into input connection point 714 of the audio playback component; if no, the output of the video decomposition component is fed through output connection point 707 into input connection point 709 of speech translation component 710.
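The conditional routing just described for output connection point 707 (skip translation when the local language is already supported) can be sketched as a branch. The point identifiers follow the figure; the function and return strings are illustrative assumptions:

```python
def route_from_point_707(entity_languages: set, local_language: str) -> str:
    """Decide the downstream input point for output connection point 707
    (sketch of the condition described above, not platform code)."""
    if local_language in entity_languages:
        return "input_714_audio_playback"    # local language supported: skip translation
    return "input_709_speech_translation"    # otherwise route through translation first
```

For example, with entity languages {"Chinese", "English"} and local language "Chinese", the data flows straight to the audio playback input; with only {"English"}, it is routed to the speech translation component.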
Exemplarily, the user can click the connection point at the left endpoint of a connection line to set the connection line's attributes, for example the lines from connection point 707 to connection point 714 and from connection point 707 to connection point 709 shown in FIG. 7A. The user can click connection point 707 to set the control attributes of the lines from connection point 707 to connection point 714 and from connection point 707 to connection point 709. The interface for setting the control attributes of the connection lines can be as shown in user interface 70B in FIG. 7B.

User interface 70B can include condition boxes 720, 722, and 724, condition result selection box 725, and controls 721, 723, 726, 727, 728, and 729. User interface 70B can also include condition boxes 730, 732, and 734, condition result selection box 735, and controls 731, 733, 736, and 737. Condition box 720 is used to set the judgment subject, for example language; control 721 is used to select the judgment condition. Condition box 722 is used to set the relation between the judgment subject and the comparison subject, for example the judgment subject containing the comparison subject; control 723 is used to select that relation, which can be "contains" or "does not contain", without limitation here. Condition box 724 is used to set the comparison subject, for example the local language. Together, condition boxes 720, 722, and 724 complete the judgment condition setting; for example, judgment condition 1 shown in user interface 70B is "the language of connection point 707 contains the local language". Condition result selection box 725 is used to set the result of the condition judgment; for example, if the condition "the language of connection point 707 contains the local language" is met, the data entity of connection point 707 is transferred to connection point 714 through the connection line between connection points 707 and 714. Control 726 selects the condition result; control 727 adds a condition; control 728 completes the condition setting; control 729 cancels the condition setting; control 737 returns to the upper-level interface. Likewise, condition box 730 is used to set the judgment subject, for example language; control 731 selects the judgment condition; condition box 732 sets the relation between the judgment subject and the comparison subject; control 733 selects that relation; condition box 734 sets the comparison subject, for example the local language. Together, condition boxes 730, 732, and 734 complete the judgment condition setting; for example, judgment condition 2 shown in user interface 70B is "the language of connection point 707 does not contain the local language". Condition result selection box 735 sets the condition result; for example, if the condition "the language of connection point 707 does not contain the local language" is met, the data entity of connection point 707 is transferred to connection point 709 through the connection line between connection points 707 and 709. Control 736 selects the condition result.
After all components in FIG. 7A are connected, the electronic device can detect the user's operation of compiling the connected components into executable code. This user operation can take many forms, for example the user clicking compile control 718 in user interface 700 in FIG. 7A, without limitation here. In response to this user operation, code generation engine 103 compiles the components into the executable program code of the APP the user created.

Specifically, code generation engine 103 can obtain the IDs and names of the multiple orchestrated components in component orchestration designer 102, their connection points, and the connection attributes between connection points. For example, for output connection point 506 of the decomposition component and input connection point 602 of the video playback component shown in FIG. 6A, the data entity type transferred between the two connection points can be video data of file type MPEG4 in 1920*1080 format; data is passed directly between these two connection points, with no judgment or loop logic. The code generation engine 103 then generates executable code from the obtained component information, the connection attributes between connection points, and the corresponding component invocation template. The component invocation template is a code template preset according to the different component types and the attributes of connection points, and it encapsulates the generic interface-invocation code logic. Understandably, the invocation templates of video components and audio components can differ; and different data transfer modes between two connection points — direct transfer, or conditional transfer that occurs only when a judgment condition is met — can also correspond to different component invocation templates. For details, refer to the introduction of the component invocation template above; the specific code in the component invocation template is not limited here.
In some examples, the electronic device receives a user operation for displaying the component call tree of a first component; in response to this user operation, component orchestration designer 102 starts the intelligent orchestration function. Based on this function, the designer's interface can display the first component's component call tree. The component call tree shows all matching second and/or third components of the first component, the fourth and/or fifth components matching the second component, and so on up to an Nth component matching an Mth component, where the Nth component is a component with no output connection point, and M and N are positive integers. Specifically, component orchestration designer 102 finds the matching second and/or third components according to the data entity types supported by the first component's output connection points; that is, the entity types supported by the input connection points of the second component and the third component match those supported by the first component's output connection points. Then, in turn, the designer finds the fourth and/or fifth components matching the second component according to the entity types supported by the second component's output connection points; the entity types supported by the input connection points of the fourth component and the fifth component match those supported by the second component's output connection points. Component orchestration designer 102 finds the matching components of every component in the call tree, until the last layer of the call tree contains components with no output connection points.

The operation for displaying the first component's component call tree can take many forms: for example, the user can click a control in the user interface for displaying the call tree, or the user can set conditions in the page for displaying the call tree, for example requiring the components in the call tree to implement one or more functions (such as video decoding or video playback); this is not limited here.

In response to this user operation, the electronic device displays user interface 80A shown in FIG. 8A, with component 1's call tree. Component 1 can connect to component 2 and component 3; that is, component 1 matches component 2, and component 1 matches component 3. Component 2 can connect to component 4 and component 5; that is, component 2 matches component 4, and component 2 matches component 5. Component 3 can connect to component m; that is, component 3 matches component m. Components 4, 5, and m can connect to component n; that is, component 4 matches component n, component 5 matches component n, and component m matches component n. The user can select part of the components in the call tree, for example components 2, 4, and n, and then remove or delete the other, unneeded components; for example, the user can right-button double-click a component to delete it, and the removal or deletion method is not limited.
Table 2

(The contents of Table 2 are provided as an image in the original publication: Figure PCTCN2021098215-appb-000005.)

Table 2 exemplarily shows the data entity types supported by the connection points of components 1, 2, 3, and m. In Table 2, "MPEG4, 1920*1080, Chinese" means the entity supported by the connection point is video data in MPEG4 format, with a resolution of 1920*1080 and Chinese as the supported language. "MP3, 112800, Chinese" means the entity supported by the connection point is audio data in MP3 format, with a sampling rate of 112800 Hz and Chinese as the supported language. As Table 2 shows, the entity types supported by component 2's input connection point include those supported by component 1's output connection point; therefore component 1's output connection point can connect to component 2's input connection point, and components 1 and 2 match. The entity types supported by component 3's input connection point include those supported by component 1's output connection point; therefore component 1's output connection point can connect to component 3's input connection point, and components 1 and 3 match. The entity types supported by component 4's input connection point include those supported by component 2's output connection point 2; therefore component 4's input connection point can connect to component 2's output connection point 2, and components 2 and 4 match. The entity types supported by component 5's input connection point include those supported by component 2's output connection point 2; therefore component 5's input connection point can connect to component 2's output connection point 2, and components 5 and 2 match. The entity types supported by component m's input connection point include those supported by component 3's output connection point; therefore component 3's output connection point can connect to component m's input connection point, and components 3 and m match. The entity types supported by component n's input connection point include those supported by the output connection points of components 4, 5, and m; therefore component n's input connection point can connect to the output connection points of components 4, 5, and m, and component n matches each of components 4, 5, and m.

Understandably, the data entity types supported by the connection points of the components shown in Table 2 are only examples. The connection points of the components in FIG. 8A can support any of the data entity types shown in Table 1; the embodiments of this application place no limitation on the data entity types the connection points of the components in FIG. 8A can support.
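The call-tree expansion described above (repeatedly finding components whose input-point entity types include the current component's output-point types, stopping at components with no output points) can be sketched recursively. The dictionary encoding and component names are illustrative assumptions, not the designer's real data model:

```python
def matches(upstream: dict, candidate: dict) -> bool:
    """A candidate matches when its input types share an entity type
    with the upstream component's output types (rule from the text)."""
    return bool(upstream["outputs"] & candidate["inputs"])


def build_call_tree(component: dict, toolbox: list) -> dict:
    """Expand matching components layer by layer; components with no
    output connection points terminate the recursion (sketch)."""
    children = [c for c in toolbox if c is not component and matches(component, c)]
    return {"name": component["name"],
            "children": [build_call_tree(c, toolbox) for c in children]}


# A tiny hypothetical toolbox (acyclic, like the tree in FIG. 8A)
toolbox = [
    {"name": "component2", "inputs": {"video/MPEG4"}, "outputs": {"video/MPEG4"}},
    {"name": "componentN", "inputs": {"video/MPEG4"}, "outputs": set()},
]
component1 = {"name": "component1", "inputs": set(), "outputs": {"video/MPEG4"}}
tree = build_call_tree(component1, toolbox)
```

Here component1 matches both component2 and componentN, and component2 in turn matches componentN, which has no output connection points and so ends its branch.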
Exemplarily, as shown in FIG. 8B, user interface 80B can display the video decomposition component, which can include one body (decompose video 503), input connection point 504, and output connection points 505 and 506. The user does not know which components in component toolbox 101 can connect to output connection point 506 of the video decomposition component. The user can click intelligent orchestration control 508 in user interface 80B.

In response to this user operation, the electronic device displays user interface 80C shown in FIG. 8C, which displays the call tree of the video decomposition component. As shown in FIG. 8C, the call tree of the video decomposition component can include one video playback component, comprising one body (play video 717) and input connection point 715. When the user clicks intelligent orchestration control 508, component orchestration designer 102 can search component toolbox 101, according to the entity types of output connection point 506, for components matching the output connection point (for example, the video playback component), and display them in the designer's interface (for example, user interface 80C).

In a possible implementation, in response to this user operation, the electronic device can also display user interface 80D shown in FIG. 8D, which displays the call tree of the video decomposition component. As shown in FIG. 8D, the call tree of the video decomposition component can include one HD enhancement component and one video playback component. The HD enhancement component includes one component body (HD enhancement 711), input connection point 708, and output connection point 713. The video playback component includes one component body (play video 717) and input connection point 715. Output connection point 506 can connect to input connection point 708, and output connection point 713 can connect to input connection point 715.

In a possible implementation, the user can select input connection point 504 of the video decomposition component for intelligent orchestration, for example right-clicking input connection point 504 and then selecting the intelligent orchestration control; in response to the user operation, component orchestration designer 102 can display components able to connect to input connection point 504, for example read video component 702 shown in FIG. 7A.

In this way, when the user does not know which specific component a given component can connect to, other components able to connect to it can be found quickly, saving the user's time.
下面结合附图对本申请实施例提供的一种APP开发方法进行介绍。图9为本申请实施例提供的一种APP开发方法的流程示意图。请参见图9,本申请实施例提供的一种APP开发方法具体包括:
S101、电子设备响应于用户创建第一APP的操作,创建第一APP。
电子设备可以检测到用户创建第一APP的操作。用户创建第一APP的操作可以有很多种。例如,用户打开电子设备中提供创建APP的用户界面,然后在该用户界面中创建APP。此处关于创建APP的用户界面可参见图4中示出的用户界面400。用户如何在用户界面中创建APP可参见上文对图4的描述。此处不再赘述。
S102、响应于用户从电子设备的组件工具箱中选择组件的操作,电子设备在组件编排设计器中显示组件,组件为实现特定功能的独立模块;组件由一个组件主体和一个或多个连接点构成,连接点支持一种或多种数据实体类型。
用户从组件工具箱中选择组件的操作可以有多种,例如,用户拖动组件工具箱中的组件,或者用户鼠标双击组件工具箱中的组件。此处对该用户操作不作限定。
组件的连接点支持一种或多种数据实体类型。数据实体类型可以参考上文对表1的介绍,此处不再赘述。这里,开发人员在开发组件时,可以用程序代码对组件的连接点进行定义。例如,定义组件中连接点支持的数据实体类型,以及连接点中包含的函数。这里,开发人员可以定义所有组件的连接点支持相同的多种函数,例如用于创建数据实体的create函数、用于连接其他连接点的connect函数、用于接收数据的accept函数、用于将接收的数据存储到本地的pull函数、用于发送数据的push函数等等。
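作为示意,下面用Python给出连接点定义的一个最小草图,展示create、connect、accept、pull、push等函数的分工。其中ConnectPoint类名、用字典表示数据实体等均为本文说明而作的假设,并非对本申请实现方式的限定:

```python
class ConnectPoint:
    """连接点:声明其支持的数据实体类型,并提供统一的函数集(示意)。"""

    def __init__(self, supported_types):
        self.supported_types = set(supported_types)  # 连接点支持的数据实体类型
        self.peer = None                             # 相连的对端连接点
        self.buffer = []                             # 本地存储的已接收数据

    def create(self, payload):
        # create:创建数据实体(这里简单地用字典表示)
        return {"type": next(iter(self.supported_types)), "payload": payload}

    def connect(self, other):
        # connect:连接其他连接点,要求双方支持的数据实体类型有交集
        if not self.supported_types & other.supported_types:
            raise ValueError("数据实体类型不匹配,连接失败")
        self.peer, other.peer = other, self

    def accept(self, entity):
        # accept:接收对端发来的数据实体
        self.buffer.append(entity)

    def pull(self):
        # pull:取出本地存储的已接收数据
        return self.buffer.pop(0) if self.buffer else None

    def push(self, entity):
        # push:向对端连接点发送数据实体
        if self.peer is None:
            raise RuntimeError("连接点尚未连接")
        self.peer.accept(entity)
```

例如,先调用connect建立两个连接点的连接,再通过push/pull在连接点之间传递数据实体;当双方支持的数据实体类型没有交集时,connect会直接失败。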
在一种可能的实现方式中,响应于用户从电子设备的组件工具箱中选择组件的操作,电子设备显示多个组件的图形具体包括:响应于用户从电子设备的组件工具箱中选择组件的操作,电子设备获取组成第一APP的多个组件的第一文件,解析第一文件后绘制出组件的图形;显示多个组件的图形,第一文件为描述组件功能以及属性的程序代码。用户从电子设备的组件工具箱中选择组件的操作用于选定多个组件,组件的图形用于展示组件的主体和连接点,组件包含一个主体、一个或多个连接点,例如图5A中示出的分解视频组件的图形。用户从电子设备的组件工具箱中选择组件的操作可以是将组件工具箱101中的组件拖到组件编排设计器102中。用户从电子设备的组件工具箱中选择组件的操作还可以是单击或双击选定的组件等等,本申请实施例对该用户操作不作限定。
在本申请实施例的一个可行的例子中,电子设备可接收用户从工具箱拖拽组件的用户操作。响应于该用户操作,电子设备获取该组件的文件,并解析组件的文件。然后电子设备绘制出组件的图形。具体地,电子设备可通过电子设备中的组件编排设计器102获取组件的文件,解析组件的文件以及绘制出组件的图形。这里可参考上文对图5A的描述,此处不再赘述。
这里,用户会根据自己所开发APP的需求来选择组件。例如,一个可以支持多国语言的视频播放APP,这个APP可以由读取视频文件的组件、分解视频组件、语音翻译组件、高清增强组件、音频播放组件、视频播放组件等组件组成,例如图7A示出的组成视频APP的组件。用户从组件工具箱101中选定组件后,电子设备就获取组件的第一文件。电子设备可以从第一文件中解析出组件包含连接点的数量,以及连接点属性、连接点能传递的数据实体的属性等等。电子设备根据第一文件解析出的组件包含的连接点数量将组件绘制成可视化的组件图形。用户可以在用户界面中看到组件图形,如图5A示出的分解视频组件的组件图形。
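作为示意,下面用Python勾勒“解析第一文件并绘制组件图形”这一过程。这里假设第一文件采用JSON格式,字段名name、conn_points、entity_types等均为示例性假设,并非本申请限定的文件格式:

```python
import json

def parse_component_file(text):
    """解析描述组件功能及属性的第一文件(此处假设为JSON格式)。

    返回组件名称和连接点列表(含方向与支持的数据实体类型),
    供组件编排设计器绘制组件图形使用。
    """
    doc = json.loads(text)
    return {
        "name": doc["name"],
        "conn_points": [
            {
                "name": cp["name"],
                "direction": cp["direction"],                # input / output
                "entity_types": cp.get("entity_types", []),  # 连接点能传递的数据实体类型
            }
            for cp in doc.get("conn_points", [])
        ],
    }

def draw_component(comp):
    # 根据解析结果绘制组件图形:一个主体加若干连接点(此处仅以文本示意)
    lines = [f"[{comp['name']}]"]
    for cp in comp["conn_points"]:
        arrow = "->" if cp["direction"] == "output" else "<-"
        lines.append(f"  {arrow} {cp['name']}: {','.join(cp['entity_types'])}")
    return "\n".join(lines)
```

解析出的连接点数量与属性,即对应上文所述电子设备从第一文件中解析出并用于绘制可视化组件图形的信息。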
在一种可能的实现方式中,电子设备接收用于显示第一组件的组件调用树的用户操作,响应于该用户操作,电子设备显示出第一组件的组件调用树;组件调用树用于展示与第一组件匹配的所有第二组件和/或第三组件、以及与第二组件匹配的第四组件和/或第五组件、一直到与第M组件匹配的第N组件,且第N组件为无输出连接点的组件。具体地,电子设备根据第一组件的输出连接点支持的数据实体类型,查找到相匹配的第二组件和/或第三组件。即第二组件、第三组件的输入连接点支持的数据实体类型与第一组件输出连接点支持的数据实体类型匹配。然后,电子设备再依次根据第二组件输出连接点支持的数据实体类型,查找与第二组件相匹配的第四组件和/或第五组件。第四组件和第五组件的输入连接点支持的数据实体类型与第二组件输出连接点支持的数据实体类型相匹配。电子设备会查找出组件调用树中每一个组件相匹配的组件,直到组件调用树的最后一层组件为无输出连接点。具体可以参考图8A中对组件1的调用树的描述,此处不再赘述。
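上述逐层查找匹配组件、直到无输出连接点组件为止的过程,可以用如下Python草图表示。其中组件用字典表示、以输入/输出类型集合做匹配,均为说明性假设:

```python
def build_call_tree(first, toolbox):
    """根据组件输出连接点支持的数据实体类型,逐层查找匹配组件,构造组件调用树。

    组件用字典表示:{"name": ..., "in": 输入类型集合, "out": 输出类型集合},
    "out" 为空集合表示无输出连接点,即调用树的最后一层。
    """
    def matches(upstream, candidate):
        # 上游输出类型非空且被候选组件输入连接点支持的类型包含,即为匹配
        return upstream["out"] and upstream["out"] <= candidate["in"]

    tree = {first["name"]: []}
    frontier = [first]
    while frontier:
        comp = frontier.pop(0)
        for cand in toolbox:
            if matches(comp, cand) and cand["name"] not in tree:
                tree[cand["name"]] = []
                tree[comp["name"]].append(cand["name"])
                frontier.append(cand)  # 无输出连接点的组件在后续扩展中不会再产生子节点
    return tree
```

返回的字典即组件调用树:键为组件名,值为该组件匹配到的下一层组件列表。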
显示组件调用树的用户操作可以有很多种,示例性地,如图8C中所示,用户可以点击智能编排控件508。本申请实施例对显示组件调用树的用户操作不作限定。当用户不知道组件工具箱101中的哪一个组件与组件1相匹配时,用户可以选中组件1的输出连接点,然后点击右键,选择智能编排。电子设备响应于用户点击智能编排的操作,显示出组件1的组件调用树。这样,用户可以在组件调用树中快速选择开发APP所需要的组件。因而,可以节约用户的时间,提升用户体验。
在一种可能的实现方式中,电子设备在组件编排设计器中显示出第一组件的组件调用树,具体包括:电子设备根据第一组件的功能和/或第一组件连接点支持的数据实体类型,在组件编排设计器中显示出第一组件的组件调用树。
在一种可能的实现方式中,响应于用户删除组件的操作,电子设备删除所述组件调用树中的组件。
在一种可能的实现方式中,响应于用户上传组件或从组件市场下载组件的操作,电子设备在组件工具箱中显示用户上传或从组件市场下载的组件的名称。
在一种可能的实现方式中,响应于用户查看第一组件的第一连接点属性的操作,电子设备在组件编排设计器中显示第一连接点支持的数据实体类型。
S103、响应于用户连接多个组件的操作,电子设备在组件编排设计器中连接两个或多个组件。
电子设备可以接收用户连接多个组件的操作,用户连接多个组件的操作可以有很多种。示例性地,用户连接多个组件的操作可以是用户将第二组件的输出连接点拖向第一组件的输入连接点。用户连接多个组件的操作还可以是用户将第二组件向第一组件的方向滑动。或者,用户连接多个组件的操作还可以是用户输入第一组件的输出连接点和第二组件的输入连接点。本申请实施例对该用户操作不作限定。这里可以参考上文对图6A的描述,此处不再赘述。
在一种可能的实现方式中,电子设备在组件编排设计器中连接两个或多个组件,具体包括:响应于用户连接第一组件和第二组件的操作,电子设备通过组件编排设计器验证第一组件和第二组件是否匹配;若第一组件和第二组件匹配,则电子设备连接第一连接点和第二连接点,第一连接点为第一组件的连接点,第二连接点为第二组件的连接点。
在一种可能的实现方式中,响应于用户连接多个组件的操作,电子设备会执行将第一组件和第二组件建立连接的过程。首先,电子设备会获取第一组件的输出连接点支持的数据实体类型,以及第二组件的输入连接点支持的数据实体类型。电子设备确定第一组件的输出连接点输出的数据实体的类型和第二组件的输入连接点的数据实体的类型相匹配时,电子设备会将第一组件和第二组件建立连接。即第一组件的输出连接点支持的数据实体和第二组件的输入连接点支持的数据实体的类型相同,或第一组件的输出连接点支持的数据实体类型包括第二组件的输入连接点支持的数据实体类型,或第二组件的输入连接点支持的数据实体类型包括第一组件的输出连接点支持的数据实体类型时,第一组件才可以和第二组件建立连接。这里可以参考上文对图6A的描述,此处不再赘述。
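上述“类型相同、或一方包括另一方”的匹配规则,可以用如下Python草图表达(函数名is_match为说明而假设的命名):

```python
def is_match(out_types, in_types):
    """判断第一组件输出连接点与第二组件输入连接点是否匹配。

    按上文规则:二者支持的数据实体类型相同、前者包括后者、
    或后者包括前者时,两个组件才可以建立连接。
    """
    out_s, in_s = set(out_types), set(in_types)
    return out_s == in_s or out_s >= in_s or out_s <= in_s
```

注意,按该规则,双方仅有部分交集(谁也不包括谁)时并不匹配,此时电子设备可显示“数据实体的类型不匹配”等提示框。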
举例来说,如表3所示,第一组件输出连接点支持数据实体类型为MPEG格式、尺寸为640*480的视频数据。第二组件输入连接点支持数据实体类型为MPEG格式、尺寸为640*480、1080P、2K的视频数据。
表3
连接点:支持的数据实体类型
第一组件的输出连接点:视频数据,MPEG格式,640*480
第二组件的输入连接点:视频数据,MPEG格式,640*480、1080P、2K
这样,第一组件的输出连接点支持的数据类型和第二组件的输入连接点支持的数据类型相匹配。电子设备将第一组件和第二组件建立连接。这里数据实体的类型可以参考表1。
在一种可能的实现方式中,电子设备确定两个组件可以建立连接后会展示连接成功标识。连接成功标识可以是两个组件之间的连接线(Connection)。例如,如图6A示出的用户界面60A。用户界面60A中示出的分解视频组件和播放视频组件建立连接后,电子设备绘制出连接线,该连接线用于连接分解视频组件输出连接点(Video Output ConnectPoint)和播放视频输入连接点(Input ConnectPoint2)。开发人员按照APP的逻辑流程,一个一个地从组件工具箱101中选定组件并拖动到组件编排设计器102中。电子设备响应于用户操作依次将组成APP的所有组件建立连接。
在一种可能的实现方式中,连接成功标识可以是两个组件相连的两个连接点折叠或重叠。这里可参考上文对图6B的描述,此处不再赘述。
在一种可能的实现方式中,若第一组件的输出连接点输出的数据实体的类型和第二组件的输入连接点的数据实体的类型不匹配,电子设备显示提示框,提示框用于提示用户第一组件和第二组件连接失败。提示框的提示内容可以有很多种。示例性地,提示框的内容可以是“连接失败”。提示框的内容还可以是“数据实体的类型不匹配”。提示框的内容还可以是“第一组件和第二组件不能连接”等等,此处对提示框的具体内容不作限定。
S104、响应于用户选择编译的两个或多个组件的操作,电子设备将完成连接的多个组件编译成第一APP的程序代码。
当组成第一APP的组件都已经完成连接后,用户可以点击用于将连接完成的组件生成程序代码的控件。电子设备可以检测到用户点击该控件的操作。响应于该操作,电子设备将完成连接的多个组件编译成第一APP的程序代码。可以理解的是,第一APP的程序代码用于描述第一APP的逻辑功能以及用户界面。当用户的电子设备中安装有第一APP的程序代码时,电子设备可以运行第一APP。
具体地,电子设备可以通过代码生成引擎103将组件编排设计器102中连接的组件生成第一APP的程序代码。
在一种可能的实现方式中,电子设备在组件编排设计器中保存完成连接的两个或多个组件的编排模型图,以及编排模型图中的第一信息;第一信息包括两个或多个组件的ID、名称、两个或多个组件的连接点支持的数据实体类型中的一项或多项。
在一种可能的实现方式中,电子设备在代码生成引擎中,根据编排模型图、第一信息和组件调用模板生成第一APP可执行的源代码,组件调用模板中包括预设格式的程序代码。具体可参考上文中对组件调用模板的描述,此处不再赘述。
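作为示意,下面用Python勾勒“根据编排模型图、第一信息和组件调用模板生成源代码”的过程。其中模板内容、字段名(components、links、load_component等)均为示例性假设,并非本申请限定的组件调用模板:

```python
from string import Template

# 组件调用模板:包括预设格式的程序代码(模板内容为示意性假设)
CALL_TEMPLATE = Template("$var = load_component(id=$cid, name='$name')\n")
LINK_TEMPLATE = Template("$src.connect($dst)\n")

def generate_source(model):
    """根据编排模型图和第一信息(组件ID、名称、连接关系)生成第一APP的源代码文本。"""
    code = []
    # 为编排模型图中的每个组件套用组件调用模板
    for i, comp in enumerate(model["components"]):
        code.append(CALL_TEMPLATE.substitute(var=f"c{i}",
                                             cid=comp["id"],
                                             name=comp["name"]))
    # 为每条连接关系生成建立连接的代码
    for src, dst in model["links"]:
        code.append(LINK_TEMPLATE.substitute(src=f"c{src}", dst=f"c{dst}"))
    return "".join(code)
```

生成的文本仅用于示意“按模板填充第一信息”的思路,其中load_component、connect等生成出的调用为假设的目标API。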
本申请实施例提供的一种APP开发方法,通过该方法,用户可以将工具箱中的多个组件连接组成APP。在多个组件连接的过程中,电子设备需要确定相连的两个组件的连接点支持的数据实体类型是否匹配。若匹配,电子设备可以显示连接成功标识。最后,电子设备将完成连接的多个组件生成APP的程序代码。这样,用户可以通过已有组件快速开发APP,缩短用户开发APP的时间。
下面介绍根据本申请实施例提供的一种APP开发方法开发的APP的应用场景。图10示出了本申请开发出的APP的应用场景。如图10所示,以一款由通信组件、存储组件、播放组件、摄像组件、手势输入组件、音频播放组件、媒体下载加速组件组成的APP为例进行阐述。用户A具有手机、TV、PC、路由、音箱、手表、车机等电子设备。用户的所有电子设备均连接了同一个无线网络(例如,家庭中的Wi-Fi网络)。用户的手机中安装了包含通信组件、存储组件、播放组件、摄像组件、手势输入组件、音频播放组件、媒体下载加速组件的APP。PC中安装了存储组件。路由器中安装了媒体下载加速组件。音箱中安装有音频播放组件。手表中安装了手势输入组件。车机安装了摄像组件。TV安装了播放组件。那么,当用户在手机中运行该APP时,可以由手机来执行通信组件的功能,选择TV执行播放组件的功能来播放该APP中的视频,选择PC来运行存储组件的功能来存储该APP中的数据,选择路由器来执行媒体下载加速组件来加速下载该APP中的媒体文件,选择音箱执行音频播放的功能来播放该APP中的音频,选择手表执行手势输入组件来输入手势控制该APP,选择车机执行摄像组件的功能来拍摄该APP需要的图像或视频。这样,组成APP的组件可以分别运行在不同的电子设备上,用户可以使用不同电子设备来执行该APP的多个功能。这样,各个电子设备可以发挥自身的长处(例如,TV的显示屏比手机的显示屏大,手机通信更便捷等等),使得用户使用该APP时体验更佳。
本申请实施例中工具箱以及开发者根据本申请实施例提供的方法开发的复合组件、APP可以形成分布式的组件开发生态。图11示出了本申请实施例提供的组件开发生态示意图。如图11所示,组件开发者可以从组件市场中查询和调用组件。组件开发者还可以从已有的应用程序中归纳和提炼出组件。组件开发者可以将开发的组件或者从应用程序中归纳提炼出的组件上传到组件市场。这样就形成了一个组件开发生态。组件开发者可以极其便利地利用组件市场中的组件来开发APP。上文中的组件工具箱101可以从组件市场中下载组件。这样可以更新扩充组件工具箱101中的组件。
在一种可能的实现方式中,电子设备根据用户输入的领域描述语言(Domain Specific Language,DSL)调用各个组件以及将组件进行连接。图12示出了组件DSL语言与组件的对应关系示意图。如图12所示,图中的DSL语言(Comp1;ConnPoint1)的含义是有一个组件1,组件1包括连接点1。这就对应着组件图形中的组件1和连接点1。DSL语言(Comp2;ConnPoint2)的意思是有一个组件2,这个组件2包括连接点2。这就对应着组件图形中的组件2和连接点2。DSL语言(link entity1,entity2)表明连接点1支持数据实体1,连接点2支持数据实体2。若连接点1和连接点2相连,那么连接点1需要查看数据实体1和数据实体2是否相同。同样地,连接点2也需要查看数据实体1和数据实体2是否相同。这样,当用户对组件的DSL语言非常熟悉时,用户可以不按照图9示出的APP开发方法流程来开发APP。用户可以在电子设备中编写DSL来开发APP。这样,用户可以更高效地开发APP。
举例来说,用户可以直接利用DSL语言将图6A示出的分解视频组件和播放视频组件进行连接。即不用执行图4-图5G中的操作。分解视频组件和播放视频组件进行连接的DSL语言可以如下:
Compomentdef
Videosplit://分解视频组件
Connpoint1.entity1(video,MPEG4,1920*1080)//分解视频组件的连接点1和该连接点支持的数据实体1
Videoplayer://视频播放组件
Connpoint2.entity2(video,MPEG4,1920*1080)//视频播放组件的连接点2和该连接点支持的数据实体2
Link entity1,entity2//建立连接点1和连接点2之间的连接
在上述示出的DSL代码中,“Videosplit”即可表示图6A中的分解视频组件,“Connpoint1”对应图6A中的输出连接点506。“entity1(video,MPEG4,1920*1080)”表示输出连接点506支持的数据实体类型。“Videoplayer”即可表示图6A中的视频播放组件,“Connpoint2”对应图6A中的输入连接点602。“entity2(video,MPEG4,1920*1080)”表示输入连接点602支持的数据实体类型。
在一种可能的实现方式中,用户可以用DSL语言写出判断条件,例如图7A中示出的连接点707可以和连接点709连接,连接点707也可以和连接点714连接。用户可以对连接点707设置判断条件,该判断条件是连接点707支持的语言是否和电子设备的本地语言一致。若是,则分解视频组件的输出通过输出连接点707输入音频播放组件的输入连接点714。若否,则分解视频组件的输出通过输出连接点707输入语言翻译组件710的输入连接点709。对于连接点707的判断条件可以通过如下DSL语言实现:
Condition[1]://条件1
State type:if propertyname[language]contain[local language]//如果连接点支持的语言属性中包括本地语言
Result type:connect-branch1//连接线1
Condition[2]://条件2
State type:if propertyname[language]not contain[local language]//如果连接点支持的语言属性中不包括本地语言
Result type:connect-branch2//连接线2
上述DSL语言中,connect-branch1可以表示连接点707与连接点714之间的连接线。connect-branch2可以表示连接点707与连接点709之间的连接线。可以理解的是,在上述DSL语言之前,用户可以定义connect-branch1以及connect-branch2具体代表哪两个连接点之间的连接线。此处,DSL语言仅为示例,本申请对DSL语言实现不做限定。
在使用DSL时,用户可以只写出两个组件中需要建立连接的连接点。如图6A中的分解视频组件的输入连接点504和输出连接点505可以不用写出来。这样,用户只需要输入几行代码就可以实现组件之间的连接,节约了用户的时间。
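按照上文示例中的DSL写法,可以勾勒出一个极简的DSL解析器。语法规则完全按上文示例假设(组件名以冒号结尾、连接点写作“连接点.数据实体(类型列表)”、Link语句建立连接),函数名parse_dsl等均为说明性命名:

```python
import re

def parse_dsl(text):
    """解析上文示例样式的组件DSL:组件定义、连接点及其数据实体、Link语句。"""
    components, entities, links = {}, {}, []
    current = None
    for raw in text.splitlines():
        line = raw.split("//")[0].strip()            # 去掉//注释
        if not line or line == "Compomentdef":
            continue
        m = re.match(r"(\w+)\.(\w+)\(([^)]*)\)", line)
        if m:                                        # 连接点及其支持的数据实体
            entities[m.group(2)] = {"comp": current,
                                    "point": m.group(1),
                                    "types": m.group(3).split(",")}
            components.setdefault(current, []).append(m.group(1))
            continue
        m = re.match(r"link\s+(\w+)\s*,\s*(\w+)", line, re.IGNORECASE)
        if m:                                        # Link语句:数据实体一致才建立连接
            e1, e2 = entities[m.group(1)], entities[m.group(2)]
            if e1["types"] == e2["types"]:
                links.append((e1["point"], e2["point"]))
            continue
        if line.endswith(":"):                       # 组件定义,如 Videosplit:
            current = line[:-1]
    return components, entities, links
```

解析结果中,components记录每个组件声明的连接点,links记录数据实体类型一致时建立的连接。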
接着介绍本申请以下实施例中提供的示例性电子设备100。
图13示出了电子设备100的结构示意图。
下面以电子设备100为例对实施例进行具体说明。应该理解的是,电子设备100可以具有比图中所示的更多的或者更少的部件,可以组合两个或多个的部件,或者可以具有不同的部件配置。图中所示出的各种部件可以在包括一个或多个信号处理和/或专用集成电路在内的硬件、软件、或硬件和软件的组合中实现。
电子设备100可以包括处理器101,存储器102,收发器103,显示屏104、传感器105等,其中:
处理器101可以用于获取组件的连接点支持的数据实体类型,以及判断两个组件的连接点支持的数据实体类型是否匹配,还用于根据用户操作查找与组件的输出连接点支持的数据实体类型相匹配的组件。
在一些实施例中,处理器101可以包括一个或多个处理单元,例如:处理器101可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
其中,控制器可以是电子设备100的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器101中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器101中的存储器为高速缓冲存储器。该存储器可以保存处理器101刚用过或循环使用的指令或数据。如果处理器101需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器101的等待时间,因而提高了系统的效率。
在一些实施例中,处理器101可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
I2C接口是一种双向同步串行总线,包括一根串行数据线(serial data line,SDA)和一根串行时钟线(serial clock line,SCL)。在一些实施例中,处理器101可以包含多组I2C总线。处理器101可以通过不同的I2C总线接口分别耦合触摸传感器,充电器,闪光灯,摄像头193等。例如:处理器101可以通过I2C接口耦合触摸传感器,使处理器101与触摸传感器通过I2C总线接口通信,实现电子设备100的触摸功能。
I2S接口可以用于音频通信。在一些实施例中,处理器101可以包含多组I2S总线。处理器101可以通过I2S总线与音频模块170耦合,实现处理器101与音频模块170之间的通信。在一些实施例中,音频模块170可以通过I2S接口向无线通信模块传递音频信号,实现通过蓝牙耳机接听电话的功能。
PCM接口也可以用于音频通信,将模拟信号抽样,量化和编码。在一些实施例中,音频模块170与无线通信模块可以通过PCM总线接口耦合。在一些实施例中,音频模块170也可以通过PCM接口向无线通信模块传递音频信号,实现通过蓝牙耳机接听电话的功能。所述I2S接口和所述PCM接口都可以用于音频通信。
UART接口是一种通用串行数据总线,用于异步通信。该总线可以为双向通信总线。它将要传输的数据在串行通信与并行通信之间转换。在一些实施例中,UART接口通常被用于连接处理器101与无线通信模块。例如:处理器101通过UART接口与无线通信模块中的蓝牙模块通信,实现蓝牙功能。在一些实施例中,音频模块170可以通过UART接口向无线通信模块传递音频信号,实现通过蓝牙耳机播放音乐的功能。
MIPI接口可以被用于连接处理器101与显示屏104,摄像头193等外围器件。MIPI接口包括摄像头串行接口(camera serial interface,CSI),显示屏串行接口(display serial interface,DSI)等。在一些实施例中,处理器101和摄像头193通过CSI接口通信,实现电子设备100的拍摄功能。处理器101和显示屏104通过DSI接口通信,实现电子设备100的显示功能。
GPIO接口可以通过软件配置。GPIO接口可以被配置为控制信号,也可被配置为数据信号。在一些实施例中,GPIO接口可以用于连接处理器101与摄像头193,显示屏104,无线通信模块,音频模块170,传感器模块180等。GPIO接口还可以被配置为I2C接口,I2S接口,UART接口,MIPI接口等。
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口130可以用于连接充电器为电子设备100充电,也可以用于电子设备100与外围设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接其他电子设备,例如AR设备等。
可以理解的是,本发明实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备100的结构限定。在本申请另一些实施例中,电子设备100也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
存储器102可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器101通过运行存储在存储器102的指令,从而执行电子设备100的各种功能应用以及数据处理。存储器102可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储电子设备100使用过程中所创建的数据(比如音频数据,电话本等)等。此外,存储器102可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
收发器103可以用于与网络设备、其他电子设备进行通信。电子设备100可以通过收发器103上传或者下载组件。在一些实施例中,收发器103可以包括移动通信模块(图中未示出)和无线通信模块(图中未示出),其中:
移动通信模块可以提供应用在电子设备100上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块可以由天线1接收电磁波,并对接收的电磁波进行滤波, 放大等处理,传送至调制解调处理器进行解调。移动通信模块还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块的至少部分功能模块可以被设置于处理器101中。在一些实施例中,移动通信模块的至少部分功能模块可以与处理器101的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器,受话器等)输出声音信号,或通过显示屏104显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器101,与移动通信模块或其他功能模块设置在同一个器件中。
无线通信模块可以提供应用在电子设备100上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器101。无线通信模块还可以从处理器101接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
电子设备100通过GPU,显示屏104,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏104和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器101可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏104可以用于显示组件的图形,以及组件工具箱、组件编排设计器等。显示屏104包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极管(active-matrix organic light emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,电子设备100可以包括1个或N个显示屏104,N为大于1的正整数。
传感器105可以用于检测用户操作,例如,用户拖动组件的操作,用户滑动组件的操作等等。传感器105可以包括压力传感器和触摸传感器,其中:
压力传感器用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器可以设置于显示屏104。压力传感器的种类很多,如电阻式压力传感器,电感式压力传感器,电容式压力传感器等。电容式压力传感器可以是包括至少两个具有导电材料的平行板。当有力作用于压力传感器,电极之间的电容改变。电子设备100根据电容的变化确定压力的强度。当有触摸操作作用于显示屏104,电子设备100根据压力传感器检测所述触摸操作强度。电子设备100也可以根据压力传感器的检测信号计算触摸的位置。在一些实施例中,作用于相同触摸位置,但不同触摸操作强度的触摸操作,可以对应不同的操作指令。例如:当有触摸操作强度小于第一压力阈值的触摸操作作用于短消息应用图标时,执行查看短消息的指令。当有触摸操作强度大于或等于第一压力阈值的触摸操作作用于短消息应用图标时,执行新建短消息的指令。
触摸传感器,也称“触控面板”。触摸传感器可以设置于显示屏104,由触摸传感器与显示屏104组成触摸屏,也称“触控屏”。触摸传感器用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏104提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器也可以设置于电子设备100的表面,与显示屏104所处的位置不同。
图14为本申请实施例提供的一种电子设备200的示意性框图。如图14所示,该电子设备200可以包括检测单元201、处理单元202,显示单元203。其中,
检测单元201,用于检测电子设备200接收到的用户操作,例如用户从组件工具箱中拖动组件,用户将第二组件的输入连接点拖向第一组件的输出连接点等等。
处理单元202,用于响应于检测单元201检测到的用户操作,获取组件的连接点支持的数据实体类型、确定第一组件的输出连接点与第二组件的输入连接点匹配。
显示单元203,用于显示组件的图形,以及组件的连接点支持的数据实体类型、两个组件连接成功的标识。
本申请实施例的电子设备200中各单元和上述其它操作或功能分别为了实现APP开发方法中由电子设备执行的相应流程,此处不再赘述。
所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,上述描述的系统、装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请所提供的几个实施例中,应该理解到,所揭露的系统、装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。
所述功能如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)或处理器(processor)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、ROM、RAM、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,以上实施例仅用以说明本申请的技术方案,而非对其限制;尽管参照前述实施例对本申请进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本申请各实施例技术方案的范围。

Claims (27)

  1. 一种应用程序APP开发平台,其特征在于,应用于电子设备,所述APP开发平台包括组件工具箱,组件编排设计器和代码生成引擎,其中:
    所述组件工具箱用于提供组件,所述组件为实现特定功能的独立模块,所述组件由一个组件主体和一个或多个连接点构成,所述连接点支持一种或多种数据实体类型;
    所述组件编排设计器用于显示所述组件,根据用户连接组件的操作连接两个或多个组件;
    所述代码生成引擎用于将所述组件编排设计器中连接的所述两个或多个组件生成第一APP可执行的源代码,所述第一APP包括所述两个或多个组件。
  2. 根据权利要求1所述的APP开发平台,其特征在于,所述两个或多个组件中包含第一组件和第二组件,所述组件编排设计器还用于:
    响应于所述用户连接所述第一组件和所述第二组件的操作,验证所述第一组件和所述第二组件是否匹配;
    若所述第一组件和所述第二组件匹配,则连接第一连接点和第二连接点,所述第一连接点为所述第一组件的连接点,所述第二连接点为所述第二组件的连接点。
  3. 根据权利要求2所述的APP开发平台,其特征在于,所述第一组件和所述第二组件匹配包括:第一数据实体类型与第二数据实体类型相同、所述第一数据实体类型包括所述第二数据实体类型、或者所述第二数据实体类型包括所述第一数据实体类型,其中,所述第一数据实体类型为所述第一连接点支持的数据实体的类型,所述第二数据实体类型为所述第二连接点支持的数据实体的类型。
  4. 根据权利要求1所述的APP开发平台,其特征在于,所述组件工具箱还用于:
    响应于所述用户上传组件或从组件市场下载组件的操作,显示用户上传或从所述组件市场下载的组件的名称。
  5. 根据权利要求2所述的APP开发平台,其特征在于,所述组件编排设计器还用于:
    响应于所述用户查看所述第一连接点属性的操作,显示所述第一连接点支持的数据实体类型。
  6. 根据权利要求5所述的APP开发平台,其特征在于,所述组件编排设计器具体用于:
    响应于所述用户连接所述第一连接点和所述第二连接点的操作,显示连接所述第一连接点和所述第二连接点的连接线。
  7. 根据权利要求5所述的APP开发平台,其特征在于,所述组件编排设计器具体用于:
    响应于所述用户连接所述第一连接点和所述第二连接点的操作,将所述第一连接点和所述第二连接点重叠显示。
  8. 根据权利要求1所述的APP开发平台,其特征在于,所述组件编排设计器具体用于:
    根据所述用户从所述组件工具箱中选择所述第一组件的操作,显示所述第一组件。
  9. 根据权利要求8所述的APP开发平台,其特征在于,所述组件编排设计器还用于:
    响应于所述用户选择对所述第一组件进行智能编排的操作,显示出所述第一组件的组件调用树;所述组件调用树用于展示与所述第一组件匹配的所述第二组件和/或第三组件、以及与所述第二组件匹配的第四组件和/或第五组件、直到与第M组件匹配的第N组件,且所述第N组件为无输出连接点的组件,其中,M和N为正整数。
  10. 根据权利要求9所述的APP开发平台,其特征在于,所述显示出所述第一组件的组件调用树,具体为:
    根据所述第一组件的功能和/或所述第一组件的连接点支持的数据实体类型,显示出所述第一组件的组件调用树。
  11. 根据权利要求10所述的APP开发平台,其特征在于,所述组件编排设计器还用于:
    响应于所述用户删除所述第二组件的操作,删除所述组件调用树中的所述第二组件。
  12. 根据权利要求1-11任一项所述的APP开发平台,其特征在于,所述组件编排设计器还用于:
    保存完成连接的所述两个或多个组件的编排模型图,以及所述编排模型图中的第一信息;所述第一信息包括所述两个或多个组件的ID、名称、所述两个或多个组件的连接点支持的数据实体类型中的一项或多项。
  13. 根据权利要求12所述的APP开发平台,其特征在于,所述代码生成引擎具体用于:
    根据所述编排模型图、所述第一信息和组件调用模板生成所述第一APP可执行的源代码,所述组件调用模板中包括预设格式的程序代码。
  14. 一种APP开发方法,其特征在于,包括:
    响应于用户从电子设备的组件工具箱中选择组件的操作,所述电子设备在组件编排设计器中显示所述组件,所述组件为实现特定功能的独立模块;所述组件由一个组件主体和一个或多个连接点构成,所述连接点支持一种或多种数据实体类型;
    响应于用户连接多个组件的操作,所述电子设备在所述组件编排设计器中连接两个或多个组件;
    响应于用户选择编译所述两个或多个组件的操作,所述电子设备在代码生成引擎中将连接的所述两个或多个组件生成第一APP可执行的源代码。
  15. 根据权利要求14所述的方法,其特征在于,所述两个或多个组件中包含第一组件和第二组件,所述电子设备在所述组件编排设计器中连接两个或多个组件,具体包括:
    响应于所述用户连接所述第一组件和所述第二组件的操作,所述电子设备通过所述组件编排设计器验证所述第一组件和所述第二组件是否匹配;
    若所述第一组件和所述第二组件匹配,则所述电子设备连接第一连接点和第二连接点,所述第一连接点为所述第一组件的连接点,所述第二连接点为所述第二组件的连接点。
  16. 根据权利要求15所述的方法,其特征在于,所述第一组件和所述第二组件匹配包括:第一数据实体类型与第二数据实体类型相同、所述第一数据实体类型包括所述第二数据实体类型、或者所述第二数据实体类型包括所述第一数据实体类型,所述第一数据实体类型为所述第一连接点支持的数据实体的类型,所述第二数据实体类型为所述第二连接点支持的数据实体的类型。
  17. 根据权利要求14所述的方法,其特征在于,所述方法还包括:
    响应于所述用户上传组件或从组件市场下载组件的操作,所述电子设备在所述组件工具箱中显示用户上传或从所述组件市场下载的组件的名称。
  18. 根据权利要求15所述的方法,其特征在于,所述方法还包括:
    响应于所述用户查看所述第一组件的所述第一连接点属性的操作,所述电子设备在所述组件编排设计器中显示所述第一连接点支持的数据实体类型。
  19. 根据权利要求15所述的方法,其特征在于,所述电子设备连接第一连接点和第二连接点包括:
    显示连接所述第一连接点和所述第二连接点的连接线;
    或,将所述第一连接点和所述第二连接点重叠。
  20. 根据权利要求19所述的方法,其特征在于,所述方法还包括:
    响应于所述用户选择对所述第一组件进行智能编排的操作,所述电子设备在所述组件编排设计器中显示出所述第一组件的组件调用树;所述组件调用树用于展示与所述第一组件匹配的所述第二组件和/或第三组件、以及与所述第二组件匹配的第四组件和/或第五组件、直到与第M组件匹配的第N组件,且所述第N组件为无输出连接点的组件,其中,M和N为正整数。
  21. 根据权利要求20所述的方法,其特征在于,所述电子设备在所述组件编排设计器中显示出所述第一组件的组件调用树,具体包括:
    所述电子设备根据所述第一组件的功能和/或所述第一组件的连接点支持的数据实体类型,在所述组件编排设计器中显示出所述第一组件的组件调用树。
  22. 根据权利要求21所述的方法,其特征在于,所述方法还包括:
    响应于所述用户删除所述第二组件的操作,所述电子设备删除所述组件调用树中的所述第二组件。
  23. 根据权利要求14-22任一项所述的方法,其特征在于,所述方法还包括:
    所述电子设备在所述组件编排设计器中保存完成连接的所述两个或多个组件的编排模型图,以及所述编排模型图中的第一信息;所述第一信息包括所述两个或多个组件的ID、名称、所述两个或多个组件的连接点支持的数据实体类型中的一项或多项。
  24. 根据权利要求23所述的方法,其特征在于,所述方法还包括:
    所述电子设备在所述代码生成引擎中,根据所述编排模型图、所述第一信息和组件调用模板生成所述第一APP可执行的源代码,所述组件调用模板中包括预设格式的程序代码。
  25. 一种电子设备,其特征在于,包括:一个或多个处理器、一个或多个存储器;所述一个或多个存储器分别与所述一个或多个处理器耦合;所述一个或多个存储器用于存储计算机程序代码,所述计算机程序代码包括计算机指令;当所述计算机指令在所述处理器上运行时,使得所述电子设备执行如权利要求14-24中任一项所述的APP开发方法。
  26. 一种计算机可读存储介质,其特征在于,包括计算机指令,当所述计算机指令在电子设备上运行时,使得所述电子设备执行如权利要求14-24中任一项所述的APP开发方法。
  27. 一种计算机程序产品,其特征在于,当所述计算机程序产品在计算机上运行时,使得所述计算机执行如权利要求14-24中任一项所述的APP开发方法。
PCT/CN2021/098215 2020-06-20 2021-06-03 App开发平台、app开发方法及电子设备 WO2021254167A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010569877.8A CN113821203A (zh) 2020-06-20 2020-06-20 App开发平台、app开发方法及电子设备
CN202010569877.8 2020-06-20
