US20160170720A1 - Bi-directional editing between visual screen designer and source code - Google Patents
- Publication number: US20160170720A1 (application US 14/566,537)
- Authority: US (United States)
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/30—Creation or generation of source code
- G06F8/34—Graphical or visual programming
Description
- This document generally relates to methods and systems for use with computer networks. More particularly, this document relates to bi-directional editing between a visual screen designer and source code.
- Enterprise mobility platforms manage the whole life cycle of applications for an enterprise, including application development, application deployment, application execution, application management, and mobile device management.
- In a large corporation with tens of thousands of employees, multiple business lines, and millions of transactions daily, employees need to access data and work on their tasks while they are in different locations and situations.
- Enterprise mobility platforms enable business analysts and developers to quickly develop mobile applications with specific business objectives and functionality and deploy the applications, allowing other employees to use the applications on their devices to process data and information.
- FIG. 1 is a block diagram illustrating an enterprise mobility platform system in accordance with an example embodiment.
- FIG. 2 is a screen capture illustrating a screen design page of an application development tool in accordance with an example embodiment.
- FIG. 3 is a screen capture illustrating the code preview page of an application development tool in accordance with an example embodiment.
- FIG. 4 is a screen capture illustrating the screen design page of an application development tool after the start screen has been edited in accordance with an example embodiment.
- FIG. 5 is a screen capture illustrating the code preview page of an application development tool after the start screen has been edited in accordance with an example embodiment.
- FIG. 6 is a screen capture illustrating the code preview page of an application development tool after the code has been manually edited in accordance with an example embodiment.
- FIG. 7 is a screen capture illustrating the screen design page of an application development tool in light of the manual update of the source code in accordance with an example embodiment.
- FIG. 8 is a screen capture illustrating the screen design page of an application development tool after the label has been removed.
- FIG. 9 is a screen capture illustrating the code preview page of an application development tool after the label has been removed.
- FIG. 10 is a screen capture illustrating the code preview page of an application development tool after the OK button has been removed.
- FIG. 11 is a screen capture illustrating the screen design page of an application development tool in light of the manual update of the source code to remove the OK button in accordance with an example embodiment.
- FIG. 12 is a block diagram illustrating an application development tool in accordance with an example embodiment.
- FIG. 13 is a flow diagram illustrating a method of altering a hybrid application in an application development tool in accordance with an example embodiment.
- FIG. 14 is a block diagram illustrating a mobile device, according to an example embodiment.
- FIG. 15 is a block diagram of a machine in the example form of a computer system within which instructions for causing the machine to perform any one or more of the methodologies discussed herein may be executed.
- aspects are provided that allow for bi-directional editing between a visual screen designer in an enterprise mobility platform and source code. This helps reduce the difficulty a programmer typically has in manually writing source code and viewing the effects of changes in the source code.
- FIG. 1 is a block diagram illustrating an enterprise mobility platform system 100 in accordance with an example embodiment.
- the enterprise mobility platform system 100 may include an application development tool 102 , a hybrid application 104 , a server 106 , and a hybrid web container 108 .
- the application development tool 102 (also sometimes known as a designer) provides a visual graphical interface to allow a programmer to develop the hybrid application 104 .
- the hybrid application 104 is deployed to the server 106 .
- a client device 110 downloads the hybrid application 104 in the hybrid web container 108 .
- the hybrid web container 108 is a native mobile application that connects to the server 106 to download the appropriate hybrid application 104 and then executes it.
- a back-end database 112 is also provided to allow the hybrid application 104 to access shared data.
- the application development tool 102 is an Eclipse-based integrated development environment (IDE) that creates one or more mobile workflow packages.
- a mobile workflow package is a model for screens, widgets on screens and workflows between screens.
- an XBW model is an implementation of a mobile workflow package that defines the user interface (screens) and workflow (from screen to screen) of a hybrid application. XBW is not an acronym but rather the name applied to this type of model. The developer is able to visually design screens and workflows using drag-and-drop operations.
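- The internal structure of a mobile workflow package is not given in this document; as a rough, purely hypothetical sketch, such a model might be represented as an object holding screens, the widgets on each screen, and the transitions between screens:

```javascript
// Hypothetical in-memory sketch of a screen/workflow model
// (illustrative only -- not the actual XBW file format).
const workflowModel = {
  screens: [
    {
      key: "Start",
      widgets: [
        { type: "Label", key: "lblTitle", properties: { text: "Welcome" } },
        { type: "Button", key: "btnOk", properties: { text: "OK" } }
      ]
    },
    { key: "Confirm", widgets: [] }
  ],
  // Workflow: which screen a widget action leads to.
  transitions: [{ from: "Start", trigger: "btnOk", to: "Confirm" }]
};

// Resolve the screen reached when a widget on a screen is activated.
function nextScreen(model, fromKey, trigger) {
  const match = model.transitions.find(
    (tr) => tr.from === fromKey && tr.trigger === trigger
  );
  return match ? match.to : null;
}

console.log(nextScreen(workflowModel, "Start", "btnOk")); // "Confirm"
```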
- the hybrid application 104 may store mobile business objects (MBOs) and workflow packages.
- An MBO is a software object, similar to a class in object-oriented programming languages, that encapsulates the business logic of a mobile application.
- the MBO also includes attributes that map to data in a data set. When a mobile application requests data, it uses MBOs to retrieve the data from the data set.
- the MBO is developed in the application development tool 102 and deployed to the server 106 along with the application.
- the MBO is bound to a data source, such as a data source in the back-end database 112 .
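- As an illustrative sketch only (the document does not show actual MBO interfaces), an MBO can be pictured as a class whose attributes map to fields of a backing data set and which encapsulates the retrieval logic; all names below are hypothetical:

```javascript
// Hypothetical sketch of a mobile business object (MBO):
// attributes map to fields in a backing data set, and the object
// encapsulates the logic used to retrieve and project them.
class CustomerMBO {
  constructor(dataSet) {
    this.dataSet = dataSet;
    // MBO attribute name -> data-set field name (illustrative mapping)
    this.attributeMap = { id: "CUST_ID", name: "CUST_NAME" };
  }

  // Retrieve one record and project it through the attribute map.
  findById(id) {
    const row = this.dataSet.find((r) => r.CUST_ID === id);
    if (!row) return null;
    const result = {};
    for (const [attr, field] of Object.entries(this.attributeMap)) {
      result[attr] = row[field];
    }
    return result;
  }
}

const mbo = new CustomerMBO([{ CUST_ID: 1, CUST_NAME: "Acme" }]);
console.log(mbo.findById(1)); // { id: 1, name: "Acme" }
```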
- Mobile workflow packages include web files that enable a hybrid application to execute on mobile devices having different operating platforms. Additionally, the web files can be used to create mobile applications that include extensive customization of screen layout and data interaction.
- the server 106 may deploy the hybrid application 104 as web files to the client device 110 .
- the web files may be installed in the hybrid web container 108 .
- a user may manipulate the data.
- the client device 110 may synchronize with the server 106 by returning the manipulated data to the server 106 .
- the manipulated data may be stored as a data subset.
- the server 106 then uses the appropriate MBO to store and retrieve data from the data source (e.g., back-end database 112 ).
- When the server 106 retrieves data from the data set, it may store the data as a data subset.
- the server 106 may synchronize with the data source by updating the data set with the manipulated data stored in the data subset.
- a developer is able to design screens one by one in a screen design page and generate JavaScript or other source code from whatever has been designed.
- the JavaScript may include methods that execute business logic of a mobile application. The methods included in the JavaScript may be triggered when a user selects a link or button included in a screen or performs some other interaction with a screen of a mobile application.
- the JavaScript allows a developer to include functionality that activates other features or applications native to the client device 110 .
- a native application is an application that is designed to run on the operating platform of the client device 110 . For example, a developer may add methods to a JavaScript file to activate the camera or a calculator on a mobile device.
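- As a hedged sketch of the kind of method described above, the following shows a handler that a button press might trigger, calling out to a native device feature. The `nativeBridge` object is a hypothetical stand-in for the hybrid web container's actual native API, which is not specified in this document:

```javascript
// Sketch of a screen-event handler that activates a native feature.
// "nativeBridge" is a hypothetical stand-in for the hybrid web
// container's real native API; here it is stubbed for illustration.
const nativeBridge = {
  openCamera() {
    return "camera-opened"; // a real container would launch the camera app
  }
};

// Handler the generated JavaScript might attach to a button press.
function onTakePhotoPressed() {
  return nativeBridge.openCamera();
}

console.log(onTakePhotoPressed()); // "camera-opened"
```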
- FIG. 2 is a screen capture illustrating a screen design page 200 of an application development tool 102 in accordance with an example embodiment.
- the screen design page 200 is enabled by the user pressing on a screen design tab 202 .
- the screen design page 200 is displaying a start screen 204 along with a menu 206 for the start screen 204 and custom actions 208 for the start screen 204 .
- When the developer is finished visually designing the screens using the screen design page 200 , he or she can generate JavaScript or other source code from whatever has been designed.
- Although the designer could manually edit the generated JavaScript or other source code prior to deploying the hybrid application to the server 106 , the manual changes would not propagate through to the mobile workflow package (e.g., XBW model) that defines the visually created screens from the screen design page 200 . Hence, ordinarily the developer would be unable to go back and alter the visual design of the screens once a manual change was made to the source code.
- source code is generated in a framework that deals with user interface rendering. This allows a rendered screen to be modified if the underlying source code is also modified.
- a developer enjoys the convenience of visually creating and editing screens using a screen design page 200 of an application development tool 102 , but can also expect fine-grained control over the user interface by manually editing the source code and seeing the results of such manual edits in the screen design page 200 .
- the source code is generated in UI5, which is a JavaScript framework that deals with UI rendering and many other tasks.
- the application development tool 102 is designed to allow bidirectional editing between a SUP XBW model and UI5 JavaScript source code.
- FIGS. 3-11 depict an example of bi-directional editing between a visual screen designer and source code in accordance with an example embodiment.
- the developer may view the source code corresponding to the visually designed start screen 204 by pressing a code preview tab 210 in the application development tool 102 .
- FIG. 3 is a screen capture illustrating the code preview page 300 of an application development tool 102 in accordance with an example embodiment.
- the code 302 depicted in the code preview page 300 represents JavaScript code generated from the mobile workflow package (e.g., XBW model) from the start screen 204 that was previously designed by the developer.
- FIG. 4 is a screen capture illustrating the screen design page 200 of an application development tool 102 after the start screen 204 has been edited in accordance with an example embodiment.
- the developer has dragged and dropped an OK button 400 into the start screen 204 .
- FIG. 5 is a screen capture illustrating the code preview page 300 of an application development tool 102 after the start screen 204 has been edited in accordance with an example embodiment. As can be seen, the code 500 has been updated to include code 502 for the newly added OK button 400 .
- FIG. 6 is a screen capture illustrating the code preview page 300 of an application development tool 102 after the code 500 has been manually edited in accordance with an example embodiment.
- the developer has added additional code 600 , which is intended to display a new label in the start screen 204 .
- FIG. 7 is a screen capture illustrating the screen design page 200 of an application development tool 102 in light of the manual update of the source code in accordance with an example embodiment.
- the screen design page 200 has been updated to reflect the new label 700 .
- FIG. 8 is a screen capture illustrating the screen design page 200 of an application development tool 102 after the label 702 has been removed. As can be seen, the start screen 204 has been updated so that OK button 400 and the new label 700 have been moved.
- FIG. 9 is a screen capture illustrating the code preview page 300 of an application development tool 102 after the label 702 has been removed. As can be seen, code 900 has been automatically altered to remove the code relating to the label 702 .
- FIG. 10 is a screen capture illustrating the code preview page 300 of an application development tool 102 after the OK button 400 has been removed. As can be seen, the code 1000 has been manually updated by the developer to remove code relating to the OK button 400 .
- FIG. 11 is a screen capture illustrating the screen design page 200 of an application development tool 102 in light of the manual update of the source code to remove the OK button 400 in accordance with an example embodiment.
- the start screen 204 has been updated to reflect the changes made to the source code.
- FIG. 12 is a block diagram illustrating an application development tool 1200 in accordance with an example embodiment.
- the application development tool 1200 is the application development tool 102 of FIG. 1 in more detail.
- the application development tool 1200 may include a screen design tool 1202 and a code preview tool 1204 .
- the screen design tool 1202 may present a visual interface to a user to visually edit screens of a hybrid application, such as the screen design page 200 depicted in FIGS. 2-11 above.
- the code preview tool 1204 may present raw source code in a screen to a user to visually edit the raw source code of the hybrid application, such as the code preview page 300 depicted in FIGS. 3-11 above.
- a bi-directional conversion module 1206 may detect when changes are made in either the screen design tool 1202 or the code preview tool 1204 and may convert and iterate such changes through to the other of the screen design tool 1202 or code preview tool 1204 . This may be accomplished by accessing a library of elements 1208 .
- the library of elements 1208 may contain identifications of a number of supported elements that a developer can add to a hybrid application, as well as a corresponding code-generating algorithm for each of those supported elements.
- the library of elements 1208 may also contain algorithms to update the mobile workflow package for each of the supported elements.
- When a change is made in the screen design tool 1202 , the bi-directional conversion module 1206 may act to convert that change to the corresponding source code.
- This may include identifying the screen element being altered (e.g., updated, deleted or added) and obtaining the corresponding code-generating algorithm for that element from the library of elements 1208 .
- the source code can then be modified by applying that element's code-generating algorithm.
- When a change is made in the code preview tool 1204 , the bi-directional conversion module 1206 may act to convert that change to the corresponding visual element. This may include identifying the element corresponding to that block of source code in the library of elements 1208 and then retrieving the algorithms to update the mobile workflow package for that identified element. The algorithms are applied to update the mobile workflow package. Then, the screen design tool 1202 displays the updated or newly added element in the screen or, in the case of a deleted element, stops displaying the identified element.
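- The element library and conversion flow described above can be sketched as follows: each supported element type maps to a code-generating function, and the designer-to-code direction looks the changed element up and applies its generator. The UI5-flavored output is only an approximation of what the tool might emit, and every name here is an assumption:

```javascript
// Hypothetical library of elements: each supported element type maps
// to a code-generating function (screen-model change -> source code).
const elementLibrary = {
  Button: (el) =>
    `var ${el.key} = new sap.m.Button({ text: "${el.properties.text}" });`,
  Label: (el) =>
    `var ${el.key} = new sap.m.Label({ text: "${el.properties.text}" });`
};

// Sketch of the designer-to-code direction of the conversion module:
// identify the changed element, fetch its generator, emit code.
function generateCodeForChange(element) {
  const generator = elementLibrary[element.type];
  if (!generator) throw new Error(`Unsupported element type: ${element.type}`);
  return generator(element);
}

const code = generateCodeForChange({
  type: "Button",
  key: "btnOk",
  properties: { text: "OK" }
});
console.log(code); // var btnOk = new sap.m.Button({ text: "OK" });
```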
- FIG. 13 is a flow diagram illustrating a method 1300 of altering a hybrid application in an application development tool in accordance with an example embodiment.
- the method 1300 may be performed in a bi-directional conversion module, such as the bi-directional conversion module 1206 of FIG. 12 .
- a change in the application development tool (e.g., application development tool 102 of FIG. 1 or application development tool 1200 of FIG. 12 ) is detected. This change may be a change created by a developer using the application development tool.
- Examples of such changes include alterations of screens in a screen design tool (e.g., screen design tool 1202 ) by, for example, dragging and dropping new elements onto a screen of the hybrid application, removing elements from the screen of the hybrid application, editing elements' properties, and creating or removing transitions between screens in the hybrid application. Additional examples of such changes include adding, removing, or editing source code, in a code preview tool (e.g., code preview tool 1204 ), written in a scripting language that can be rendered as a user interface representation.
- a library of elements (e.g., library of elements 1208 ) is accessed to retrieve a code-generating algorithm corresponding to the changed element.
- a corresponding change is made to the source code for the edited screen. This may include accessing the code-generating algorithm corresponding to the element being changed in the screen design tool and adding, editing, or deleting the source code for the element, as appropriate.
- For a newly added element, the algorithm corresponding to the element will generate source code for the element and insert it into the screen's source code at an appropriate place.
- For an edited element, the source code corresponding to the prior version of the element may be removed, and the code-generating algorithm corresponding to the element will generate and insert source code for the element in its place (or, alternatively, only code relating to the change within the element itself will be changed).
- For a deleted element, the source code may be scanned to identify a block of code matching the element, and then that block of code may be removed from the source code.
- the code preview tool may be caused to display the changed source code.
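- As one possible, deliberately simplified realization of the scan-and-remove step for a deleted element, the source can be filtered for lines that mention the element's identifier (real generated code would require proper block-boundary detection; a single-line match is used purely for illustration):

```javascript
// Simplified sketch of deleting an element's code block: scan the
// source for lines mentioning the element's key and drop them.
function removeElementCode(source, elementKey) {
  return source
    .split("\n")
    .filter((line) => !line.includes(elementKey))
    .join("\n");
}

const source = [
  'var lbl1 = new sap.m.Label({ text: "Hi" });',
  'var btnOk = new sap.m.Button({ text: "OK" });'
].join("\n");

console.log(removeElementCode(source, "btnOk"));
// only the Label line remains
```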
- the changed source code is identified. This may include identifying a block of source code that has been added to the source code, identifying a block of source code that has been modified, or identifying a block of source code that has been deleted from the source code.
- a library of elements is accessed to retrieve algorithms to update the mobile workflow package for the element corresponding to the changed block of source code. Thus, if a particular block of source code is added, the algorithms to update the mobile workflow package corresponding to the element that is the subject of the block of source code are obtained from the library of elements.
- a corresponding change is made to the mobile workflow package for the source code. This may include locating the element (in the case of editing and deleting) corresponding to the source code being edited in the code preview tool and adding a new element, or editing or deleting the element to update the mobile workflow package for the element, as appropriate.
- For a newly added element, the algorithms to update the mobile workflow package corresponding to the element are applied to insert the element into the mobile workflow package at an appropriate place.
- For a deleted element, the mobile workflow package may be scanned to identify algorithms to update the mobile workflow package matching the element, and then those algorithms are applied to remove the element from the mobile workflow package.
- For an edited element, the algorithms corresponding to the element are applied to update the mobile workflow package.
- the screen design tool may be caused to display the changed screen.
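- The code-to-model direction of this method can be sketched symmetrically: a changed block of source code is matched to an element, and the in-memory workflow model is updated or pruned accordingly. The model shape and change descriptor below are illustrative assumptions, not the actual mobile workflow package format:

```javascript
// Sketch of the code-to-model direction: a changed block of source
// code is matched to an element, and the screen model is updated.
function applyCodeChangeToModel(model, screenKey, change) {
  const screen = model.screens.find((s) => s.key === screenKey);
  if (!screen) throw new Error(`Unknown screen: ${screenKey}`);
  if (change.kind === "delete") {
    screen.widgets = screen.widgets.filter((w) => w.key !== change.elementKey);
  } else if (change.kind === "add") {
    screen.widgets.push(change.element);
  }
  return model;
}

const model = {
  screens: [
    { key: "Start", widgets: [{ type: "Button", key: "btnOk", properties: {} }] }
  ]
};

// Developer deleted the OK button's code block; remove it from the model.
applyCodeChangeToModel(model, "Start", { kind: "delete", elementKey: "btnOk" });
console.log(model.screens[0].widgets.length); // 0
```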
- FIG. 14 is a block diagram illustrating a mobile device 1400 , according to an example embodiment.
- the mobile device 1400 may include a processor 1402 .
- the processor 1402 may be any of a variety of different types of commercially available processors 1402 suitable for mobile devices 1400 (for example, an XScale architecture microprocessor, a microprocessor without interlocked pipeline stages (MIPS) architecture processor, or another type of processor 1402 ).
- a memory 1404 such as a random access memory (RAM), a flash memory, or other type of memory, is typically accessible to the processor 1402 .
- the memory 1404 may be adapted to store an operating system (OS) 1406 , as well as application programs 1408 , such as a mobile location enabled application that may provide location-based services to a user.
- the processor 1402 may be coupled, either directly or via appropriate intermediary hardware, to a display 1410 and to one or more input/output (I/O) devices 1412 , such as a keypad, a touch panel sensor, a microphone, and the like.
- the processor 1402 may be coupled to a transceiver 1414 that interfaces with an antenna 1416 .
- the transceiver 1414 may be configured to both transmit and receive cellular network signals, wireless data signals, or other types of signals via the antenna 1416 , depending on the nature of the mobile device 1400 . Further, in some configurations, a GPS receiver 1418 may also make use of the antenna 1416 to receive GPS signals.
- Modules may constitute either software modules (e.g., code embodied (1) on a non-transitory machine-readable medium or (2) in a transmission signal) or hardware-implemented modules.
- a hardware-implemented module is a tangible unit capable of performing certain operations and may be configured or arranged in a certain manner.
- one or more computer systems (e.g., a standalone, client, or server computer system) or one or more processors (e.g., processor 1402 ) may be configured by software (e.g., an application or application portion) as a hardware-implemented module that operates to perform certain operations as described herein.
- a hardware-implemented module may be implemented mechanically or electronically.
- a hardware-implemented module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations.
- a hardware-implemented module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware-implemented module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
- the term “hardware-implemented module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired) or temporarily or transitorily configured (e.g., programmed) to operate in a certain manner and/or to perform certain operations described herein.
- hardware-implemented modules are temporarily configured (e.g., programmed)
- each of the hardware-implemented modules need not be configured or instantiated at any one instance in time.
- the hardware-implemented modules comprise a general-purpose processor configured using software
- the general-purpose processor may be configured as respective different hardware-implemented modules at different times.
- Software may accordingly configure a processor, for example, to constitute a particular hardware-implemented module at one instance of time and to constitute a different hardware-implemented module at a different instance of time.
- Hardware-implemented modules can provide information to, and receive information from, other hardware-implemented modules. Accordingly, the described hardware-implemented modules may be regarded as being communicatively coupled. Where multiple of such hardware-implemented modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses that connect the hardware-implemented modules). In embodiments in which multiple hardware-implemented modules are configured or instantiated at different times, communications between such hardware-implemented modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware-implemented modules have access. For example, one hardware-implemented module may perform an operation, and store the output of that operation in a memory device to which it is communicatively coupled.
- a further hardware-implemented module may then, at a later time, access the memory device to retrieve and process the stored output.
- Hardware-implemented modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
- processors may be temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions.
- the modules referred to herein may, in some example embodiments, comprise processor-implemented modules.
- the methods described herein may be at least partially processor-implemented. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented modules. The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processor or processors may be located in a single location (e.g., within a home environment, an office environment or as a server farm), while in other embodiments the processors may be distributed across a number of locations.
- the one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., APIs).
- Example embodiments may be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations of them.
- Example embodiments may be implemented using a computer program product, e.g., a computer program tangibly embodied in an information carrier, e.g., in a machine-readable medium for execution by, or to control the operation of, data processing apparatus, e.g., a programmable processor, a computer, or multiple computers.
- a computer program can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, subroutine, or other unit suitable for use in a computing environment.
- a computer program can be deployed to be executed on one computer or on multiple computers at one site or distributed across multiple sites and interconnected by a communication network.
- operations may be performed by one or more programmable processors executing a computer program to perform functions by operating on input data and generating output.
- Method operations can also be performed by, and apparatus of example embodiments may be implemented as, special purpose logic circuitry, e.g., an FPGA or an ASIC.
- the computing system can include clients and servers.
- a client and server are generally remote from each other and typically interact through a communication network.
- the relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
- both hardware and software architectures merit consideration.
- the choice of whether to implement certain functionality in permanently configured hardware (e.g., an ASIC), in temporarily configured hardware (e.g., a combination of software and a programmable processor), or in a combination of permanently and temporarily configured hardware may be a design choice.
- Set out below are hardware (e.g., machine) and software architectures that may be deployed, in various example embodiments.
- FIG. 15 is a block diagram of a machine in the example form of a computer system 1500 within which instructions 1524 may be executed for causing the machine to perform any one or more of the methodologies discussed herein.
- the machine operates as a standalone device or may be connected (e.g., networked) to other machines.
- the machine may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
- the machine may be a personal computer (PC), a tablet PC, a set-top box (STB), a personal digital assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine.
- the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
- the example computer system 1500 includes a processor 1502 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), a main memory 1504 , and a static memory 1506 , which communicate with each other via a bus 1508 .
- the computer system 1500 may further include a video display unit 1510 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)).
- the computer system 1500 also includes an alphanumeric input device 1512 (e.g., a keyboard or a touch-sensitive display screen), a user interface (UI) navigation (or cursor control) device 1514 (e.g., a mouse), a disk drive unit 1516 , a signal generation device 1518 (e.g., a speaker), and a network interface device 1520 .
- the disk drive unit 1516 includes a machine-readable medium 1522 on which is stored one or more sets of data structures and instructions 1524 (e.g., software) embodying or utilized by any one or more of the methodologies or functions described herein.
- the instructions 1524 may also reside, completely or at least partially, within the main memory 1504 and/or within the processor 1502 during execution thereof by the computer system 1500 , with the main memory 1504 and the processor 1502 also constituting machine-readable media 1522 .
- While the machine-readable medium 1522 is shown in an example embodiment to be a single medium, the term “machine-readable medium” may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more instructions 1524 or data structures.
- the term “machine-readable medium” shall also be taken to include any tangible medium that is capable of storing, encoding or carrying instructions 1524 for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure or that is capable of storing, encoding or carrying data structures utilized by or associated with such instructions 1524 .
- the term “machine-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, and optical and magnetic media.
- Examples of machine-readable media 1522 include non-volatile memory, including by way of example semiconductor memory devices, e.g., erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
- the instructions 1524 may further be transmitted or received over a communications network 1526 using a transmission medium.
- the instructions 1524 may be transmitted using the network interface device 1520 and any one of a number of well-known transfer protocols (e.g., HTTP).
- Examples of communication networks include a local area network (LAN), a wide area network (WAN), the Internet, mobile telephone networks, plain old telephone (POTS) networks, and wireless data networks (e.g., WiFi and WiMax networks).
- the term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding or carrying instructions 1524 for execution by the machine, and includes digital or analog communications signals or other intangible media to facilitate communication of such software.
- the inventive subject matter may be referred to herein, individually and/or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept if more than one is in fact disclosed.
Abstract
Description
- This document generally relates to methods and systems for use with computer networks. More particularly, this document relates to bi-directional editing between a visual screen designer and source code.
- Enterprise mobility platforms manage the whole life cycle of applications for an enterprise, including application development, application deployment, application execution, application management, and mobile device management. In a large corporation with tens of thousands of employees, multiple business lines, and millions of transactions daily, employees need to access data and work on their tasks in a variety of situations. Enterprise mobility platforms enable business analysts and developers to quickly develop mobile applications with specific business objectives and functionality and deploy the applications, allowing other employees to use the applications on their devices to process data and information.
- The present disclosure is illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements and in which:
-
FIG. 1 is a block diagram illustrating an enterprise mobility platform system in accordance with an example embodiment. -
FIG. 2 is a screen capture illustrating a screen design page of an application development tool in accordance with an example embodiment. -
FIG. 3 is a screen capture illustrating the code preview page of an application development tool in accordance with an example embodiment. -
FIG. 4 is a screen capture illustrating the screen design page of an application development tool after the start screen has been edited in accordance with an example embodiment. -
FIG. 5 is a screen capture illustrating the code preview page of an application development tool after the start screen has been edited in accordance with an example embodiment. -
FIG. 6 is a screen capture illustrating the code preview page of an application development tool after the code has been manually edited in accordance with an example embodiment. -
FIG. 7 is a screen capture illustrating the screen design page of an application development tool in light of the manual update of the source code in accordance with an example embodiment. -
FIG. 8 is a screen capture illustrating the screen design page of an application development tool after the label has been removed. -
FIG. 9 is a screen capture illustrating the code preview page of an application development tool after the label has been removed. -
FIG. 10 is a screen capture illustrating the code preview page of an application development tool after the OK button has been removed. -
FIG. 11 is a screen capture illustrating the screen design page of an application development tool in light of the manual update of the source code to remove the OK button in accordance with an example embodiment. -
FIG. 12 is a block diagram illustrating an application development tool in accordance with an example embodiment. -
FIG. 13 is a flow diagram illustrating a method of altering a hybrid application in an application development tool in accordance with an example embodiment. -
FIG. 14 is a block diagram illustrating a mobile device, according to an example embodiment. -
FIG. 15 is a block diagram of a machine in the example form of a computer system within which instructions for causing the machine to perform any one or more of the methodologies discussed herein may be executed. - The description that follows includes illustrative systems, methods, techniques, instruction sequences, and computing machine program products that embody illustrative embodiments. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide an understanding of various embodiments of the inventive subject matter. It will be evident, however, to those skilled in the art, that embodiments of the inventive subject matter may be practiced without these specific details. In general, well-known instruction instances, protocols, structures, and techniques have not been shown in detail.
- In an example embodiment, aspects are provided that allow for bi-directional editing between a visual screen designer in an enterprise mobility platform and source code. This helps reduce the difficulty a programmer typically has in manually writing source code and viewing the effects of changes in the source code.
-
FIG. 1 is a block diagram illustrating an enterprise mobility platform system 100 in accordance with an example embodiment. The enterprise mobility platform system 100 may include an application development tool 102, a hybrid application 104, a server 106, and a hybrid web container 108. The application development tool 102 (also sometimes known as a designer) provides a visual graphical interface to allow a programmer to develop the hybrid application 104. Once complete, the hybrid application 104 is deployed to the server 106. Then a client device 110 downloads the hybrid application 104 in the hybrid web container 108. Pictured here is a single hybrid web container 108 on a client device 110, although one of ordinary skill in the art will recognize that multiple client devices could each have their own hybrid web container 108 for a particular hybrid application 104. The hybrid web container 108 is a native mobile application that connects to the server 106 to download the appropriate hybrid application 104 and then executes it. A back-end database 112 is also provided to allow the hybrid application 104 to access shared data. - One example of an enterprise mobility platform system is Sybase Unwired Platform (SUP). In SUP, the
application development tool 102 is an Eclipse-based integrated development environment (IDE) that creates one or more mobile workflow packages. A mobile workflow package is a model for screens, widgets on screens, and workflows between screens. An XBW model is an implementation of a mobile workflow package that defines the user interface (screens) and workflow (from screen to screen) of a hybrid application. XBW is not an acronym but rather the name applied to this type of model. The developer is able to visually design screens and workflow using drag-and-drop operations. - The
hybrid application 104 may store mobile business objects (MBOs) and workflow packages. An MBO is a software object like a class in programming languages. It encapsulates the business logic of a mobile application. The MBO also includes attributes that map to data in a data set. When a mobile application requests data, it uses MBOs to retrieve the data from the data set. The MBO is developed in the application development tool 102 and deployed to the server 106 along with the application. The MBO is bound to a data source, such as a data source in the back-end database 112. - Mobile workflow packages include web files that enable a hybrid application to execute on mobile devices having different operating platforms. Additionally, the web files can be used to create mobile applications that include extensive customization of the screen layout and data interaction.
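- By way of a hedged illustration, the MBO concept described above can be sketched as a plain JavaScript class. The class name, attribute map, and data-set shape below are assumptions made for the sketch, not SUP's actual API.

```javascript
// Illustrative sketch of a mobile business object (MBO). The names and
// shapes here are hypothetical, not Sybase Unwired Platform's real API.
class MobileBusinessObject {
  constructor(name, attributeMap) {
    this.name = name;
    // attributeMap: MBO attribute name -> column name in the bound data set
    this.attributeMap = attributeMap;
  }

  // Retrieve rows from a data set, keeping only the mapped attributes.
  retrieve(dataSet) {
    return dataSet.map((row) => {
      const record = {};
      for (const [attr, column] of Object.entries(this.attributeMap)) {
        record[attr] = row[column];
      }
      return record;
    });
  }
}

// Usage: an MBO bound to a hypothetical "customers" data set.
const customerMbo = new MobileBusinessObject("Customer", {
  id: "CUST_ID",
  name: "CUST_NAME",
});
const rows = customerMbo.retrieve([
  { CUST_ID: 1, CUST_NAME: "Acme", CUST_INTERNAL: "hidden" },
]);
```

Note how unmapped columns (here, the invented CUST_INTERNAL) are filtered out, mirroring the idea that an MBO exposes only its declared attributes.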
- When the
server 106 receives a workflow package, it may deploy the hybrid application 104 as web files to the client device 110. On the client device 110, the web files may be installed in the hybrid web container 108. - When a
client device 110 receives data from the server 106, a user may manipulate the data. After a user manipulates the data, the client device 110 may synchronize with the server 106 by returning the manipulated data to the server 106. On the server 106, the manipulated data may be stored as a data subset. The server 106 then uses the appropriate MBO to store and retrieve data from the data source (e.g., the back-end database 112). When the server 106 retrieves data from the data set, it may store the data as a data subset. Periodically, the server 106 may synchronize with the data source by updating the data set with the manipulated data stored in the data subset. - A developer is able to design screens one by one in a screen design page and generate JavaScript or other source code from whatever has been designed. The JavaScript may include methods that execute business logic of a mobile application. The methods included in the JavaScript may be triggered when a user selects a link or button included in a screen or performs some other interaction with a screen of a mobile application. Additionally, the JavaScript allows a developer to include functionality that activates other features or applications native to the
client device 110. A person skilled in the art will appreciate that a native application is an application that is designed to run on an operating platform of the client device 110. For example, a developer may add methods to a JavaScript file to activate the camera or a calculator on a mobile device. -
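- As a hedged sketch of the native-feature access described above, the following shows the kind of handler that generated JavaScript might attach to a button press. The nativeBridge object and its method names are invented stand-ins; a real hybrid web container exposes its own bridge API.

```javascript
// Hypothetical sketch: a hybrid web container typically exposes a bridge
// to native device features. "nativeBridge" and its methods are invented
// stand-ins here, not a real container API.
const nativeBridge = {
  openCamera: () => "camera-opened",
  openCalculator: () => "calculator-opened",
};

// A business-logic method that generated JavaScript might wire to a
// button's press event; a real handler would also pass a callback to
// receive the captured image.
function onTakePhotoPressed(bridge) {
  return bridge.openCamera();
}

const result = onTakePhotoPressed(nativeBridge);
```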
FIG. 2 is a screen capture illustrating a screen design page 200 of an application development tool 102 in accordance with an example embodiment. The screen design page 200 is enabled by the user pressing on a screen design tab 202. Here the screen design page 200 is displaying a start screen 204 along with a menu 206 for the start screen 204 and custom actions 208 for the start screen 204. Typically, once the developer is finished visually designing the screens using the screen design page 200, he or she would generate JavaScript or other source code from whatever has been designed. While the designer could manually edit the generated JavaScript or other source code prior to deploying the hybrid application to the server 106, the manual changes would not iterate through to the mobile workflow package (e.g., XBW model) that defines the visually created screens from the screen design page 200. Hence, ordinarily the developer would be unable to go back to alter the visual design of the screens once a manual change is made to the source code. - In an example embodiment, source code is generated in a framework that deals with user interface rendering. This allows a rendered screen to be modified if the underlying source code is also modified. In such an embodiment, a developer enjoys the convenience of visually creating and editing screens using a
screen design page 200 of an application development tool 102, but can also expect fine-grained control over the user interface by manually editing the source code and seeing the results of such manual edits in the screen design page 200. In one example embodiment, the source code is generated in UI5, which is a JavaScript framework that deals with UI rendering and many other tasks. Thus, in an example embodiment, the application development tool 102 is designed to allow bidirectional editing between a SUP XBW model and UI5 JavaScript source code. -
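- The generated source code itself is not reproduced in this text, but a sketch of the declarative shape such generated code commonly takes may help. The createControl stub below stands in for a UI framework's control factory; real generated code would instantiate UI5 controls instead, and all names and properties here are assumptions.

```javascript
// Minimal stub standing in for a UI framework's control factory; real
// generated code would instantiate UI5 controls rather than plain objects.
function createControl(type, properties) {
  return { type, properties };
}

// Hypothetical shape of code generated from a designed start screen:
// one screen containing a label and an OK button.
function buildStartScreen() {
  return createControl("Screen", {
    id: "StartScreen",
    content: [
      createControl("Label", { text: "Welcome" }),
      createControl("Button", { text: "OK" }),
    ],
  });
}

const screen = buildStartScreen();
```

Because the code is purely declarative, a designer tool can both generate it from the model and parse it back into the model, which is what makes the bi-directional editing described here feasible.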
FIGS. 3-11 (in conjunction with previously discussed FIG. 2) depict an example of bi-directional editing between a visual screen designer and source code in accordance with an example embodiment. Beginning with FIG. 3, the developer may view the source code corresponding to the visually designed start screen 204 by pressing a code preview tab 210 in the application development tool 102. FIG. 3 is a screen capture illustrating the code preview page 300 of an application development tool 102 in accordance with an example embodiment. The code 302 depicted in the code preview page 300 represents JavaScript code generated from the mobile workflow package (e.g., XBW model) from the start screen 204 that was previously designed by the developer. - Assume then that the developer navigates back to the
screen design page 200 by pressing the screen design tab 202 and then edits the start screen 204 using a drag and drop operation. FIG. 4 is a screen capture illustrating the screen design page 200 of an application development tool 102 after the start screen 204 has been edited in accordance with an example embodiment. Here, the developer has dragged and dropped an OK button 400 into the start screen 204. - Assume then that the developer navigates back to the
code preview page 300 by pressing the code preview tab 210 in the application development tool 102. FIG. 5 is a screen capture illustrating the code preview page 300 of an application development tool 102 after the start screen 204 has been edited in accordance with an example embodiment. As can be seen, the code 500 has been updated to include code 502 for the newly added OK button 400. - Then assume that the developer wishes to manually alter the
code 500 from within the code preview page 300. FIG. 6 is a screen capture illustrating the code preview page 300 of an application development tool 102 after the code 500 has been manually edited in accordance with an example embodiment. Here, the developer has added additional code 600, which is intended to display a new label in the start screen 204. Then assume that the developer wishes to see the visual results of this code change and presses the screen design tab 202. FIG. 7 is a screen capture illustrating the screen design page 200 of an application development tool 102 in light of the manual update of the source code in accordance with an example embodiment. Here, the screen design page 200 has been updated to reflect the new label 700. - Then assume that the developer wishes to remove
label 702. The developer can then visually remove the label 702 (such as by a drag and drop operation) in the screen design page 200. FIG. 8 is a screen capture illustrating the screen design page 200 of an application development tool 102 after the label 702 has been removed. As can be seen, the start screen 204 has been updated so that the OK button 400 and the new label 700 have been moved. - Then assume the developer wishes to view the source code after the
label 702 has been removed. FIG. 9 is a screen capture illustrating the code preview page 300 of an application development tool 102 after the label 702 has been removed. As can be seen, code 900 has been automatically altered to remove the code relating to the label 702. - Then assume the developer has changed his or her mind and wishes to remove the
OK button 400. The developer can do this from the code preview page 300 by manually deleting the code corresponding to the OK button 400. FIG. 10 is a screen capture illustrating the code preview page 300 of an application development tool 102 after the OK button 400 has been removed. As can be seen, the code 1000 has been manually updated by the developer to remove the code relating to the OK button 400. - Then assume that the developer wishes to view the result of the removal of the
OK button 400. The developer can then again press the screen design tab 202. FIG. 11 is a screen capture illustrating the screen design page 200 of an application development tool 102 in light of the manual update of the source code to remove the OK button 400 in accordance with an example embodiment. Here the start screen 204 has been updated to reflect the changes made to the source code. - Of course, the above are only examples of the many different types of editing that can be performed in each of the
screen design page 200 and the code preview page 300. Properties of the widgets in either page can also be modified, and the corresponding changes can be seen in the other page. Similar bi-directional editing can be performed to add or remove transitions between elements or screens. -
FIG. 12 is a block diagram illustrating an application development tool 1200 in accordance with an example embodiment. In an example embodiment, the application development tool 1200 is the application development tool 102 of FIG. 1 in more detail. The application development tool 1200 may include a screen design tool 1202 and a code preview tool 1204. The screen design tool 1202 may present a visual interface to a user to visually edit screens of a hybrid application, such as the screen design page 200 depicted in FIGS. 2-11 above. The code preview tool 1204 may present raw source code in a screen to a user to visually edit the raw source code of the hybrid application, such as the code preview page 300 depicted in FIGS. 3-11 above. A bi-directional conversion module 1206 may detect when changes are made in either the screen design tool 1202 or the code preview tool 1204 and may convert and iterate such changes through to the other of the screen design tool 1202 or code preview tool 1204. This may be accomplished by accessing a library of elements 1208. The library of elements 1208 may contain identifications of a number of supported elements that a developer can add to a hybrid application, as well as a corresponding code-generating algorithm for each of those supported elements. The library of elements 1208 may also contain algorithms to update the mobile workflow package for each of the supported elements. When a change occurs in the screen design tool 1202, the bi-directional conversion module 1206 may act to convert that change to the corresponding source code. This may include identifying the screen element being altered (e.g., updated, deleted or added) and obtaining the corresponding code-generating algorithm for that element from the library of elements 1208. The source code can then be modified by applying that element's code-generating algorithm.
Likewise, if a developer makes a change in the code preview tool 1204, such as changing, adding or deleting a particular block of code, the bi-directional conversion module 1206 may act to convert that change to the corresponding visual element. This may include identifying the element corresponding to that block of source code in the library of elements 1208 and then retrieving the algorithms to update the mobile workflow package for that identified element. The algorithms are applied to update the mobile workflow package. Then, the screen design tool 1202 displays the updated or newly added element or, in the case of a deleted element, stops displaying the identified element. -
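- The library of elements 1208 can be pictured as a registry that pairs each supported element with a code-generating algorithm and a model-update algorithm. The sketch below is illustrative only; the element kinds, function names, and generated code strings are assumptions, not the patent's actual implementation.

```javascript
// Illustrative library of elements: each supported element pairs a
// code-generating algorithm with a model-update algorithm. All names and
// code shapes here are assumptions made for this sketch.
const libraryOfElements = {
  Button: {
    generateCode: (el) => `createButton("${el.id}", "${el.text}");`,
    updateModel: (model, el) => {
      model.elements.push({ kind: "Button", id: el.id, text: el.text });
    },
  },
  Label: {
    generateCode: (el) => `createLabel("${el.id}", "${el.text}");`,
    updateModel: (model, el) => {
      model.elements.push({ kind: "Label", id: el.id, text: el.text });
    },
  },
};

// Screen-design-tool direction: a visual change becomes source code.
function elementToCode(element) {
  return libraryOfElements[element.kind].generateCode(element);
}

// Code-preview direction: a recognized code block becomes a model update.
function codeToModel(model, element) {
  libraryOfElements[element.kind].updateModel(model, element);
  return model;
}

const code = elementToCode({ kind: "Button", id: "ok", text: "OK" });
const model = codeToModel({ elements: [] }, { kind: "Label", id: "l1", text: "Hi" });
```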
FIG. 13 is a flow diagram illustrating a method 1300 of altering a hybrid application in an application development tool in accordance with an example embodiment. In an example embodiment, the method 1300 may be performed in a bi-directional conversion module, such as the bi-directional conversion module 1206 of FIG. 12. At operation 1302, a change in the application development tool (e.g., application development tool 102 of FIG. 1 or application development tool 1200 of FIG. 12) may be detected. This change may be a change created by a developer using the application development tool. Examples of such changes include alterations of screens in a screen design tool (e.g., screen design tool 1202) by, for example, dragging and dropping new elements onto a screen of the hybrid application, removing elements from the screen of the hybrid application, editing elements' properties, and creating or removing transitions between screens in the hybrid application. Additional examples of such changes include adding, removing, or editing source code, in a code preview tool (e.g., code preview tool 1204), written in a scripting language that can be rendered as a user interface representation. - At
operation 1304 it is determined whether the change is in a screen design tool or a code preview tool. If it is determined that the change is in the screen design tool, then at operation 1306 the changed element is identified. At operation 1308, a library of elements (e.g., library of elements 1208) is accessed to retrieve a code-generating algorithm corresponding to the changed element. Thus, whether a particular element is added, edited, or deleted, the code-generating algorithm for that element is obtained from the library of elements. At operation 1310, a corresponding change is made to the source code for the edited screen. This may include accessing the code-generating algorithm corresponding to the element being changed in the screen design tool and adding, editing, or deleting the source code for the element, as appropriate. In the case of an addition of an element, the algorithm corresponding to the element will generate source code for the element and insert it into the screen's source code at an appropriate place. In the case of an editing of an element, the source code corresponding to the prior version of the element may be removed and the code-generating algorithm corresponding to the element will generate and insert source code for the element in its place (or, alternatively, only code relating to the change within the element itself will be changed). In the case of deletion of an element, the source code may be scanned to identify a block of code matching the element, and then that block of code may be removed from the source code. At operation 1312 the code preview tool may be caused to display the changed source code. - If it is determined at
operation 1304 that the change is in a code preview tool, then at operation 1314 the changed source code is identified. This may include identifying a block of source code that has been added to the source code, identifying a block of source code that has been modified, or identifying a block of source code that has been deleted from the source code. At operation 1316, a library of elements is accessed to retrieve the algorithms to update the mobile workflow package for the element corresponding to the changed block of source code. Thus, whether a particular block of source code is added, modified, or deleted, the algorithms to update the mobile workflow package for the element that is the subject of that block of source code are obtained from the library of elements. - At
operation 1318, a corresponding change is made to the mobile workflow package for the source code. This may include locating the element (in the case of editing and deleting) corresponding to the source code being edited in the code preview tool and adding a new element, or editing or deleting the element to update the mobile workflow package, as appropriate. In the case of an addition of an element, the algorithms to update the mobile workflow package corresponding to the element are applied to insert the element into the mobile workflow package at an appropriate place. In the case of a deletion of an element, the mobile workflow package may be scanned to identify the algorithms to update the mobile workflow package matching the element, and then those algorithms are applied to remove the element from the mobile workflow package. In the case of the editing of an element, the algorithms corresponding to the element are applied to update the mobile workflow package. At operation 1320 the screen design tool may be caused to display the changed screen. -
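- The two branches of method 1300 can be sketched as a pair of dispatch functions, one per direction. Everything below (the addElement code format, the model shape, and the change descriptors) is an assumption made for illustration, not the patent's actual generated code.

```javascript
// Hypothetical format of a generated code block for one screen element.
function generateCodeFor(element) {
  return `addElement("${element.id}");\n`;
}

// Operations 1306-1312 (sketch): apply a screen-design change to the
// screen's source code string.
function applyScreenChange(sourceCode, change) {
  const block = generateCodeFor(change.element);
  if (change.type === "add") return sourceCode + block;
  // Deletion: scan for the block matching the element and remove it.
  if (change.type === "delete") return sourceCode.replace(block, "");
  throw new Error("unsupported change type: " + change.type);
}

// Operations 1314-1320 (sketch): map a changed block of source code back
// to an element and update the workflow-package model.
function applyCodeChange(model, change) {
  const match = /addElement\("([^"]+)"\)/.exec(change.codeBlock);
  if (!match) return model; // unrecognized code block: no model change
  const id = match[1];
  if (change.type === "add") model.elements.push({ id });
  if (change.type === "delete") {
    model.elements = model.elements.filter((el) => el.id !== id);
  }
  return model;
}

// Round trip: add an OK button visually, then delete it via its code.
let source = applyScreenChange("", { type: "add", element: { id: "okButton" } });
let model = { elements: [{ id: "okButton" }] };
model = applyCodeChange(model, { type: "delete", codeBlock: 'addElement("okButton");' });
```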
FIG. 14 is a block diagram illustrating a mobile device 1400, according to an example embodiment. The mobile device 1400 may include a processor 1402. The processor 1402 may be any of a variety of different types of commercially available processors 1402 suitable for mobile devices 1400 (for example, an XScale architecture microprocessor, a microprocessor without interlocked pipeline stages (MIPS) architecture processor, or another type of processor 1402). A memory 1404, such as a random access memory (RAM), a flash memory, or other type of memory, is typically accessible to the processor 1402. The memory 1404 may be adapted to store an operating system (OS) 1406, as well as application programs 1408, such as a mobile location enabled application that may provide location-based services to a user. The processor 1402 may be coupled, either directly or via appropriate intermediary hardware, to a display 1410 and to one or more input/output (I/O) devices 1412, such as a keypad, a touch panel sensor, a microphone, and the like. Similarly, in some embodiments, the processor 1402 may be coupled to a transceiver 1414 that interfaces with an antenna 1416. The transceiver 1414 may be configured to both transmit and receive cellular network signals, wireless data signals, or other types of signals via the antenna 1416, depending on the nature of the mobile device 1400. Further, in some configurations, a GPS receiver 1418 may also make use of the antenna 1416 to receive GPS signals. - Certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. Modules may constitute either software modules (e.g., code embodied (1) on a non-transitory machine-readable medium or (2) in a transmission signal) or hardware-implemented modules. A hardware-implemented module is a tangible unit capable of performing certain operations and may be configured or arranged in a certain manner.
In example embodiments, one or more computer systems (e.g., a standalone, client or server computer system) or one or more processors (e.g., processor 1402) may be configured by software (e.g., an application or application portion) as a hardware-implemented module that operates to perform certain operations as described herein.
- In various embodiments, a hardware-implemented module may be implemented mechanically or electronically. For example, a hardware-implemented module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations. A hardware-implemented module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware-implemented module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
- Accordingly, the term “hardware-implemented module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired) or temporarily or transitorily configured (e.g., programmed) to operate in a certain manner and/or to perform certain operations described herein. Considering embodiments in which hardware-implemented modules are temporarily configured (e.g., programmed), each of the hardware-implemented modules need not be configured or instantiated at any one instance in time. For example, where the hardware-implemented modules comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different hardware-implemented modules at different times. Software may accordingly configure a processor, for example, to constitute a particular hardware-implemented module at one instance of time and to constitute a different hardware-implemented module at a different instance of time.
- Hardware-implemented modules can provide information to, and receive information from, other hardware-implemented modules. Accordingly, the described hardware-implemented modules may be regarded as being communicatively coupled. Where multiple of such hardware-implemented modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses that connect the hardware-implemented modules). In embodiments in which multiple hardware-implemented modules are configured or instantiated at different times, communications between such hardware-implemented modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware-implemented modules have access. For example, one hardware-implemented module may perform an operation, and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware-implemented module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware-implemented modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
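The memory-mediated communication just described — one module storing its output, and a second module retrieving it at a later time — might be sketched as follows. The module names and the shared dictionary are illustrative assumptions, not part of the patent:

```python
# Hypothetical sketch: module A runs first and stores its output in a shared
# memory structure; module B runs at a later time, retrieves the stored
# output, and processes it. The two modules never execute concurrently.

shared_memory = {}  # stands in for a memory device both modules can access

def module_a(data):
    # perform an operation and store the result for a later module
    shared_memory["result"] = [x * 2 for x in data]

def module_b():
    # at a later time, retrieve and further process the stored output
    stored = shared_memory["result"]
    return sum(stored)

module_a([1, 2, 3])   # first module runs and writes to memory
total = module_b()    # second module runs later and reads from memory
print(total)          # 12
```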
- The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions. The modules referred to herein may, in some example embodiments, comprise processor-implemented modules.
- Similarly, the methods described herein may be at least partially processor-implemented. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented modules. The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processor or processors may be located in a single location (e.g., within a home environment, an office environment or as a server farm), while in other embodiments the processors may be distributed across a number of locations.
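Distributing the operations of a method among several processors, as described above, can be sketched with the Python standard library's `concurrent.futures`. This is only an illustration under assumed names: a thread pool stands in for multiple processors or machines, and `check_file` is a hypothetical operation:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical sketch: the operations of one method (checking several source
# files) are distributed among workers rather than run on a single executor.
def check_file(name):
    # stand-in operation; a real system might compile or lint the file
    return name, name.endswith(".src")

files = ["screen1.src", "screen2.src", "notes.txt"]

# pool.map fans the per-file operations out across the workers
with ThreadPoolExecutor(max_workers=3) as pool:
    results = dict(pool.map(check_file, files))

print(results)  # {'screen1.src': True, 'screen2.src': True, 'notes.txt': False}
```

Swapping `ThreadPoolExecutor` for `ProcessPoolExecutor`, or for a cluster-backed executor, changes where the operations run without changing the method itself — the distribution point the paragraph is making.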
- The one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., APIs).
- Example embodiments may be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations of them. Example embodiments may be implemented using a computer program product, e.g., a computer program tangibly embodied in an information carrier, e.g., in a machine-readable medium for execution by, or to control the operation of, data processing apparatus, e.g., a programmable processor, a computer, or multiple computers.
- A computer program can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, subroutine, or other unit suitable for use in a computing environment. A computer program can be deployed to be executed on one computer or on multiple computers at one site or distributed across multiple sites and interconnected by a communication network.
- In example embodiments, operations may be performed by one or more programmable processors executing a computer program to perform functions by operating on input data and generating output. Method operations can also be performed by, and apparatus of example embodiments may be implemented as, special purpose logic circuitry, e.g., an FPGA or an ASIC.
- The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. In embodiments deploying a programmable computing system, it will be appreciated that both hardware and software architectures merit consideration. Specifically, it will be appreciated that the choice of whether to implement certain functionality in permanently configured hardware (e.g., an ASIC), in temporarily configured hardware (e.g., a combination of software and a programmable processor), or a combination of permanently and temporarily configured hardware may be a design choice. Below are set out hardware (e.g., machine) and software architectures that may be deployed, in various example embodiments.
FIG. 15 is a block diagram of a machine in the example form of a computer system 1500 within which instructions 1524 may be executed for causing the machine to perform any one or more of the methodologies discussed herein. In alternative embodiments, the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine may be a personal computer (PC), a tablet PC, a set-top box (STB), a personal digital assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
- The example computer system 1500 includes a processor 1502 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), a main memory 1504, and a static memory 1506, which communicate with each other via a bus 1508. The computer system 1500 may further include a video display unit 1510 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)). The computer system 1500 also includes an alphanumeric input device 1512 (e.g., a keyboard or a touch-sensitive display screen), a user interface (UI) navigation (or cursor control) device 1514 (e.g., a mouse), a disk drive unit 1516, a signal generation device 1518 (e.g., a speaker), and a network interface device 1520.
- The disk drive unit 1516 includes a machine-readable medium 1522 on which is stored one or more sets of data structures and instructions 1524 (e.g., software) embodying or utilized by any one or more of the methodologies or functions described herein. The instructions 1524 may also reside, completely or at least partially, within the main memory 1504 and/or within the processor 1502 during execution thereof by the computer system 1500, with the main memory 1504 and the processor 1502 also constituting machine-readable media 1522.
- While the machine-readable medium 1522 is shown in an example embodiment to be a single medium, the term “machine-readable medium” may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more instructions 1524 or data structures. The term “machine-readable medium” shall also be taken to include any tangible medium that is capable of storing, encoding or carrying instructions 1524 for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure, or that is capable of storing, encoding or carrying data structures utilized by or associated with such instructions 1524. The term “machine-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, and optical and magnetic media. Specific examples of machine-readable media 1522 include non-volatile memory, including by way of example semiconductor memory devices, e.g., erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
- The instructions 1524 may further be transmitted or received over a communications network 1526 using a transmission medium. The instructions 1524 may be transmitted using the network interface device 1520 and any one of a number of well-known transfer protocols (e.g., HTTP). Examples of communication networks include a local area network (LAN), a wide area network (WAN), the Internet, mobile telephone networks, plain old telephone (POTS) networks, and wireless data networks (e.g., WiFi and WiMax networks). The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding or carrying instructions 1524 for execution by the machine, and includes digital or analog communications signals or other intangible media to facilitate communication of such software.
- Although an embodiment has been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the disclosure. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense. The accompanying drawings that form a part hereof show, by way of illustration and not of limitation, specific embodiments in which the subject matter may be practiced. The embodiments illustrated are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed herein. Other embodiments may be utilized and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. This Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.
- Such embodiments of the inventive subject matter may be referred to herein, individually and/or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept if more than one is in fact disclosed. Thus, although specific embodiments have been illustrated and described herein, it should be appreciated that any arrangement calculated to achieve the same purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the above description.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/566,537 US9348563B1 (en) | 2014-12-10 | 2014-12-10 | Bi-directional editing between visual screen designer and source code |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/566,537 US9348563B1 (en) | 2014-12-10 | 2014-12-10 | Bi-directional editing between visual screen designer and source code |
Publications (2)
Publication Number | Publication Date |
---|---|
US9348563B1 US9348563B1 (en) | 2016-05-24 |
US20160170720A1 true US20160170720A1 (en) | 2016-06-16 |
Family
ID=55969682
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/566,537 Active US9348563B1 (en) | 2014-12-10 | 2014-12-10 | Bi-directional editing between visual screen designer and source code |
Country Status (1)
Country | Link |
---|---|
US (1) | US9348563B1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20200004518A1 (en) * | 2018-06-28 | 2020-01-02 | Atlassian Pty Ltd | Systems and methods for tracking source code deployments |
US20200004511A1 (en) * | 2018-06-29 | 2020-01-02 | Mastercard International Incorporated | System and computer-implemented method for bidirectional translation between diagramming and implementation tools |
JP2021111362A (en) * | 2019-12-30 | 2021-08-02 | Bi Matrix Co., Ltd. | Business automation system using object action on excel sheet
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016144546A2 (en) * | 2015-03-06 | 2016-09-15 | Saggezza Inc. | Systems and methods for generating data visualization applications |
US10101976B2 (en) * | 2015-04-30 | 2018-10-16 | OpenMethods, Inc. | Method, system and apparatus for visual programming of interaction workflows for omni-channel customer contact centers with integrated customer relationship management |
US10157047B2 (en) * | 2015-10-19 | 2018-12-18 | Facebook, Inc. | Methods and systems for communicating application prototype information |
US9870062B2 (en) | 2015-11-11 | 2018-01-16 | Facebook, Inc. | Methods and systems for defining gestures for a user interface |
US10248300B2 (en) | 2016-05-16 | 2019-04-02 | Sap Se | Polymorph rendering for collaborative platforms |
US10726036B2 (en) | 2016-05-16 | 2020-07-28 | Sap Se | Source service mapping for collaborative platforms |
US11295273B2 (en) | 2016-05-16 | 2022-04-05 | Sap Se | Normalized object exposure for collaborative platforms |
US10078818B2 (en) * | 2016-12-20 | 2018-09-18 | Sap Se | Work routine management for collaborative platforms |
CN108259435B (en) * | 2016-12-29 | 2021-02-26 | 中国移动通信集团浙江有限公司 | Method and device for realizing hybrid application of access Web component |
CN107291461A (en) * | 2017-06-13 | 2017-10-24 | 中国五冶集团有限公司 | For the intelligentized Software Development Platform of building field |
US10831538B2 (en) * | 2018-10-29 | 2020-11-10 | Sap Se | Optimized management of application resources for mobile applications |
US10936307B2 (en) * | 2018-11-26 | 2021-03-02 | International Business Machines Corporation | Highlight source code changes in user interface |
US10754626B2 (en) | 2018-11-30 | 2020-08-25 | Shopify Inc. | Visual and code views in a process workflow user interface |
US20210055915A1 (en) * | 2019-08-23 | 2021-02-25 | Google Llc | No-coding machine learning pipeline |
CN112667218A (en) * | 2020-12-22 | 2021-04-16 | 北京锐安科技有限公司 | Processing method, device, equipment and storage medium |
CN117573127B (en) * | 2024-01-17 | 2024-04-23 | 中建三局信息科技有限公司 | Page building method and device, electronic equipment and medium |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7370315B1 (en) * | 2000-11-21 | 2008-05-06 | Microsoft Corporation | Visual programming environment providing synchronization between source code and graphical component objects |
US7681176B2 (en) * | 2005-03-04 | 2010-03-16 | Microsoft Corporation | Generating a graphical designer application for developing graphical models |
JP2007034571A (en) * | 2005-07-26 | 2007-02-08 | Fujitsu Ltd | Document processing program and document processing method |
US9122762B2 (en) * | 2007-07-18 | 2015-09-01 | Ebay, Inc. | Method and system to maintain a web page |
US8978006B2 (en) * | 2011-04-06 | 2015-03-10 | Media Direct, Inc. | Systems and methods for a mobile business application development and deployment platform |
US8798775B2 (en) * | 2011-06-28 | 2014-08-05 | Rockwell Automation Technologies, Inc. | Binding graphic elements to controller data |
WO2013124858A1 (en) * | 2012-02-26 | 2013-08-29 | Passcall Advanced Technologies (Transforma) Ltd. | Method and system for creating dynamic browser-based user interface by example |
- 2014-12-10: US application 14/566,537 filed; granted as US9348563B1 (status: Active)
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20200004518A1 (en) * | 2018-06-28 | 2020-01-02 | Atlassian Pty Ltd | Systems and methods for tracking source code deployments |
US10871957B2 (en) * | 2018-06-28 | 2020-12-22 | Atlassian Pty Ltd. | Systems and methods for promoting a source code revision for deployment in a target environment |
US11194565B2 (en) | 2018-06-28 | 2021-12-07 | Atlassian Pty Ltd. | Systems and methods for tracking source code deployments |
US11520574B2 (en) * | 2018-06-28 | 2022-12-06 | Atlassian Pty Ltd. | Systems and methods for promoting a source code revision for deployment in a target environment |
US20200004511A1 (en) * | 2018-06-29 | 2020-01-02 | Mastercard International Incorporated | System and computer-implemented method for bidirectional translation between diagramming and implementation tools |
US10908881B2 (en) * | 2018-06-29 | 2021-02-02 | Mastercard International Incorporated | System and computer-implemented method for bidirectional translation between diagramming and implementation tools |
JP2021111362A (en) * | 2019-12-30 | 2021-08-02 | Bi Matrix Co., Ltd. | Business automation system using object action on excel sheet
JP7070864B2 (en) | 2019-12-30 | 2022-05-18 | Bi Matrix Co., Ltd. | Business automation system using object actions on Excel sheet
Also Published As
Publication number | Publication date |
---|---|
US9348563B1 (en) | 2016-05-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9348563B1 (en) | Bi-directional editing between visual screen designer and source code | |
US11797479B2 (en) | Integrating object-based data integration tool with a version control system in centralized and decentralized environments | |
US11966774B2 (en) | Workflow generation using multiple interfaces | |
US10824403B2 (en) | Application builder with automated data objects creation | |
US10127206B2 (en) | Dynamic column groups in excel | |
CN102520841B (en) | Collection user interface | |
EP3726373B1 (en) | Creating an app method and system | |
US9176726B2 (en) | Method and apparatus for developing, distributing and executing applications | |
US11797273B2 (en) | System and method for enhancing component based development models with auto-wiring | |
US20130152038A1 (en) | Project management workflows | |
US10877846B2 (en) | Performing a closure merge operation | |
US20140101635A1 (en) | Automated generation of two-tier mobile applications | |
US20150089334A1 (en) | Workbook composer | |
US20130151937A1 (en) | Selective image loading in mobile browsers | |
US20160188302A1 (en) | Automatic generation of metadata-based cross-platform mobile applications | |
US20170300319A1 (en) | Automatic submission of applications to applications stores | |
US20140143752A1 (en) | Systems and methods for providing environments as a service | |
TW201606547A (en) | Subscriber defined dynamic eventing | |
US20240160418A1 (en) | Low code no code ci/cd platform | |
JP2016040643A (en) | Image data management system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SAP SE, GERMANY Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:XUE, YUNJIAO;BRANDOW, DAVID;REEL/FRAME:034467/0640 Effective date: 20141210 |
FEPP | Fee payment procedure |
Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |