CN114168128A - Method for generating responsive page, graphical user interface and electronic equipment - Google Patents
- Publication number
- CN114168128A (application CN202010949363.5A)
- Authority
- CN
- China
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/30—Creation or generation of source code
- G06F8/38—Creation or generation of source code for implementing user interfaces
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
Abstract
An embodiment of the application provides a method for generating a responsive page. The method allows a user to combine atomic capabilities to convert an ordinary design draft into a responsive layout, so that a UI designer can efficiently design a responsive page that adapts to various screen resolutions. The atomic capabilities used by the responsive layout, together with the parameters describing them, can be converted into code fragments that a developer can read and use, saving software developers programming work.
Description
Technical Field
The present application relates to the field of electronic technologies, and in particular, to a method for generating a responsive page, a graphical user interface, and an electronic device.
Background
More and more devices, such as mobile phones, tablet computers, in-vehicle systems, and smart screens, are joining the hardware ecosystem. The differences between the screen resolutions of these devices keep growing, and the number of screen-resolution cases that user interface (UI) designers and software developers must consider is rising rapidly, which significantly increases their workload.
Disclosure of Invention
The method for generating a responsive page provided by the application enables a UI designer to efficiently design a responsive page that adapts to various screen resolutions, and can output code segments that a software developer can read and use, saving the developer programming work.
In a first aspect, a method for generating a responsive page is provided. The method may include: the electronic device detects an operation of importing a design draft. The electronic device may display the interface elements drawn in the design draft in a first page layout, the first page layout being the layout described by the design draft. Thereafter, the electronic device may detect an operation of selecting a first layout adaptation rule for a first interface element, and may assign the first layout adaptation rule to the first interface element. The first layout adaptation rule may be used to adjust one or more of the position, size, and layout of the first interface element. The electronic device may then detect an operation of selecting a second screen size and, in response, display the interface elements drawn in the design draft in a second page layout, in which the layout of the first interface element is determined according to the parameters describing the first layout adaptation rule. The second page layout is different from the first page layout, and the second screen size is different from the first screen size.
In the first aspect, the operation of importing the design draft may be, for example, selecting "import file" under the "insert" option of main menu 101 as shown in fig. 3A, or dragging the design draft into canvas 104 as shown in fig. 3B.
In the first aspect, the first page layout is the layout designed in the design draft. As shown in fig. 4B, the electronic device can display the interface elements drawn in the design draft in the canvas 104, following the design draft. Displaying the interface elements in conformity with the design draft means that both the style of each interface element (pattern, color, shape, etc.) and the layout of the interface elements on the canvas 104 match the design draft.
In the first aspect, the first interface element may be one interface element or a group of interface elements selected by the user from those drawn in the design draft. For example, as shown in fig. 5B, the user may select (e.g., frame-select) interface elements 1D1, 1D121, 1D122, 1D123, 1D124, and 1D125 at once; the first interface element may then be this group of interface elements.
In the first aspect, the first layout adaptation rule may be a layout adaptation rule (i.e., an atomic capability) selected by the user. For example, as shown in fig. 6A, upon detecting that the user selects the group of interface elements 1D121, 1D122, 1D123, 1D124, 1D125, the electronic device may further detect an operation of the user assigning the "hiding capability" to the selected group. The first layout adaptation rule may then be the "hiding capability", which may also be referred to as the "hiding" rule.
In the first aspect, the second screen size may be a screen size selected by the user, and the operation of selecting the second screen size is an operation of viewing the responsive layout at the second screen size. For example, as shown in fig. 7A, upon detecting that the user selects the option "tablet 1", the electronic device may display the screen size (i.e., screen resolution) of "tablet 1", such as "length: 2560px, width: 1440px". The second screen size may then be the screen size of "tablet 1".
In the first aspect, the second page layout may be determined according to the atomic capability (i.e., layout adaptation rule) assigned by the user to the interface element in the design draft and the parameter for describing the atomic capability.
It can be seen that the method provided by the first aspect allows a UI designer to deconstruct a design draft and assign atomic capabilities to the interface elements drawn in it simply and efficiently, thereby converting the design draft into a responsive layout. It also lets the UI designer inspect how the responsive layout adapts to different screen sizes, which is intuitive and efficient: the designer can modify the atomic capabilities promptly, continuously refine the responsive layout, and quickly obtain a responsive layout that adapts to multiple screen sizes.
With reference to the first aspect, in some embodiments, the first layout adaptation rule may include one or both of the following: a first type of layout adaptation rule, used to adjust the position and size of the first interface element, and a second type of layout adaptation rule, used to adjust the layout of the first interface element. The first type may also be referred to as an adaptive change capability: when the size of the page changes, the size and position of the interface element itself change to adapt to the page. The second type may be referred to as an adaptive layout capability: when the page size changes, the internal layout of the interface element changes to accommodate the page.
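The two rule categories can be sketched as data types. This is an illustrative model under assumed names (`Box`, `SelfAdaptRule`, `LayoutAdaptRule`), not an interface defined by the application:

```typescript
interface Box { x: number; y: number; width: number; height: number; }

// First type ("adaptive change"): recompute the element's own position/size.
interface SelfAdaptRule {
  kind: "self";
  apply(element: Box, container: Box): Box;
}

// Second type ("adaptive layout"): rearrange the element's internal children.
interface LayoutAdaptRule {
  kind: "layout";
  apply(children: Box[], container: Box): Box[];
}

type AtomicCapability = SelfAdaptRule | LayoutAdaptRule;

// Example of the first type: keep the element at a fixed size, centered in
// its container, so it follows the container wherever the page puts it.
const centerRule: SelfAdaptRule = {
  kind: "self",
  apply: (el, c) => ({
    x: c.x + (c.width - el.width) / 2,
    y: c.y + (c.height - el.height) / 2,
    width: el.width,
    height: el.height,
  }),
};
```

A 20x20 element centered in a 100x100 container would land at (40, 40) under this sketch.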
In combination with the first aspect, in some embodiments, the first interface element may be one or more interface elements.
With reference to the first aspect, in some embodiments, the first layout adaptation rule may be one or more layout adaptation rules.
With reference to the first aspect, in some embodiments, after the electronic device detects an operation of importing the design, the electronic device may further display a plurality of layout adaptation rules, where the plurality of layout adaptation rules includes the first layout adaptation rule. The plurality of layout adaptation rules may be displayed in an atomic capability list 103 as shown in fig. 3A-3B.
With reference to the first aspect, in some embodiments, if the first interface element is a plurality of interface elements and the first layout adaptation rule is a plurality of layout adaptation rules, then, as shown in fig. 17, the electronic device may classify the description parameters of the plurality of layout adaptation rules into the following categories: parameters that determine the size of interface elements, parameters that determine the number of interface elements, and parameters that determine the layout of interface elements. The size of the first interface element is determined according to the size parameters, and the number of interface elements that can be displayed in the container holding the first interface element is determined according to the number parameters. Once the size and the displayable number are determined, the electronic device may determine the layout of the first interface element according to the layout parameters.
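The three-step evaluation described above (size, then displayable count, then layout) might look roughly like the following sketch; the function name, parameter names, and the concrete formulas are assumptions for illustration only:

```typescript
interface EngineParams {
  itemWidth: number;   // parameter determining interface-element size
  spacing: number;     // parameter determining layout (gap between items)
  maxCount: number;    // parameter bounding how many items may be shown
}

function computeRowLayout(
  containerWidth: number,
  totalItems: number,
  p: EngineParams
): { count: number; positions: number[] } {
  // Step 1: element size is fixed by the size parameter (p.itemWidth).
  // Step 2: how many items fit in the container, capped by maxCount.
  const fit = Math.floor(
    (containerWidth + p.spacing) / (p.itemWidth + p.spacing)
  );
  const count = Math.min(fit, p.maxCount, totalItems);
  // Step 3: lay the surviving items out left-to-right with equal spacing.
  const positions: number[] = [];
  for (let i = 0; i < count; i++) {
    positions.push(i * (p.itemWidth + p.spacing));
  }
  return { count, positions };
}
```

For a 100px container, 30px items, and 5px spacing, this sketch fits three items at offsets 0, 35, and 70.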
With reference to the first aspect, in some embodiments, after the electronic device detects an operation of importing a design draft, the electronic device may further display a first list, in which the interface elements drawn in the design draft are grouped by layer. The first list may be the interface element list 102 shown in fig. 3A-3B.
In connection with the first aspect, in some embodiments, the electronic device may detect an operation by the user to export a code segment, for example, entering the "file" option in main menu 101, clicking the secondary menu item "save as", and choosing to save the responsive layout shown in canvas 104 as code. In response to this operation, the electronic device may convert the first layout adaptation rule assigned to the first interface element, together with the parameters describing the first layout adaptation rule, into a code fragment, as shown in fig. 9.
Finally, the electronic device may output the code snippet. In this way, the programming effort of the software developer can be saved.
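As a rough illustration of the export step, the rules assigned to interface elements and their description parameters could be serialized into a declarative fragment. The JSON shape below is hypothetical, not the format produced by the described tool:

```typescript
interface RuleAssignment {
  elementId: string;                        // e.g. "1D121"
  capability: string;                       // e.g. "hide" or "stretch"
  params: Record<string, number | string>;  // the rule's description parameters
}

// Serialize all assignments into one code fragment a developer can consume.
// A real exporter might emit CSS, XML, or platform-specific layout code.
function exportCodeFragment(assignments: RuleAssignment[]): string {
  return JSON.stringify({ responsiveLayout: assignments }, null, 2);
}
```

A developer-side tool could parse this fragment back and generate concrete layout code per target platform.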
In combination with the first aspect, in some embodiments, the first layout adaptation rule includes one or more of the following rules (each of which is an atomic capability): a scaling rule, a stretching rule, a hiding rule, a folding rule, an equipartition rule, an occupancy rule, and an extension rule. Wherein,
the scaling rule may be used to specify that the scale of the interface element is unchanged, and to scale the interface element to fit the dimensional change of the container of interface elements;
the stretching rules may be used to specify that the interface element is stretched in a user-selected stretching direction to adapt to a dimensional change of the container of interface elements;
the hiding rules may be used to specify display priorities for a set of interface elements to accommodate changes in size of containers for the set of interface elements;
the folding rule may be used to specify that a group of interface elements is to be adjusted from a single-row arrangement to a double-row arrangement or a multi-row arrangement, or that a group of interface elements is to be adjusted from a double-row arrangement or a multi-row arrangement to a single-row arrangement, depending on the available space of the container for the group of interface elements;
the equipartition rules may be used to specify that the spacing between a set of interface elements remains consistent;
the occupancy rule may be used to specify a space occupancy for a set of interface elements within a container for the set of interface elements;
an extension rule may be used to specify that the number of interface elements that can be displayed inside a container for a set of interface elements varies according to the spatial size of the container.
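Two of the rules above admit a direct arithmetic reading. The following sketch implements the scaling rule (preserve aspect ratio while fitting the container) and the equipartition rule (keep the gaps in a row of items equal) under assumed function names; it is not code from the application:

```typescript
// Scaling rule: keep the element's aspect ratio unchanged and scale it
// to fit the container's dimensions.
function scaleToContainer(
  w: number, h: number, containerW: number, containerH: number
): { w: number; h: number } {
  const s = Math.min(containerW / w, containerH / h);
  return { w: w * s, h: h * s };
}

// Equipartition rule: a row of n equal-width items leaves (n + 1) equal
// gaps (counting both edges); return that common gap.
function equalSpacing(containerW: number, itemW: number, n: number): number {
  return (containerW - n * itemW) / (n + 1);
}
```

Under this sketch, a 200x100 element in a 100x100 container scales to 100x50, and four 20px items in a 100px container get a 4px gap.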
In a second aspect, an electronic device is provided, which comprises a plurality of functional units for performing the method provided in any one of the possible implementation manners of the first aspect.
In a third aspect, an electronic device is provided, which is operable to perform the method described in the first aspect. The electronic device may include: a Bluetooth communication processing module, a display screen, a processor, and a memory, wherein the memory stores one or more programs, and the processor invokes the one or more programs to cause the electronic device to execute the method described in the first aspect.
In a fourth aspect, there is provided a computer-readable storage medium having stored thereon instructions which, when run on a computer, cause the computer to perform the method described in the first aspect above.
In a fifth aspect, there is provided a computer program product containing instructions which, when run on a computer, cause the computer to perform the method described in the first aspect above.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments or the background art of the present application, the drawings used in the embodiments or the background art are described below.
FIG. 1 shows the screen resolutions of devices of different manufacturers and models on the market;
fig. 2 shows a simplified flow of a method for generating a responsive page provided by an embodiment of the present application.
Fig. 3A-3B illustrate user interfaces provided by embodiments of the present application for designing responsive pages.
Fig. 4A exemplarily shows a layer structure of the design draft.
FIG. 4B illustrates the interactive functionality for the deconstructed design draft provided by the user interface.
FIGS. 5A-5B, 6A-6B illustrate interactive functions provided by a user interface to select interface elements, assign atomic capabilities.
FIGS. 7A-7C, 8A-8B illustrate interactive functionality provided by a user interface to view responsive layouts at different screen sizes.
FIG. 9 illustrates the conversion of a responsive page into code.
Fig. 10A simply illustrates the scaling capability.
FIG. 10B illustrates an example of an interface element using the scaling capability.
Fig. 11A simply illustrates the stretching capability.
FIG. 11B illustrates an example of an interface element using the stretching capability.
Fig. 12A simply illustrates the hiding capability.
FIG. 12B illustrates one example of an interface element using the hiding capability.
FIG. 12C illustrates another example of an interface element using the hiding capability.
Fig. 13A simply illustrates the folding capability.
FIG. 13B illustrates one example of an interface element using the folding capability.
FIG. 13C illustrates another example of an interface element using the folding capability.
Fig. 14A simply illustrates the equipartition capability.
FIG. 14B illustrates one example of an interface element using the equipartition capability.
Fig. 15A simply illustrates the occupancy capability.
FIG. 15B illustrates one example of an interface element using the occupancy capability.
Fig. 16A simply illustrates the extension capability.
FIG. 16B illustrates one example of an interface element using the extension capability.
FIG. 17 shows a flow of a responsive layout computation engine.
FIGS. 18A-18B illustrate two examples of atomic capability combination and nesting.
fig. 19 shows a structure of an electronic device provided in an embodiment of the present application.
Detailed Description
At present, the screen resolutions of devices of different manufacturers and models on the market generally differ. As exemplarily shown in fig. 1, the UI designer needs to divide the many resolutions into several basic sizes, such as a mobile phone screen, a tablet in portrait orientation, a tablet in landscape orientation, a desktop display, a large screen, an unfolded folding screen, a folded folding screen, and so on.
Because there are great differences between these basic sizes, even for a single page of an application the UI designer needs to design the page separately for each basic size and output multiple sets of design drafts. The UI designer then delivers these design drafts to the UI software developers, who program each set to generate the different pages. This approach involves a heavy workload, and the repetitive design and programming work greatly reduces page-development efficiency. Moreover, when a previously unconsidered resolution appears, the original pages often adapt poorly, and the UI designers and developers must redesign and redevelop the pages. Such a method is no longer workable in an era of ever more varied screen resolutions.
An embodiment of the application provides a method for generating a responsive page. The method allows a user to combine atomic capabilities to convert an ordinary design draft into a responsive layout, so that a UI designer can efficiently design a responsive page that adapts to various screen resolutions. The atomic capabilities used by the responsive layout, together with the parameters describing them, can be converted into code fragments that a developer can read and use, saving software developers programming work.
Fig. 2 shows a simplified flow of the method for generating a responsive page provided by an embodiment of the present application. The method mainly runs on electronic devices used by UI designers, such as personal computers, tablet computers, and folding-screen mobile phones. As shown in fig. 2, executing the method turns one input into two outputs: the input is a design draft, and the outputs are a responsive page and a code snippet. The main process flow may include: first, the design draft is obtained and deconstructed, yielding each interface element drawn in it. Then, in response to the user's operation of assigning an atomic capability to a selected design module, the atomic capability is assigned to that module, where a design module may be a group of interface elements; for example, the group of interface elements 1D21, 1D22, 1D23, 1D24, 1D25 in fig. 5A may later be referred to as the "play control" module. Finally, a responsive layout calculation engine converts the design draft into a responsive page and outputs it, and a capability parameter derivation module converts the atomic capabilities used by the responsive page, together with the parameters describing them, into code segments and outputs the code segments.
In the present application, an atomic capability, such as the scaling capability, stretching capability, hiding capability, folding capability, equipartition capability, occupancy capability, or extension capability, refers to a layout adaptation rule. A layout adaptation rule may specify how an interface element or a group of interface elements responds to screen-size changes by adjusting its position, size, internal layout, and so on. An atomic capability may be associated with a set of parameters used to describe it. For example, a set of parameters for the stretching capability may include a stretching direction and fixed margins, where the stretching direction specifies whether stretching is horizontal or vertical, and the fixed margins specify the top, bottom, left, and right margins between the stretched interface element and its container. These typical atomic capabilities are described in detail later and are not expanded here.
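The stretching capability with its two description parameters, a stretching direction and fixed margins, can be sketched as follows; the names and the exact margin semantics are illustrative assumptions, not the patent's definitions:

```typescript
interface Margins { top: number; bottom: number; left: number; right: number; }

// Stretch an element within its container: the fixed margins stay constant,
// and the element absorbs the container's size change along the chosen axis.
function stretch(
  el: { x: number; y: number; width: number; height: number },
  container: { width: number; height: number },
  direction: "horizontal" | "vertical",
  m: Margins
): { x: number; y: number; width: number; height: number } {
  if (direction === "horizontal") {
    // Keep left/right margins fixed; width follows the container.
    return { ...el, x: m.left, width: container.width - m.left - m.right };
  }
  // Vertical: keep top/bottom margins fixed; height follows the container.
  return { ...el, y: m.top, height: container.height - m.top - m.bottom };
}
```

With 10px side margins in a 200px-wide container, a horizontally stretched element ends up 180px wide regardless of its original width.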
In the present application, a responsive page is a page whose layout differs under different screen sizes. It can cover screen resolutions ranging from desktop displays to mobile phone screens, adapting to screens of different resolutions from small to large. That is, for the various screen sizes, only one set of responsive pages needs to be developed, rather than multiple separate sets of pages. Currently, responsive pages are generally implemented in code, for example with the media query technology of Cascading Style Sheets (CSS), which loads different layout description code for several mainstream screen sizes. Implemented this way, a responsive page requires heavy coding, which burdens UI software developers and cannot cope with unknown screen sizes. The method provided by the present application, by contrast, reduces the programming workload, improves UI development efficiency, and adapts to a wider range of screen sizes more efficiently.
The user interface provided by the present application for designing a responsive page is described below with reference to fig. 3A-3B, fig. 4A-4B, fig. 5A-5B, fig. 6A-6B, fig. 7A-7C, and fig. 8A-8B.
Fig. 3A-3B illustrate user interfaces provided by embodiments of the present application for designing responsive pages.
As shown in fig. 3A-3B, an electronic device such as a personal computer or a tablet computer may display a user interface, which may include: a main menu 101, an interface element list 102, an atomic capability list 103, and a canvas 104. Wherein,
The interface element list 102 may be used to present the various interface elements drawn in the design draft, such as text, icons, buttons, tabs, labels, progress bars, containers, and the like. The interface element list 102 may specifically show the mapping relationships between each layer and the interface elements in the design draft; see fig. 4A to fig. 4B described later.
The atomic capability list 103 shows typical atomic capabilities such as the scaling capability, stretching capability, hiding capability, folding capability, equipartition capability, occupancy capability, and extension capability. An atomic capability may be associated with a set of parameters used to describe it. For example, a set of parameters for the stretching capability may include a stretching direction and fixed margins, where the stretching direction specifies whether stretching is horizontal or vertical, and the fixed margins specify the top, bottom, left, and right margins between the stretched interface element and its container. These typical atomic capabilities will be described in detail later and are not expanded here.
Canvas 104 may be used to present the page layout of the design draft, typically at a default design size.
As shown in fig. 3A-3B, the electronic device may detect an operation of importing a design by the user, for example, an operation of selecting an import file of an insertion option in main menu 101 to import the design as shown in fig. 3A, or an operation of dragging the design into canvas 104 as shown in fig. 3B.
In response to the operation of importing the design draft, the electronic device may parse the design draft, extract the interface elements drawn in it, and determine the mapping relationships between the interface elements and the layers of the design draft. For example, as shown in fig. 4A, assuming the page drawn by the design draft is a music player page, the electronic device may parse out the following interface elements drawn in the design draft: text input boxes such as "text box 1", "text box 2", "text box 3", and "text box 4", and icons such as "previous", "next", and "start play". The electronic device may also determine that the design draft consists of 6 layers, along with the mapping relationships between interface elements and layers; for example, "text box 1" and "text box 2" may be drawn on layer 2, while icons such as "previous", "next", and "start play" may be drawn on layer 6.
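The parse result described here, interface elements grouped under the layers that contain them, can be modeled with a simple map. The structure and function name are assumptions for illustration, following the fig. 4A example:

```typescript
type LayerMap = Map<string, string[]>;

// Group parsed interface elements by the design-draft layer they belong to,
// yielding the layer-to-elements mapping shown in the interface element list.
function groupByLayer(
  elements: { name: string; layer: string }[]
): LayerMap {
  const map: LayerMap = new Map();
  for (const el of elements) {
    const bucket = map.get(el.layer) ?? [];
    bucket.push(el.name);
    map.set(el.layer, bucket);
  }
  return map;
}
```

For the music-player example, "text box 1" would land under layer 2 while "previous" and "next" land under layer 6.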
Extracting the interface elements drawn in the design draft may mean extracting the image resources referenced by the interface elements. For example, extracting the "previous" icon from the design draft means extracting the vector image referenced by the icon.
After parsing the design draft, the electronic device may also display a user interface, as exemplarily shown in fig. 4B, to present the deconstructed design draft to the user.
FIG. 4B illustrates the interactive functionality for the deconstructed design draft provided by the user interface.
As shown in fig. 4B, the electronic device can display the interface elements drawn in the design draft in the canvas 104, following the design draft. Displaying the interface elements in conformity with the design draft means that both the style of each interface element (pattern, color, shape, etc.) and the layout of the interface elements on the canvas 104 match the design draft.
For example, as shown in FIG. 4B, following the design draft shown in fig. 4A, in canvas 104: the interface elements 1A11, 1B21 and 1A13 are displayed side by side at the top of the page, the interface element 1C1 for showing the name of a song and the interface element 1C21 for showing the lyrics are displayed in sequence vertically below, the interface elements 1D11, 1D12 and 1D13 for showing the playing progress are displayed side by side below the interface element 1C21, and the interface elements 1D121, 1D122, 1D123, 1D124 and 1D125 are displayed side by side at the bottom of the page. That is, the interface elements displayed in canvas 104 embody the page layout described by the design draft.
The difference from the design draft is that the page layout presented in canvas 104 is composed of interface elements, which, after being assigned atomic capabilities, can adaptively respond to the design size selected by the user. The design draft, in contrast, is a static image file that reflects only the page layout at the design size considered when it was drawn.
As shown in fig. 4B, the electronic device may further display identification information (name, number, and the like) of the interface elements in the canvas 104 in the interface element list 102, and specifically, may display each interface element in groups according to the layer to which it belongs.
To further present the association between the same interface element in the canvas 104 and in the list 102, when a user selection of an interface element (e.g., the "previous" icon) in the list 102 is detected, the electronic device can highlight that interface element (e.g., the "previous" icon) in the canvas 104. Conversely, when the user is detected selecting an interface element in the canvas 104, the electronic device can highlight that interface element in the list 102. Highlighting may include, but is not limited to: thickening the frame, changing the background color, zoomed display, and the like. In this way, a user can intuitively and efficiently understand the relationships among layers, interface elements, and layout, which facilitates UI design.
As can be seen from fig. 4B, after the design draft is imported, the various interface elements deconstructed from it are shown in the user interface, for example the interface elements displayed in the canvas 104 following the design draft, and the interface elements under the various layers in the list 102. This makes it convenient for UI designers to select interface elements for atomic capability assignment.
FIGS. 5A-5B and 6A-6B illustrate the interactive functionality provided by the user interface for selecting interface elements and assigning atomic capabilities.
As shown in fig. 5A and 5B, the electronic device may detect an operation of a user selecting an interface element. In response to the operation, the electronic device may display the user-selected interface element in the selected state and record the user-selected interface element.
For example, as shown in FIG. 5A, the user may click on interface elements 1D1, 1D121, 1D122, 1D123, 1D124, 1D125 one after another. For another example, as shown in FIG. 5B, the user may select (e.g., by framing) interface elements 1D1, 1D121, 1D122, 1D123, 1D124, 1D125 in one operation. The interface elements may be highlighted to indicate that they are in the selected state.
The user may choose to assign an atomic capability to a single interface element, or to multiple interface elements at once.
Further, as shown in fig. 6A and 6B, after detecting the user's selection of interface elements, the electronic device may also detect an operation of assigning an atomic capability to the selected interface elements. In response to the operation, the electronic device can assign the layout adaptation rule specified by that atomic capability to the interface elements selected by the user. Then, when the screen size changes, the selected group of interface elements performs layout adaptation according to the layout adaptation rule specified by the assigned atomic capability, that is, the selected group of interface elements responds to the screen size change.
For example, as shown in fig. 6A, upon detecting a user selection of a set of interface elements 1D121, 1D122, 1D123, 1D124, 1D125, the electronic device may further detect an operation of the user assigning "hidden capabilities" to the selected set of interface elements. In response to this operation, the electronic device may assign a layout adaptation rule specified by the "hiding capability" to the interface element selected by the user.
Wherein assigning the "hiding capability" to the selected set of interface elements may comprise: setting relevant parameters describing the "hiding capability", such as the hiding direction and hiding priority. For example, as shown in FIG. 6A, the hiding direction may be set by the "layout direction" option, and the hiding priority may be set by filling in a priority (e.g., the priority of interface element 1D123 is "1"). Then, when the screen size changes, the selected group of interface elements 1D121, 1D122, 1D123, 1D124, 1D125 performs layout adaptation according to the layout adaptation rule specified by the "hiding capability", for example hiding interface elements such as 1D121 when the screen size decreases, that is, responding to the screen size change.
For another example, as shown in fig. 6B, upon detecting that the user has framed a group of interface elements 1D1, 1D121, 1D122, 1D123, 1D124, 1D125, the electronic device may further detect that the user has assigned the "folding capability" to the selected group of interface elements. The interface element 1D1 is a container containing the group of interface elements 1D11, 1D12 and 1D13. In response to this operation, the electronic device may assign the layout adaptation rule specified by the "folding capability" to the interface elements selected by the user.
Wherein assigning the "folding capability" to the selected set of interface elements may include: setting relevant parameters describing the folding capability, such as the folding direction, alignment, element spacing, and reference size. For example, as shown in fig. 6B, the folding direction may be set by the "folding direction" option, the alignment by the "horizontal alignment" or "vertical alignment" option, and the element spacing by the "horizontal spacing" or "vertical spacing" option. Then, when the screen size changes, the selected interface elements 1D1, 1D121, 1D122, 1D123, 1D124, and 1D125 perform layout adaptation according to the layout adaptation rule specified by the "folding capability", for example wrapping into additional rows when the screen size becomes smaller, that is, responding to the screen size change.
Through the interactions exemplarily illustrated in FIGS. 5A-5B and 6A-6B, a user may assign atomic capabilities to all the interface elements drawn in the design draft, thereby obtaining a responsive page layout that can adapt to screen size changes. Atomic capabilities such as the "hiding capability" and "folding capability" will be described in detail later.
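The selection-and-assignment flow above can be sketched as a simple data model. This is only an illustrative sketch: the `InterfaceElement` class, the `assign_capability` function, and the parameter names are hypothetical, not the tool's actual API.

```python
# Hypothetical sketch of recording atomic-capability assignments on selected
# interface elements; all names are illustrative, not the tool's actual API.

class InterfaceElement:
    def __init__(self, element_id):
        self.element_id = element_id
        self.capability = None   # name of the assigned atomic capability
        self.parameters = {}     # parameters of the layout adaptation rule

def assign_capability(selected_elements, capability, **parameters):
    """Assign one atomic capability, with its parameters, to a group."""
    for element in selected_elements:
        element.capability = capability
        element.parameters = dict(parameters)

# Usage: give the group 1D121..1D125 the "hiding capability", hiding in the
# horizontal direction with per-element hiding priorities.
group = [InterfaceElement(f"1D12{i}") for i in range(1, 6)]
assign_capability(group, "hide", direction="horizontal",
                  priority={"1D123": 1, "1D122": 2, "1D124": 2,
                            "1D121": 3, "1D125": 3})
```

Storing the rule parameters on each element in this way is what lets a later layout pass re-evaluate the group whenever the screen size changes.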
FIGS. 7A-7C, 8A-8B illustrate interactive functionality provided by a user interface to view responsive layouts at different screen sizes.
As shown in fig. 7A-7C, the user interface exemplarily shown in fig. 3A-3B may further include: a screen size selection button. The electronic device may detect an operation of a user clicking a button, and in response to the operation, may display a plurality of screen size options. These multiple screen size options can be divided into the following categories: the "cell phone" option, the "tablet" option, the "watch" option, the "television" option, the "custom" option. Each category of screen size options may include a plurality of specific screen size options.
For example, as shown in fig. 7A, upon detecting that the user selects the "tablet" option, the electronic device may display a plurality of tablet screen size options, such as "tablet 1", "tablet 2", "tablet 3". When the user is detected to select the "tablet 1" option, the electronic device may display the screen size (i.e., screen resolution) of "tablet 1", e.g., "length: 2560px, width: 1440px", and may even display the pixel density of "tablet 1", e.g., "pixel density (dpi): 300".
For another example, as shown in fig. 7B, when the user is detected to select the "mobile phone" option, the electronic device may display a plurality of mobile phone screen size options, such as "mobile phone 1", "mobile phone 2", and "mobile phone 3". When the user is detected to select the "mobile phone 2" option, the electronic device may display the screen size of "mobile phone 2", e.g., "length: 2340px, width: 1080px", and even the pixel density of "mobile phone 2", e.g., "pixel density (dpi): 360".
Other options such as "watch" and "television" may likewise provide a number of specific screen size options, similar to FIGS. 7A and 7B.
In addition, as shown in FIG. 7C, the user interfaces exemplarily shown in FIGS. 3A-3B may also enable a user to manually set the resolution to freely define the screen size.
As shown in fig. 8A-8B, in response to a user operation to change the screen size, the electronic device can display in the canvas 104 a new page layout generated by adapting to the new screen size according to the atomic capabilities assigned to the interface elements. In this way, the user can check the rendered effect of the responsive layout at different screen sizes and intuitively judge whether the atomic capabilities assigned to the interface elements in the design draft are appropriate, which helps a UI designer adjust them and more efficiently obtain a responsive layout that adapts to various screen sizes.
Specifically, as shown in fig. 8A, when the screen size is detected to change from "mobile phone 2" (resolution 2340px × 1080px) to "tablet 1" (resolution 2560px × 1440px), that is, both the horizontal and vertical resolutions increase, the electronic device may generate a page layout adapted to "tablet 1" according to the atomic capability assigned to each interface element drawn in the design draft, and display that page layout in the canvas 104.
Comparing the page layouts shown in fig. 4B and fig. 8A, the interface element group formed by the container 1D1 (which arranges the interface elements 1D11, 1D12 and 1D13 for showing the playing progress) and the interface elements 1D121, 1D122, 1D123, 1D124 and 1D125 changes from being displayed in two separate rows to being displayed in a single row, because the group was assigned the folding capability as shown in fig. 6B and the horizontal resolution of the screen has increased.
Specifically, as shown in fig. 8B, when it is detected that the screen size changes from "mobile phone 2" (resolution 2340px × 1080px) to "custom device 2" (resolution 360px × 480px), or from "tablet 1" (resolution 2560px × 1440px) to "custom device 2" (resolution 360px × 480px), that is, both the horizontal and vertical resolutions decrease, the electronic device may generate a page layout adapted to "custom device 2" according to the atomic capability assigned to each interface element drawn in the design draft, and display that page layout in the canvas 104.
Comparing the page layouts shown in fig. 4B and fig. 8B, the interface element group formed by the interface elements 1D121, 1D122, 1D123, 1D124, and 1D125 is assigned the hiding capability (hiding direction: horizontal) as shown in fig. 6A, so the interface elements 1D121 and 1D125 are hidden to adapt to the screen with smaller horizontal resolution. To fit the smaller vertical resolution, the interface element 1C21 for showing lyrics is hidden, because the interface element 1C1 for showing the song name and the interface element 1C21 for showing lyrics are assigned a hiding capability as a group, with the hiding direction being vertical.
As can be seen from fig. 3A to 3B, fig. 4A to 4B, fig. 5A to 5B, fig. 6A to 6B, fig. 7A to 7C, and fig. 8A to 8B, the user interface for designing a responsive page provided in the embodiments of the present application supports a UI designer in simply and efficiently deconstructing a design draft and assigning atomic capabilities to the interface elements drawn in it, so as to efficiently convert the design draft into a responsive layout. It also supports the UI designer in checking how the responsive layout adapts to different screen sizes, which is intuitive and efficient, allows the atomic capabilities to be modified in a timely and effective manner, and continuously improves the responsive layout, so that a responsive layout adaptable to multiple screen sizes can be obtained simply and quickly.
In addition, as shown in fig. 9, the user interface for designing a responsive page provided by the embodiment of the present application may further support a user in exporting a code fragment describing the responsive page, which can be read and used by a software developer, thereby saving the software developer programming work. The electronic device displaying the user interface may detect an operation by the user to export the code, such as entering the "file" option in main menu 101, clicking the secondary menu item "save as", and choosing to save the responsive layout shown in canvas 104 as code. In response to the operation, the electronic device may convert the responsive layout into code describing the responsive layout and save that code. The present application does not limit the type of the code. The electronic device converts the atomic capabilities used by the responsive layout (the parameters defining the layout adaptation rules) into code according to the writing rules of the chosen code type (different code types have different rules).
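As a rough illustration of such an export step, the sketch below serializes each element's assigned atomic capability and parameters into a code fragment. The emitted format and the `set_capability` call are invented for illustration; a real exporter would follow the writing rules of whichever code type is chosen.

```python
# Illustrative sketch: turn (capability, parameters) records into a code
# fragment. The emitted "set_capability" API is hypothetical.

def export_layout(assignments):
    """assignments: {element_id: (capability_name, parameter_dict)}"""
    lines = []
    for element_id, (capability, params) in sorted(assignments.items()):
        args = ", ".join(f"{k}={v!r}" for k, v in sorted(params.items()))
        lines.append(f"{element_id}.set_capability('{capability}', {args})")
    return "\n".join(lines)

fragment = export_layout({
    "playControlMenu": ("hide", {"direction": "horizontal"}),
    "progressBar": ("stretch", {"direction": "horizontal", "margin": 96}),
})
print(fragment)
```

Because the adaptation rules are already captured as parameters, the export is a straightforward serialization rather than a code-generation problem.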
The following describes several exemplary atomic capabilities provided by embodiments of the present application. These typical atomic capabilities can be divided into two broad categories. One category may be referred to as adaptive change capabilities, including the zoom capability and the stretch capability: as the page size changes, the size and position of the interface elements themselves change to accommodate the page. The other category may be referred to as adaptive layout capabilities, including the hiding capability, folding capability, equipartition capability, proportion capability, and extension capability: as the page size changes, the internal layout of the interface elements changes to accommodate the page.
(1) Zoom capability: the proportions of the interface element (e.g., its width-to-height ratio) are kept constant and the interface element is scaled as a whole to accommodate size changes of its container.
The adjustable parameters of the zoom capability may include, but are not limited to:
1. available space: horizontal/vertical percentage;
2. zoom limit: maximum/minimum.
Wherein the "available space" determines the scaled width or height of the interface element by means of a relative reference. For example, the horizontal/vertical percentage may be used to determine the maximum space (maximum width/maximum height) of the interface element after scaling, with the container of the interface element as a reference. "zoom limit" may be used to avoid that the zoomed interface element affects the user experience because it is too large or too small. The "available space" and "zoom limit" parameters may affect the size of the interface element.
Fig. 10A simply illustrates the zoom capability. The width-to-height ratio of the interface element S in fig. 10A is 1:1. As the container size (the screen size in FIG. 10A) changes, the interface element S is scaled as a whole while maintaining the 1:1 ratio, and the text within it is scaled with it. For example, from (A) to (C) in fig. 10A, the interface element S as a whole (including the text in it) is enlarged while maintaining the 1:1 ratio; from (C) to (A), it is reduced while maintaining the 1:1 ratio.
The zoom capability may be implemented as follows: first, the overall size of the scaled interface element is calculated from the "available space" parameter. Then, whether the scaled size (width or height) of the interface element exceeds the allowable range is judged according to the "zoom limit" parameter, and if it does, scaling stops.
The scaling process can also recurse into the internal layout of the interface element: all internal interface elements are traversed, and different treatment is applied according to their design attributes:
1. vector graphic: scaled about its center according to the scaling percentage;
2. text: the font size is scaled according to the scaling percentage;
3. border: the border thickness is scaled according to the scaling percentage;
4. shadow: the shadow range is scaled according to the scaling percentage;
5. blur: the blur radius is scaled according to the scaling percentage.
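The scaling steps above can be sketched as follows. This is a minimal sketch under assumed parameter names (a vertical-percentage "available space" and a min/max "zoom limit"), not the actual implementation described by the patent.

```python
# Minimal sketch of the zoom capability: size from "available space",
# clamped by "zoom limit", then the same scale factor applied recursively
# to internal design attributes. Names are assumptions for illustration.

def zoom(container_height, base_height, available_percent,
         min_height=0.0, max_height=float("inf")):
    # 1. overall size from the "available space" (vertical percentage)
    height = container_height * available_percent
    # 2. clamp by the "zoom limit"
    height = max(min_height, min(height, max_height))
    return height, height / base_height   # new height and scale factor

def zoom_internal(attributes, scale):
    # Traverse internal attributes (font size, border thickness, shadow
    # range, blur radius, ...) and scale each by the same percentage.
    return {name: value * scale for name, value in attributes.items()}

height, scale = zoom(container_height=3000, base_height=1000,
                     available_percent=0.5)   # height = 1500.0, scale = 1.5
inner = zoom_internal({"font_size": 12, "blur_radius": 8}, scale)
```

Because width is derived from height via the fixed aspect ratio, clamping the height is enough to keep the whole element within the "zoom limit".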
FIG. 10B illustrates an example of an interface element using the zoom capability. The dial 10A02 uses the zoom capability; the dial 10A02 is located in the container 10A01 of the entire alarm clock page. The relevant parameters of the zoom capability are as follows: the "available space" is a vertical percentage with a value of 1/3 of the container 10A01; the "zoom limit" is a minimum height of 100px.
In (A), the dial 10A02 has a width-to-height ratio of 1:1, and its width and height are both 933px. When the screen becomes smaller, e.g., the screen height is reduced from 3120px in (A) to 2560px in (B), the dial 10A02 is reduced while keeping the 1:1 width-to-height ratio; the width and height of the reduced dial 10A02 are both 746px, which satisfies the "available space" and "zoom limit" parameters. At the same time, the interface elements in the dial 10A02 are also scaled; for example, the numbers in the dial are scaled from 12sp to 10sp, and the thickness of the minute hand from 20px to 16px.
The zoom capability is better suited to interface elements whose overall proportions must not change, such as images or graphics, because it does not deform the graphic or image.
(2) Stretch capability: the interface element may be stretched in the horizontal or vertical direction to accommodate size changes of its container.
Adjustable parameters of the stretch capability may include, but are not limited to:
1. stretch direction: horizontal/vertical;
2. fixed margins: top/bottom/left/right;
3. stretch limit: maximum/minimum.
Where the "stretch direction" specifies the direction of stretching; for example, horizontal stretching changes the width and vertical stretching changes the height. The "fixed margins" define the margins between the stretched interface element and its container, fixing the position of the stretched interface element relative to its container. The "stretch limit" may be used to prevent the interface element from being over-stretched in a given direction and affecting the user experience. The "fixed margins" and "stretch limit" parameters may affect the size of the interface element, and the "stretch direction" and "fixed margins" may also affect its layout.
Fig. 11A simply illustrates the stretch capability. When the container size (the screen size in fig. 11A) changes, the interface element S is stretched in the horizontal direction. For example, from (A) to (C) in fig. 11A, the interface element S is stretched horizontally, its width increases, and the left and right margins d between the interface element S and the container remain unchanged; from (C) to (A), the interface element S is stretched horizontally, its width decreases, and the left and right margins d remain unchanged.
The stretch capability can be implemented as follows: first, the length of the stretched interface element along the stretch direction is calculated from the "stretch direction" and "fixed margins" parameters. For example, if the stretch direction is horizontal, the stretched width can be calculated from these two parameters; if vertical, the stretched height. Then, whether the length of the stretched interface element along the stretch direction exceeds the allowable range is judged according to the "stretch limit" parameter, and if it does, stretching stops.
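The computation just described can be sketched minimally as follows, for the horizontal direction and with assumed parameter names:

```python
# Sketch of the stretch capability for the horizontal direction: the
# stretched width is the container width minus the fixed left/right
# margins, clamped by the "stretch limit". Vertical stretching would use
# the container height and top/bottom margins instead.

def stretch_width(container_width, left_margin, right_margin,
                  min_width=0, max_width=float("inf")):
    width = container_width - left_margin - right_margin
    return max(min_width, min(width, max_width))

# Example loosely following Fig. 11B: fixed 96px left and right margins.
width = stretch_width(container_width=1080, left_margin=96, right_margin=96)
# width == 888
```

Keeping the margins fixed is what makes the element's position follow its container while only its length changes.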
FIG. 11B illustrates an example of an interface element using the stretch capability. The list 12A03 uses the stretch capability. The relevant parameters of the stretch capability are as follows: the "stretch direction" is horizontal; the "fixed margins" are the left and right margins, each 96px. When the screen becomes larger, as the screen width changes from (A) to (B), the list 12A03 stretches horizontally to fit the wider screen while keeping its left and right margins constant at 96px.
The stretch capability is better suited to interface elements, such as lists, whose horizontal or vertical dimension can change without impacting the user experience.
(3) Hiding capability: a display priority is defined for a set of interface elements to accommodate size changes of their container.
Adjustable parameters of the hiding capability may include, but are not limited to:
1. hiding direction: horizontal/vertical;
2. hiding order: defined priority.
The "hiding direction" may specify a hiding direction, for example, hiding the internal interface element in the horizontal direction may decrease the width, and hiding the internal interface element in the vertical direction may decrease the height. The "hiding order" may be used to specify a display priority for internal interface elements, with higher display priorities being hidden later and lower display priorities being hidden first. The "hiding order" may affect the number of interface elements and the "hiding direction" may affect the layout of interface elements.
Fig. 12A simply illustrates the hiding capability. The numbers in the figure represent the display priorities (hiding order) of different interface elements; the smaller the number, the higher the display priority and the later the element is hidden. When the screen width becomes smaller, as from (C) to (A), the interface elements with low display priority are hidden (first interface element "3", then interface element "2") to fit the increasingly narrow screen. When the screen width becomes larger, as from (A) to (C), the hidden interface elements are displayed again in turn to fit the wider screen.
The hiding capability can be implemented as follows: first, the number of elements that the available space along the selected hiding direction can accommodate is determined according to the "hiding direction" parameter; then, which interface elements are displayed and which are hidden is determined according to the "hiding order" parameter.
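The two steps above can be sketched as follows. Element sizes, priorities, and function names are assumptions for illustration; as in FIG. 12A, a smaller priority number means a higher display priority.

```python
# Sketch of the hiding capability: fill the available space in display-
# priority order (smaller number = higher priority = hidden later), then
# report visibility in the original layout order.

def visible_elements(elements, available_space):
    """elements: list of (element_id, size, priority) tuples, where size is
    measured along the hiding direction; returns ids that stay displayed."""
    shown, used = set(), 0
    for element_id, size, _priority in sorted(elements, key=lambda e: e[2]):
        if used + size <= available_space:
            shown.add(element_id)
            used += size
    return [eid for eid, _, _ in elements if eid in shown]

# Usage mirroring the play control menu of Fig. 12C: play has the highest
# display priority, previous/next come next, favorite/comment the lowest.
menu = [("favorite", 100, 3), ("previous", 100, 2), ("play", 100, 1),
        ("next", 100, 2), ("comment", 100, 3)]
# On a narrow screen only three buttons fit:
# visible_elements(menu, 300) -> ["previous", "play", "next"]
```

Returning the result in the original layout order keeps the surviving elements in their designed positions rather than reordered by priority.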
FIG. 12B illustrates one example of interface elements using the hiding capability. The set of interface elements dial 14A01, alarm list 14A02, and bottom menu 14A03 within container 14A00 uses the hiding capability. The relevant parameters of the hiding capability are as follows: the "hiding direction" is vertical; the "hiding order" is: the dial 14A01 is hidden first, the bottom menu 14A03 next, and the alarm list 14A02 last. When the screen becomes smaller, as the screen height changes from (A) to (B), the dial 14A01 is hidden.
FIG. 12C illustrates another example of interface elements using the hiding capability. (A) is a standard player page. The set of interface elements cover 15A1, song title 15A2, and play control menu 15A3 within container 15A0 uses the hiding capability. The relevant parameters of the hiding capability are as follows: the "hiding direction" is vertical; the "hiding order" is: the cover 15A1 is hidden first, the song title 15A2 next, and the play control menu 15A3 last. Meanwhile, the play control menu 15A3 also uses the hiding capability; the interface elements in the play control menu 15A3 include: favorite 15A31, previous 15A32, play 15A33, next 15A34, and comment 15A35. The relevant parameters of the hiding capability of the play control menu 15A3 are as follows: the "hiding direction" is horizontal; the "hiding order" is: the favorite 15A31 and comment 15A35 are hidden first, the previous 15A32 and next 15A34 next, and the play 15A33 last.
When the screen size becomes smaller, as the screen width and height decrease from (A) to (C), the cover 15A1 is hidden first in the vertical direction and the song title 15A2 next; in the horizontal direction, the favorite 15A31 and comment 15A35 in the play control menu 15A3 are hidden first, and the previous 15A32 and next 15A34 next. When the screen size becomes larger, as the screen width and height increase from (C) to (A), the song title 15A2 and the cover 15A1 are displayed again in turn in the vertical direction; in the horizontal direction, the previous 15A32 and next 15A34, and then the favorite 15A31 and comment 15A35, are displayed again in display-priority order.
The hiding capability is suitable for container-type interface elements whose available space in the horizontal or vertical direction is limited, such as a tab bar, a menu, or a button bar, so that the internal interface elements the user cares about most can be displayed preferentially.
(4) Folding capability: the layout of a set of interface elements is adjusted according to the available space of their container, changing a single-row arrangement into a double-row or multi-row arrangement.
Adjustable parameters of the folding capability may include, but are not limited to:
1. folding direction: forward/reverse;
2. alignment: left/center/right alignment in the horizontal direction, top/center/bottom alignment in the vertical direction;
3. element spacing: horizontal spacing/vertical spacing;
4. reference size: the size of each element.
The "folding direction" specifies the direction of folding and may affect the internal layout. The "alignment" specifies how the rows are aligned after folding, which also affects the internal layout. The "element spacing" specifies the distance between internal interface elements. The "reference size" specifies the size of each internal interface element. The first three parameters may affect the layout of the interface elements, and the "reference size" may affect their size.
Fig. 13A simply illustrates the fold capability. When the width of the screen size becomes smaller, as from (C) to (a), the 4 interface elements change from a single row arrangement to a double row arrangement to fit the increasingly narrower screen. When the width of the screen size becomes larger, such as from (A) to (C), the double-row arrangement is changed into the single-row arrangement again to adapt to the increasingly wider screen.
The folding capability can be implemented as follows: first, the number of elements each row can accommodate in the available space is determined according to the "folding direction" and "reference size" parameters, which determines which interface elements are displayed in each row; then, the position of the interface elements within each row is determined according to the "alignment" and "element spacing" parameters.
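For equally sized elements, the per-row computation above can be sketched like this. The "reference size" is taken as a uniform element width; function and parameter names are illustrative.

```python
# Sketch of the folding capability: compute how many elements fit per row
# from the container width, the per-element "reference size", and the
# element spacing, then wrap the single-row sequence into that many rows.

def fold_rows(element_ids, element_width, spacing, container_width):
    # n elements need n*w + (n-1)*s <= W, hence n <= (W + s) // (w + s)
    per_row = max(1, int((container_width + spacing)
                         // (element_width + spacing)))
    return [element_ids[i:i + per_row]
            for i in range(0, len(element_ids), per_row)]

# A 250px-wide container fits two 100px elements with 20px spacing per row,
# so four elements fold from one row into two rows:
# fold_rows(["A", "B", "C", "D"], 100, 20, 250) -> [["A", "B"], ["C", "D"]]
```

Widening the container back to where all elements fit makes `fold_rows` return a single row again, matching the reversible behavior shown in FIG. 13A.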
FIG. 13B illustrates one example of interface elements using the folding capability. The set consisting of the dial 17A1 and the container 17A2 within container 17A0 uses the folding capability, where the container 17A2 holds an alarm list and a bottom menu. When changing from landscape to portrait, as from (B) to (A), the single row of dial 17A1 and container 17A2 folds into a double-row display. When changing from portrait back to landscape, as from (A) to (B), the double-row arrangement changes back into a single row.
FIG. 13C illustrates another example of an interface element using the folding capability. The container 18A0 uses the folding capability, with 3 cards inside the container 18A0. When the screen width changes, the cards wrap onto new rows according to how many the screen width can accommodate.
(5) Equipartition capability: consistent spacing between a set of interface elements is maintained to accommodate size changes of their container.
Adjustable parameters of the equipartition capability may include, but are not limited to:
1. equipartition direction: horizontal/vertical;
2. margin definition: include/exclude edges;
3. spacing limit: maximum/minimum.
Where the "equipartition direction" specifies in which direction the spacing between internal interface elements is kept consistent. The "margin definition" specifies whether the margins between the outermost interface elements and the container edges are included in the equal spacing; it is generally an optional parameter. The "spacing limit" may be used to avoid internal interface elements being too close together or too far apart, which would affect the user experience. These parameters may affect the layout of the interface elements.
Fig. 14A simply illustrates the equipartition capability. No matter how the screen width changes, the spacing between the internal interface elements remains consistent: they are arranged at equal intervals in the horizontal direction. The actual spacing, however, depends on the screen width.
The equipartition capability can be implemented as follows: first, the total distributable spacing is determined according to the "equipartition direction" and "margin definition" parameters, and the spacing between elements is calculated from it; then, whether the spacing between elements exceeds a limit is judged according to the "spacing limit" parameter, and if it does, the limit value is used as the element spacing.
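A minimal sketch of that calculation, under assumed parameter names, might look like:

```python
# Sketch of the equipartition capability: distribute the leftover container
# length as equal gaps. When the "margin definition" includes edges, the
# two edge margins count as gaps too; the result is clamped by the
# "spacing limit".

def equal_spacing(container_length, element_lengths, include_edges=False,
                  min_spacing=0.0, max_spacing=float("inf")):
    free = container_length - sum(element_lengths)
    gaps = (len(element_lengths) + 1 if include_edges
            else len(element_lengths) - 1)
    spacing = free / gaps
    return max(min_spacing, min(spacing, max_spacing))

# Four 100px elements in a 520px container, edges not included: 120px of
# free space over 3 gaps gives 40px between neighbours.
# equal_spacing(520, [100, 100, 100, 100]) -> 40.0
```

This mirrors FIG. 14B: the elements keep equal intervals as the container resizes, only the interval value changes.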
FIG. 14B illustrates one example of interface elements using the equipartition capability. The set of interface elements alarm clock 20A1, world clock 20A2, stopwatch 20A3 and timer 20A4 in container 20A0 uses the equipartition capability. As the screen width becomes smaller, as from (B) to (A), the interface elements within the container 20A0 remain arranged at equal intervals but the intervals become smaller. As the screen width becomes larger, as from (A) to (B), the interface elements remain arranged at equal intervals but the intervals become larger.
(6) Proportion capability: the proportion of space each interface element in a set occupies within its container is specified.
Adjustable parameters of the proportion capability may include, but are not limited to:
1. proportion direction: horizontal/vertical;
2. proportion definition: percentage of each element.
Wherein the "proportion direction" specifies in which direction the proportions of the interface elements within the container apply. The "proportion definition" specifies the proportion of each interface element. The "proportion direction" parameter may affect the layout of the interface elements, and the "proportion definition" parameter may affect their size.
Fig. 15A simply illustrates the proportion capability. In the horizontal direction, the container space is always distributed in a 1:2 ratio: interface element A occupies 1/3 of the container and interface element B occupies 2/3, and these proportions are unaffected by changes in screen width.
The proportion capability can be implemented as follows: the allocation of the container's available space along the proportion direction is calculated from the "proportion direction" and "proportion definition" parameters.
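That allocation can be sketched in a few lines; the function and parameter names are illustrative.

```python
# Sketch of the proportion capability: split the container's length along
# the proportion direction according to each element's defined percentage.

def proportion_lengths(container_length, percentages):
    """percentages: {element_id: percentage}; percentages sum to 100."""
    return {element_id: container_length * pct / 100
            for element_id, pct in percentages.items()}

# Fig. 15B's 50/50 split: dial and alarm clock each take half the width,
# whatever the screen width is.
# proportion_lengths(1080, {"dial": 50, "alarm": 50})
#   -> {"dial": 540.0, "alarm": 540.0}
```

Because the split is purely relative, resizing the container rescales every element while leaving the ratios intact.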
FIG. 15B illustrates one example of interface element usage capture capability. The set of interface elements dial 22A1, alarm clock 22A2 within container 22A0 use the horological capabilities. The relevant parameters of the fractional capability are as follows: the "occupation ratio direction" is the horizontal direction; the "ratio definition" is: the dial 22A1 and alarm clock 22A2 each account for 50% of the space of the container. When the width of the screen changes, the occupation ratios of the dial 22A1 and the alarm clock 22A2 in the container 22A0 are all kept unchanged at 50%.
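The ratio allocation can be sketched as a fixed-percentage split of the container along the ratio direction. This is an illustrative assumption of how the "ratio definition" might be applied, not the patent's actual implementation.

```python
def ratio_sizes(container_size, percentages):
    """Split the container along the ratio direction by fixed percentages.

    percentages: one entry per interface element, summing to 100 (the
                 "ratio definition"); each element's size is that share
                 of the container, regardless of the container's size.
    """
    assert abs(sum(percentages) - 100) < 1e-6, "percentages should sum to 100"
    return [container_size * p / 100 for p in percentages]
```

As in FIG. 15B, a 50/50 definition keeps both elements at half the container at any screen width; a 25/75 definition reproduces the 1:3 style of split.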
(7) Extension capability: the number of interface elements displayed inside a container holding a set of interface elements varies with the container's space.
Adjustable parameters of the extension capability may include, but are not limited to:
1. Extension direction: horizontal/vertical;
2. Spacing definition: a specific numerical value;
3. Exposure feature: yes/no;
4. Spacing limit: a minimum value.
The "extension direction" parameter specifies along which direction interface elements are added within the container. The "spacing definition" parameter specifies the distance between internal interface elements. The "exposure feature" parameter specifies whether an interface element is allowed to be partially exposed (displayed). The "spacing limit" parameter prevents internal interface elements from being so close together or so far apart that the user experience suffers. These parameters may affect the internal layout of the container; among them, the "spacing definition", "exposure feature", and "spacing limit" parameters also affect the number of interface elements displayed within the container.
Fig. 16A illustrates the extension capability in simplified form. In the horizontal direction, as the screen width increases, more internal interface elements are exposed.
The extension capability can be implemented as follows: first, the number of elements that can be accommodated along the extension direction is determined from the "spacing definition", "exposure feature", and "spacing limit" parameters. When the container can accommodate more elements as the screen grows, the first element is duplicated to fill the extra space; when the space shrinks, the duplicated elements are removed first, and when the space shrinks further, the existing (non-duplicated) interface elements are hidden.
FIG. 16B illustrates an example of interface elements using the extension capability. Container 24A0 holds multiple chart-style interface elements that use the extension capability. As the screen width increases, as from (a) to (b), container 24A0 can hold more elements, and the existing interface elements are duplicated to fill the increased available space. When the screen width decreases, as from (b) to (a), the previously duplicated interface elements are removed first.
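The count rule for the extension capability can be sketched as below: how many elements fit along the extension direction given a uniform element size, the "spacing definition", and the "exposure feature". All names, and the assumption of uniform element size, are illustrative.

```python
def extension_count(container_size, element_size, spacing, allow_exposure):
    """Number of elements displayable along the extension direction."""
    if container_size < element_size:
        return 0
    # The first element costs element_size; each further one costs
    # spacing + element_size.
    full = 1 + int((container_size - element_size) // (spacing + element_size))
    if allow_exposure:
        remainder = container_size - full * element_size - (full - 1) * spacing
        # With the exposure feature on, one extra, partially visible element
        # is allowed if any space remains beyond the spacing before it.
        if remainder > spacing:
            full += 1
    return full
```

Under the duplicate/remove rule described above, when this count grows the first element would be copied to fill the new slots, and when it shrinks the copies would be removed before any original element is hidden.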
The method for generating a responsive page provided by the embodiments of the present application is not limited to the typical atomic capabilities and associated parameters described above; other atomic capabilities may also be used, with other related parameters defined for them.
In the embodiments of the present application, different atomic capabilities may be used in combination; that is, one interface element may use multiple atomic capabilities. Interface elements may also be nested: a large container has atomic capabilities while a smaller container inside it also uses atomic capabilities, so that the elements respond in concert when the screen size changes. Combining and nesting atomic capabilities brings more flexibility to responsive page design, covers more screen sizes, and suits more application scenarios, such as landscape/portrait switching, split-screen, and folding-screen scenarios.
However, combining and nesting atomic capabilities also raises problems of conflicts and repeated computation. For example, in fig. 18A the interface elements "previous", "play", and "next" are assigned both the zoom capability and the hide capability. When the screen size changes, which interface elements can be displayed in the available horizontal space of their container 25A01 depends not only on the size of each element after zooming (i.e., on the parameters of the zoom capability) but also on the "hide order" parameter of the hide capability used by container 25A01.
To avoid conflicts and repeated computation when multiple atomic capabilities are combined or nested, the responsive layout calculation engine of fig. 2 may first analyze the design draft and the atomic capabilities assigned to each interface element as a whole before acting on the interface elements. As shown in fig. 17, the main process may include:
Step 3: after obtaining the target design size, the responsive layout calculation engine performs layout calculation at the target design size according to the parameters classified in step 2. The target design size may be entered by the UI designer through the user interface shown in fig. 3A. The layout calculation proceeds in three passes: first size calculation, then quantity calculation, and finally position calculation. Specifically, the responsive layout calculation engine may first determine the size of the interface elements to be displayed by jointly considering the parameters that affect element size. Then, with the element sizes fixed, it jointly considers the parameters that affect the number of interface elements and determines how many elements can be accommodated. Finally, with the size and number fixed, it determines the layout according to the parameters that affect element layout.
Step 4: finally, the determined size, number, and layout of the interface elements at the target design size are applied to the corresponding interface elements, producing a page layout adapted to the target design size.
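The three-pass calculation (size, then quantity, then position) can be sketched as follows. The parameter grouping and all names here are assumptions made for illustration; the passes simply stand in for "parameters affecting size", "parameters affecting number", and "parameters affecting layout".

```python
def compute_layout(target_width, size_params, count_params, layout_params):
    """Three-pass layout: size first, then count, then positions."""
    # Pass 1: size. E.g. scale elements toward the target width, clamped
    # by the zoom capability's scale limits.
    scale = max(size_params["min_scale"],
                min(size_params["max_scale"],
                    target_width / size_params["reference_width"]))
    element_w = size_params["base_width"] * scale

    # Pass 2: quantity. With sizes fixed, decide how many elements fit
    # (elements beyond this count would be hidden in hide order).
    count = min(count_params["total"], int(target_width // element_w))

    # Pass 3: layout. With size and count fixed, distribute the remaining
    # space, e.g. equal-share spacing clamped by a spacing limit.
    spacing = 0.0
    if count > 1:
        spacing = (target_width - count * element_w) / (count - 1)
        spacing = min(spacing, layout_params["max_spacing"])
    return {"width": element_w, "count": count, "spacing": spacing}
```

Ordering the passes this way is what avoids the conflicts described above: the hide decision sees the post-zoom sizes, and the spacing calculation sees the final size and count.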
FIGS. 18A-18B illustrate two examples of combining and nesting atomic capabilities.
As shown in fig. 18A, the interface elements "previous", "play", and "next" are assigned the zoom capability, and these interface elements are also assigned the hide capability and the equal-share capability; that is, the zoom, equal-share, and hide capabilities are used simultaneously in the example of fig. 18A. When the screen size changes, the engine may first calculate the element sizes from the available space and the "zoom limit" parameter of the zoom capability, then determine which interface elements to display from the calculated sizes and the "hide order" parameter of the hide capability, and finally determine the layout from the parameters of the equal-share capability together with the layout-affecting parameters of the other two capabilities.
As shown in fig. 18B, the interface elements in container 28A0 combine the folding capability and the equal-share capability. When the screen size changes, the number of elements displayable per row may first be determined from the parameters that affect element count, such as the "folding direction" and "reference size" of the folding capability, and the layout of the interface elements may then be determined from the parameters that affect element layout, such as the parameters of the equal-share capability and the "alignment mode" and "element spacing" of the folding capability.
It can be seen that the method for generating a responsive page provided by the embodiments of the present application supports the combined use of atomic capabilities and converts an ordinary design draft into a responsive layout, so that a UI designer can efficiently design responsive pages adaptable to a variety of screen resolutions.
Thus, after completing a design page of a standard size, the UI designer can: A. select a set of interface elements (also referred to as function modules) on the page and assign the selected interface elements one or more atomic capabilities; B. enter or adjust the relevant parameters of the atomic capabilities (such as element spacing, reference size, etc.) to generate a responsive page. Different atomic capabilities can be combined and nested, bringing more flexibility to responsive page design, covering more screen sizes, and suiting more application scenarios.
In addition, after the UI designer finalizes the adaptation of a responsive page to different screen sizes, the parameter code fragments of the atomic capabilities used by the page, or by part of the page, can be exported and delivered to developers, further reducing the programming workload.
To further improve UI development efficiency, after a design draft is imported, the electronic device can classify it into a layout type according to its typesetting style, make an intelligent judgment, match appropriate atomic capabilities, and quickly generate a responsive page. Specifically, the electronic device may input the design draft into a trained neural network and obtain as output the atomic capabilities matching each interface element. The neural network may be trained on a plurality of training samples, each comprising an input data portion and an output data portion, where the input data portion is a design draft and the output data portion is the atomic capabilities used by the page of that design draft. The atomic capabilities represented by the output data portion make the page a responsive page that can adapt to a number of different screen sizes.
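The input/output shape of such a matcher can be sketched as below. The patent describes a trained neural network; here a trivial nearest-neighbour lookup over hand-crafted layout features stands in, purely to show the shape of the mapping from typesetting style to atomic capability. The feature vector, the sample set, and all names are invented assumptions.

```python
# Each sample pairs a layout-style feature vector with the capability its
# page used: [element_count, uniform_sizes (0/1), single_row (0/1)].
TRAINING_SAMPLES = [
    ([4, 1, 1], "equal-share"),  # several identical elements in one row
    ([2, 0, 1], "ratio"),        # two differently sized elements in one row
    ([8, 1, 0], "folding"),      # many identical elements over several rows
]

def match_capability(features):
    """Return the atomic capability of the closest known layout style."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(TRAINING_SAMPLES, key=lambda s: dist(s[0], features))[1]
```

A real system would replace the lookup with the trained network and emit per-element capabilities plus their parameters, but the contract is the same: design-draft features in, matched atomic capabilities out.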
The following describes related devices involved in the embodiments of the present application. Fig. 19 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present application. The electronic device 100 may be a personal computer, a tablet computer, a folding-screen mobile phone, or the like.
As shown in fig. 19, the electronic device 100 may include: a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, and the like. Wherein:
The controller may be, among other things, a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
It should be understood that the interface connection relationship between the modules illustrated in the embodiments of the present application is only an illustration, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is used to receive charging input from a charger. The charger may be a wireless charger or a wired charger. The power management module 141 is used to connect the battery 142, the charging management module 140, and the processor 110. The power management module 141 receives input from the battery 142 and/or the charging management module 140 and supplies power to the processor 110, the internal memory 121, the external memory 120, the display screen 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, and battery state of health (leakage, impedance).
The antennas are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single communication band or multiple communication bands. Different antennas may also be multiplexed to improve antenna utilization.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including wireless local area networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive signals to be transmitted from the processor 110, frequency-modulate and amplify them, and convert them into electromagnetic waves radiated via the antenna 2. Illustratively, the wireless communication module 160 may include a Bluetooth module, a Wi-Fi module, and the like.
In some embodiments, the antenna of the electronic device 100 and the wireless communication module 160 are coupled such that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals; it can process digital image signals as well as other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: design draft typesetting identification, text understanding and the like.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also called a "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal into the microphone 170C by speaking with the mouth close to the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to achieve a noise reduction function in addition to collecting sound signals. In still other embodiments, the electronic device 100 may include three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on.
The earphone interface 170D is used to connect wired earphones. The earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The buttons 190 include a power button, volume buttons, and the like. The buttons 190 may be mechanical buttons or touch buttons. The electronic device 100 may receive button input and generate button signal input related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration prompt. The motor 191 may be used for incoming-call vibration prompts as well as touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. Touch operations applied to different areas of the display screen 194 may also correspond to different vibration feedback effects of the motor 191. Different application scenarios (e.g., time reminders, receiving information, alarm clocks, games, etc.) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
When the electronic device 100 is implemented as a mobile communication device such as a mobile phone, the wireless communication module 160 may also include a mobile communication module. The mobile communication module may provide solutions for wireless communication, including 2G/3G/4G/5G, applied to the electronic device 100. The mobile communication module may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like. The mobile communication module can receive electromagnetic waves via the antenna, perform filtering, amplification, and other processing on the received electromagnetic waves, and transmit them to the modem processor for demodulation. The mobile communication module can also amplify signals modulated by the modem processor and convert them into electromagnetic waves radiated via the antenna. In some embodiments, at least part of the functional modules of the mobile communication module may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used to modulate a low-frequency baseband signal to be transmitted into a medium- or high-frequency signal. The demodulator is used to demodulate a received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low-frequency baseband signal to the baseband processor for processing. After being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor. The application processor outputs a sound signal through an audio output device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be independent of the processor 110 and disposed in the same device as the mobile communication module or other functional modules.
When the electronic device 100 is implemented as a mobile communication device such as a mobile phone, the electronic device 100 may further include a SIM card interface, which may be used to connect a SIM card. The SIM card can be brought into and out of contact with the electronic device 100 by being inserted into or pulled out of the SIM card interface. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface can support a Nano-SIM card, a Micro-SIM card, a SIM card, and the like. Multiple cards can be inserted into the same SIM card interface at the same time; the types of the cards may be the same or different. The SIM card interface may also be compatible with different types of SIM cards and with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, namely an embedded SIM card; the eSIM card can be embedded in the electronic device 100 and cannot be separated from it.
When the electronic device 100 is implemented as a portable device such as a mobile phone, a tablet computer, a notebook computer, etc., the electronic device 100 may further include the sensor module 180. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, and the like. Wherein,
the pressure sensor 180A is used for sensing a pressure signal, and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. When a touch operation is applied to the display screen 194, the electronic apparatus 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic apparatus 100 may also calculate the touched position from the detection signal of the pressure sensor 180A.
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by gyroscope sensor 180B.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 180C.
The magnetic sensor 180D includes a hall sensor. The electronic device 100 may detect the opening and closing of the flip holster using the magnetic sensor 180D.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to recognize the attitude of the electronic device, and is applied in landscape/portrait switching, pedometers, and other applications.
The distance sensor 180F is used for measuring distance. The electronic device 100 may measure distance by infrared or laser. In some embodiments, in a shooting scenario, the electronic device 100 may use the distance sensor 180F to measure distance for fast focusing.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light to the outside through the light emitting diode. The electronic device 100 detects infrared reflected light from a nearby object using a photodiode, and can determine whether an object is near the electronic device 100. The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and so on. The temperature sensor 180J is used to detect temperature.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194.
The above is a detailed description of the embodiments of the present application taking the electronic device 100 as an example. It should be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. Electronic device 100 may have more or fewer components than shown, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The electronic device 100 exemplarily shown in fig. 19 may display the various user interfaces described in the above embodiments through the display screen 194. The electronic apparatus 100 may detect a user operation in each user interface through an input device such as a keyboard. In some embodiments, the electronic device 100 may detect non-touch gesture operations through the camera 193 (e.g., 3D camera, depth camera).
The above embodiments are intended only to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced, and such modifications or substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.
As used in the above embodiments, the term "when …" may be interpreted, depending on the context, to mean "if …", "after …", "in response to determining …", or "in response to detecting …". Similarly, depending on the context, the phrase "upon determining …" or "if (a stated condition or event) is detected" may be interpreted to mean "if it is determined …", "in response to determining …", "upon detecting (the stated condition or event)", or "in response to detecting (the stated condition or event)".
It is to be understood that the terminology used in the above embodiments of the application is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification of the present application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the listed items.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another by wire (e.g., coaxial cable, optical fiber, digital subscriber line) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible by a computer, or a data storage device, such as a server or data center, integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid-state drive), among others.
One of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by a computer program instructing the relevant hardware. The program may be stored in a computer-readable storage medium and, when executed, may include the processes of the above method embodiments. The aforementioned storage medium includes various media capable of storing program code, such as ROM, RAM, magnetic disks, or optical disks.
Claims (11)
1. A method of generating a responsive page, comprising:
the electronic device detects an operation of importing a design draft;
the electronic device displays the interface elements drawn in the design draft in a first page layout, wherein the first page layout is described by the design draft;
the electronic device detects an operation of selecting a first layout adaptation rule for a first interface element; the first layout adaptation rule is to adjust one or more of: the position, size and layout of the first interface element;
the electronic device assigns the first layout adaptation rule to the first interface element;
the electronic device detects an operation of selecting a second screen size, the second screen size being different from the first screen size;
the electronic device displays the interface elements drawn in the design draft in a second page layout, wherein the layout of the first interface element in the second page layout is determined according to the parameters describing the first layout adaptation rule; the second page layout is different from the first page layout.
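The flow of claim 1 can be illustrated with a minimal sketch: interface elements imported from a design draft carry an assigned adaptation rule, and the page is recomputed when a different screen size is selected. This is only an illustration under stated assumptions, not the patented implementation; the class, the `relayout` function, and the rule names `"scale"` and `"stretch"` are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class InterfaceElement:
    """An interface element drawn in the design draft (hypothetical model)."""
    name: str
    x: float
    y: float
    width: float
    height: float
    rule: str = "none"  # layout adaptation rule assigned by the user

def relayout(elements, old_width, new_width):
    """Recompute a page layout for a new screen width.

    Elements with the hypothetical "scale" rule keep their aspect ratio
    and scale in both dimensions; "stretch" elements grow only
    horizontally; all other elements keep their design-draft geometry.
    """
    factor = new_width / old_width
    result = []
    for e in elements:
        if e.rule == "scale":
            result.append(InterfaceElement(e.name, e.x * factor, e.y * factor,
                                           e.width * factor, e.height * factor, e.rule))
        elif e.rule == "stretch":
            result.append(InterfaceElement(e.name, e.x * factor, e.y,
                                           e.width * factor, e.height, e.rule))
        else:
            result.append(e)
    return result
```

A real tool would dispatch on the full rule set of claim 9; the point here is only that the second page layout is derived from the first layout plus the parameters of the assigned rule.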
2. The method of claim 1, wherein the first layout adaptation rule comprises one or more of: the first type of layout adaptation rules are used for adjusting the position and the size of the first interface element; the second type of layout adaptation rule is used for adjusting the layout of the first interface element.
3. The method of claim 1 or 2, wherein the first interface element comprises one or more interface elements.
4. A method according to any of claims 1-3, wherein the first layout adaptation rule comprises one or more layout adaptation rules.
5. The method of any one of claims 1-4, wherein after the electronic device detects the operation of importing the design draft, the method further comprises: the electronic device displays a plurality of layout adaptation rules, including the first layout adaptation rule.
6. The method of any one of claims 1-5, further comprising:
if the first interface element comprises a plurality of interface elements and the first layout adaptation rule comprises a plurality of layout adaptation rules, then:
the parameters describing the plurality of layout adaptation rules are divided into the following categories: parameters for determining the size of the interface elements, parameters for determining the number of the interface elements, and parameters for determining the layout of the interface elements;
with the size of the first interface elements and the number of interface elements displayable in the container of the first interface elements determined, the layout of the first interface elements is determined according to the parameters for determining the layout of the interface elements;
wherein the size of the first interface elements is determined according to the parameters for determining the size of the interface elements, and the number of the interface elements is determined according to the parameters for determining the number of the interface elements.
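The order of computation described in claim 6 (size first, then count, then layout) can be sketched as a small function. The parameter names and the left-to-right placement policy are assumptions for illustration, not taken from the patent.

```python
def layout_group(container_width, element_size, spacing):
    """Illustrates claim 6's three parameter categories (hypothetical names):
    first fix the element size, then derive how many elements fit in the
    container, then derive the layout (here: left-aligned x positions)."""
    size = element_size  # parameter category 1: determines element size
    # parameter category 2: determines how many elements are displayable
    count = max(1, int((container_width + spacing) // (size + spacing)))
    # parameter category 3: determines the layout of the elements
    positions = [i * (size + spacing) for i in range(count)]
    return size, count, positions
```

For a 100-unit-wide container with 30-unit elements spaced 5 units apart, three elements fit, at x positions 0, 35, and 70.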
7. The method of any one of claims 1-6, wherein after the electronic device detects the operation of importing the design draft, the method further comprises:
the electronic device displays a first list in which the interface elements drawn in the design draft are grouped by layer.
8. The method of any one of claims 1-7, further comprising:
the electronic device detects a user operation of exporting a code fragment;
the electronic device converts the first layout adaptation rule assigned to the first interface element and the parameters describing the first layout adaptation rule into a code fragment;
the electronic device outputs the code fragment.
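The export step of claim 8 — turning an assigned rule plus its describing parameters into a code fragment — could, for example, emit CSS. The function below is a sketch under that assumption; the rule names, parameter keys, and CSS mapping are all hypothetical, and the patent does not specify the target language of the fragment.

```python
def export_fragment(element_name, rule, params):
    """Render one assigned layout adaptation rule and its parameters
    as a CSS code fragment (illustrative mapping only)."""
    if rule == "stretch":
        # hypothetical parameter: the element's width as a % of its container
        return f".{element_name} {{ width: {params['percent']}%; }}"
    if rule == "hide":
        # hypothetical parameter: hide the element below this container width
        return (f"@media (max-width: {params['min_width']}px) "
                f"{{ .{element_name} {{ display: none; }} }}")
    raise ValueError(f"unsupported rule: {rule}")
```

Each (element, rule, parameters) triple thus becomes a self-contained fragment that the tool can concatenate and output.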
9. The method of any one of claims 1-8, wherein the first layout adaptation rule comprises one or more of the following layout adaptation rules: a scaling rule, a stretching rule, a hiding rule, a folding rule, an equal-spacing rule, a proportion rule, and an extension rule;
wherein the scaling rule is used to specify that the aspect ratio of an interface element remains unchanged and that the interface element is scaled proportionally to adapt to a size change of its container;
the stretching rule is used to specify that an interface element is stretched in a user-selected stretching direction to adapt to a size change of its container;
the hiding rule is used to specify a display priority for a group of interface elements to accommodate a size change of the container of the group of interface elements;
the folding rule is used to specify that a group of interface elements changes from a single-row, double-row, or multi-row arrangement to a double-row or multi-row arrangement according to the available space of the container of the group of interface elements;
the equal-spacing rule is used to specify that the spacing between a group of interface elements remains consistent;
the proportion rule is used to specify the space proportion occupied by a group of interface elements within their container;
the extension rule is used to specify that the number of interface elements displayable inside the container of a group of interface elements changes according to the change in the space of the container.
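As one concrete example of the rules listed in claim 9, the hiding rule assigns each element in a group a display priority and drops the lowest-priority elements when the container shrinks. The sketch below is an assumed greedy interpretation of "display priority"; the function name and data shape are hypothetical.

```python
def apply_hiding_rule(elements, container_width):
    """Hiding rule sketch: keep elements in descending display priority
    until the container space runs out.

    elements: list of (name, width, priority) tuples, higher priority
    meaning the element should be hidden last.
    """
    visible = []
    used = 0
    for name, width, priority in sorted(elements, key=lambda e: -e[2]):
        if used + width <= container_width:
            visible.append(name)
            used += width
    return visible
```

On a narrow screen the low-priority "share" button disappears first, while the high-priority "back" button survives, matching the intent of accommodating a size change of the container.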
10. An electronic device, comprising a display screen, a memory, a processor coupled to the memory, a plurality of applications, and one or more programs; wherein the processor, when executing the one or more programs, causes the electronic device to implement the method of any one of claims 1-9.
11. A computer-readable storage medium comprising instructions that, when executed on an electronic device, cause the electronic device to perform the method of any one of claims 1-9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010949363.5A CN114168128A (en) | 2020-09-10 | 2020-09-10 | Method for generating responsive page, graphical user interface and electronic equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114168128A true CN114168128A (en) | 2022-03-11 |
Family
ID=80475730
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010949363.5A Pending CN114168128A (en) | 2020-09-10 | 2020-09-10 | Method for generating responsive page, graphical user interface and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114168128A (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114610295A (en) * | 2022-03-22 | 2022-06-10 | 云粒智慧科技有限公司 | Layout method, device, equipment and medium for page container |
CN114861247A (en) * | 2022-07-06 | 2022-08-05 | 广东时谛智能科技有限公司 | Method, device, equipment and storage medium for generating shoe body model based on simple design |
CN114861247B (en) * | 2022-07-06 | 2022-12-30 | 广东时谛智能科技有限公司 | Method, device, equipment and storage medium for generating shoe body model based on simple design |
CN116501435A (en) * | 2023-06-28 | 2023-07-28 | 北京尽微致广信息技术有限公司 | Layout processing method and device and computer storage medium |
CN116501435B (en) * | 2023-06-28 | 2023-09-12 | 北京尽微致广信息技术有限公司 | Layout processing method and device and computer storage medium |
CN117667196A (en) * | 2024-02-01 | 2024-03-08 | 宁波沃尔斯软件有限公司 | UXUI efficient collaboration low-code method based on intermediate representation model |
CN117667196B (en) * | 2024-02-01 | 2024-04-16 | 宁波沃尔斯软件有限公司 | UXUI efficient collaboration low-code method based on intermediate representation model |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109299315B (en) | Multimedia resource classification method and device, computer equipment and storage medium | |
CN110111787B (en) | Semantic parsing method and server | |
CN110597512B (en) | Method for displaying user interface and electronic equipment | |
CN111738122B (en) | Image processing method and related device | |
CN111669459B (en) | Keyboard display method, electronic device and computer readable storage medium | |
CN110262788B (en) | Page configuration information determination method and device, computer equipment and storage medium | |
CN114168128A (en) | Method for generating responsive page, graphical user interface and electronic equipment | |
CN111190681A (en) | Display interface adaptation method, display interface adaptation design method and electronic equipment | |
WO2021013132A1 (en) | Input method and electronic device | |
CN112114912A (en) | User interface layout method and electronic equipment | |
CN110377204B (en) | Method for generating user head portrait and electronic equipment | |
CN110830645B (en) | Operation method, electronic equipment and computer storage medium | |
CN114816610B (en) | Page classification method, page classification device and terminal equipment | |
CN112269853A (en) | Search processing method, search processing device and storage medium | |
CN114444000A (en) | Page layout file generation method and device, electronic equipment and readable storage medium | |
WO2022095906A1 (en) | Key mapping method, electronic device, and system | |
CN115964231A (en) | Load model-based assessment method and device | |
WO2021196980A1 (en) | Multi-screen interaction method, electronic device, and computer-readable storage medium | |
WO2024067551A1 (en) | Interface display method and electronic device | |
CN115268731A (en) | Method for processing service card and electronic equipment | |
CN110728167A (en) | Text detection method and device and computer readable storage medium | |
CN111880661A (en) | Gesture recognition method and device | |
CN114201978A (en) | Method and related equipment for translating interface of application program | |
CN113343709B (en) | Method for training intention recognition model, method, device and equipment for intention recognition | |
CN115933952B (en) | Touch sampling rate adjusting method and related device |
Legal Events
Date | Code | Title | Description
---|---|---|---
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||