Disclosure of Invention
The present disclosure provides a loading method and system for a dynamic demonstration interface. The processor starts a computer program and reads the images required by the user interface according to interface layout parameters; the processor then performs interface assembly according to the relative position information of each image on the user interface and the size of its display interval in the interface layout parameters, so as to form a first user interface; performs an edge transition on the first user interface to obtain a second user interface; adapts the second user interface according to the size of the display screen; and performs interface rendering and draws the second user interface so that the second user interface is displayed on the display.
An object of the present disclosure is to provide a loading method and a system for a dynamic presentation interface. The loading method depends on a terminal device that includes a display, a memory, a processor, and a computer program stored in the memory and capable of running on the processor; the computer program includes the images to be loaded and interface layout parameters. The interface layout parameters comprise the relative position information of each required image on the user interface (namely, where each image sits on the user interface), the size of its display interval (namely, how much of the user interface each image occupies), and the switching-state transitions of the user interface, i.e., the parameters for switching from one user interface to another;
the loading method comprises the following steps:
S100: the processor starts the computer program and reads the images to be loaded on the user interface according to the interface layout parameters;
S200: the processor performs interface assembly according to the relative position information of each image on the user interface and the size of its display interval in the interface layout parameters, so as to form a first user interface;
S300: performing edge transition on the first user interface to obtain a second user interface;
S400: adapting the second user interface according to the size of the display screen;
S500: performing interface rendering and drawing the second user interface so as to display the second user interface on the display.
Further, in S200, the method by which the processor performs interface assembly, according to the relative position information of each image on the user interface and the size of its display interval in the interface layout parameters, to form the first user interface comprises the following steps:
S201: the processor reads the images to be loaded;
S202: the images to be loaded are loaded into the display-interval areas of the user interface according to the relative position information;
S203: a first user interface is obtained.
Further, in S300, the method for performing edge transition on the first user interface to obtain the second user interface includes the following steps:
Let I denote the first user interface, Φ the already-loaded image area in the first user interface, and Ω the blank area in the first user interface excluding the boundary; ∂Ω is the boundary of each image in the first user interface according to the interface layout parameters, p is the pixel point with the largest transition value on the boundary, Ψ_p is a rectangular domain centered on the point p, and n_p is the unit vector orthogonal to ∂Ω at the point p;
S301: mark the boundary ∂Ω according to the relative position information of each required image on the user interface and the size of its display interval in the interface layout parameters;
S302: calculate the transition value of every pixel point on the boundary: for any pixel point p on the boundary ∂Ω, determine a rectangular domain Ψ_p centered on the point p; the transition value P(p) of the point p is calculated as:
P(p) = C(p) × D(p);
wherein C(p) is a confidence, defined as:
C(p) = ( Σ_{q ∈ Ψ_p ∩ Φ} C(q) ) / |Ψ_p|,
where |Ψ_p| is the total number of pixel points in the rectangular domain;
the initial values of C(p) are as follows: C(p) = 0 for all pixels in Ω, and C(p) = 1 for all pixels in Φ;
D(p) is the data item of a pixel, defined as:
D(p) = |∇I_p⊥ · n_p| / α,
wherein α is a normalization factor (the maximum pixel value, e.g., 255 for 8-bit images); n_p is the unit vector orthogonal to the boundary ∂Ω at the point p; ∇I_p is the gradient vector at the point p, i.e., the direction in which the color change is greatest, and ∇I_p⊥ is its orthogonal vector (the isophote direction);
the data item of the pixel can carry out smooth transition along the gradient vector direction, because in the gradient vector direction, the data item of the pixel takes the maximum value, and the value of C (p) is the minimum near the point p as the center, so that the transition value is correspondingly small; so that noise of the first user interface is suppressed when smoothly transitioning along the gradient vector direction, so that texture of each image edge in the first user interface presents smooth continuity;
S303: construct a transition block Ψ_p centered on the pixel point with the largest transition value, and scan the already-loaded image area Φ in the first user interface for the matching module Ψ_q: find the module Ψ′_q most similar to Ψ_p, i.e., the module with the smallest Euclidean distance:
Ψ′_q = argmin_{Ψ_q ⊂ Φ} d(Ψ_p, Ψ_q),
wherein the Euclidean distance is d(Ψ_p, Ψ_q) = Σ_{i,j} ( Ψ_p(i, j) − Ψ_q(i, j) )², and i, j are the horizontal and vertical coordinate values of a pixel within the modules Ψ_p and Ψ_q;
S304: update the confidence: C(q′) = C(p) for every pixel q′ ∈ Ψ_p ∩ Ω;
S305: copy the corresponding pixel points of Ψ′_q into Ψ_p;
S306: iterate steps S302 to S305, stopping once every pixel point on the boundary has undergone the edge transition; performing the edge transition on the first user interface in this way yields the second user interface.
Further, in S400, the method for adapting the second user interface according to the size of the display screen is: scale the second user interface to the size (screen width × height) of the display screen of the terminal device.
Further, in S500, the method of performing the interface rendering and drawing the second user interface so as to display the second user interface on the display is: display and output the second user interface on the display of any one of a tablet computer, a mobile phone, a smart home appliance, a navigation locator, a food-ordering terminal, and a multimedia player.
The application also provides a loading system of the dynamic demonstration interface, which comprises: a memory, a processor, and a computer program stored in the memory and executable on the processor, the processor executing the computer program to run the following system units:
the image reading unit is used for starting a computer program to read the image required to be loaded by the user interface according to the interface layout parameters by the processor;
the interface assembly unit is used for carrying out interface assembly to form a first user interface by the processor according to the relative position information of each image on the user interface in the interface layout parameters and the size of the display interval;
the edge transition unit is used for carrying out edge transition on the first user interface to obtain a second user interface;
a size adapting unit for adapting the second user interface according to the size of the display screen;
and the display output unit is used for carrying out interface rendering and drawing the second user interface so as to display the second user interface in the display.
The beneficial effects of the present disclosure are: the disclosed loading method of a dynamic demonstration interface does not require accurately calculating and specifying the position offset of each picture resource in the UI interface, so the edges of the image components on the user interface join more naturally, effectively reducing the tearing problem at image edges in a dynamically spliced user interface; splicing the images with a transition eliminates unnatural artifacts at the seams, so the displayed images are complete, image distortion is small, adaptability is good, and the user experience is improved.
Detailed Description
The conception, specific structure, and technical effects of the present disclosure will be clearly and completely described below in connection with the embodiments and the drawings, so that the objects, aspects, and effects of the present disclosure may be fully understood. It should be noted that, provided there is no conflict, the embodiments of the present application and the features of the embodiments may be combined with each other.
A flowchart of a loading method of a dynamic presentation interface according to the present disclosure is shown in fig. 1, and a method according to an embodiment of the present disclosure is explained below in conjunction with fig. 1.
The present disclosure proposes a loading method of a dynamic presentation interface. The loading method depends on a terminal device that comprises a display, a memory, a processor, and a computer program stored in the memory and operable on the processor; the computer program comprises the images to be loaded and interface layout parameters. The interface layout parameters are the relative position information of each required image on the user interface, the size of its display interval, and the switching-state transitions of the user interface, i.e., the parameters for switching from one user interface to another;
the loading method comprises the following steps:
S100: the processor starts the computer program and reads the images to be loaded on the user interface according to the interface layout parameters;
S200: the processor performs interface assembly according to the relative position information of each image on the user interface and the size of its display interval in the interface layout parameters, so as to form a first user interface;
S300: performing edge transition on the first user interface to obtain a second user interface;
S400: adapting the second user interface according to the size of the display screen;
S500: performing interface rendering and drawing the second user interface so as to display the second user interface on the display.
Further, in S200, the method by which the processor performs interface assembly, according to the relative position information of each image on the user interface and the size of its display interval in the interface layout parameters, to form the first user interface comprises the following steps:
S201: the processor reads the images to be loaded;
S202: the images to be loaded are loaded into the display-interval areas of the user interface according to the relative position information;
S203: a first user interface is obtained.
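As a non-limiting sketch of S201 to S203, the assembly step can be illustrated in Python with NumPy. The canvas size, the image names, and the (top, left, height, width) layout tuples are illustrative assumptions and not part of the disclosure; pixels never covered by an image remain blank.

```python
import numpy as np

def assemble_interface(canvas_h, canvas_w, images, layout):
    """Assemble a first user interface by pasting each image into its
    display interval on a blank canvas.

    `images` maps a name to an HxW numpy array; `layout` maps the same
    name to (top, left, height, width): the relative position and the
    size of the display interval read from the interface layout
    parameters (names and tuple order are illustrative assumptions).
    Uncovered pixels stay 0, i.e. the blank area."""
    canvas = np.zeros((canvas_h, canvas_w), dtype=np.uint8)
    for name, (top, left, h, w) in layout.items():
        img = images[name]
        # Clip the image to the size of its display interval before pasting.
        canvas[top:top + h, left:left + w] = img[:h, :w]
    return canvas

# Two 2x2 tiles placed side by side with a one-pixel blank gap between them.
imgs = {"a": np.full((2, 2), 10, np.uint8), "b": np.full((2, 2), 20, np.uint8)}
layout = {"a": (0, 0, 2, 2), "b": (0, 3, 2, 2)}
ui = assemble_interface(4, 6, imgs, layout)
```

The blank gap between the tiles is the region that the edge transition of S300 subsequently smooths over.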
Further, in S300, the method for performing edge transition on the first user interface to obtain the second user interface includes the following steps:
the first user interface is denoted by I, phi is the first userThe already loaded image area in the interface, Ω is the blank area in the first user interface except for the boundary,for the boundary of each image according to the interface layout parameters in the first user interface, p is the pixel point with the largest transition value on the boundary, ψ p Is a rectangular domain centered on the p point, n p For p-point and->Orthogonal unit vectors;
S301: mark the boundary ∂Ω according to the relative position information of each required image on the user interface and the size of its display interval in the interface layout parameters;
S302: calculate the transition value of every pixel point on the boundary: for any pixel point p on the boundary ∂Ω, determine a rectangular domain Ψ_p centered on the point p; the transition value P(p) of the point p is calculated as:
P(p) = C(p) × D(p);
wherein C(p) is a confidence, defined as:
C(p) = ( Σ_{q ∈ Ψ_p ∩ Φ} C(q) ) / |Ψ_p|,
where |Ψ_p| is the total number of pixel points in the rectangular domain;
the initial values of C(p) are as follows: C(p) = 0 for all pixels in Ω, and C(p) = 1 for all pixels in Φ;
D(p) is the data item of a pixel, defined as:
D(p) = |∇I_p⊥ · n_p| / α,
wherein α is a normalization factor (the maximum pixel value, e.g., 255 for 8-bit images); n_p is the unit vector orthogonal to the boundary ∂Ω at the point p; ∇I_p is the gradient vector at the point p, i.e., the direction in which the color change is greatest, and ∇I_p⊥ is its orthogonal vector (the isophote direction);
the data item of the pixel can carry out smooth transition along the gradient vector direction, because in the gradient vector direction, the data item of the pixel takes the maximum value, and the value of C (p) is the minimum near the point p as the center, so that the transition value is correspondingly small; so that noise of the first user interface is suppressed when smoothly transitioning along the gradient vector direction, so that texture of each image edge in the first user interface presents smooth continuity;
S303: construct a transition block Ψ_p centered on the pixel point with the largest transition value, and scan the already-loaded image area Φ in the first user interface for the matching module Ψ_q: find the module Ψ′_q most similar to Ψ_p, i.e., the module with the smallest Euclidean distance:
Ψ′_q = argmin_{Ψ_q ⊂ Φ} d(Ψ_p, Ψ_q),
wherein the Euclidean distance is d(Ψ_p, Ψ_q) = Σ_{i,j} ( Ψ_p(i, j) − Ψ_q(i, j) )², and i, j are the horizontal and vertical coordinate values of a pixel within the modules Ψ_p and Ψ_q;
S304: update the confidence: C(q′) = C(p) for every pixel q′ ∈ Ψ_p ∩ Ω;
S305: copy the corresponding pixel points of Ψ′_q into Ψ_p;
S306: iterate steps S302 to S305, stopping once every pixel point on the boundary has undergone the edge transition; performing the edge transition on the first user interface in this way yields the second user interface.
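Steps S303 to S305, scanning Φ for the module with the smallest Euclidean distance and copying its pixels into the transition block, can be sketched as follows. The brute-force scan, the mask convention (True means already loaded), and the toy ramp image are assumptions for illustration, not the disclosed implementation.

```python
import numpy as np

def best_match(I, mask, p, half):
    """S303: scan the already-loaded area (mask == True) for the module
    Psi_q most similar to the transition block Psi_p, i.e. the one with
    the smallest sum-of-squared-differences distance over the pixels of
    Psi_p that are already loaded."""
    y, x = p
    tgt = I[y - half:y + half + 1, x - half:x + half + 1].astype(float)
    known = mask[y - half:y + half + 1, x - half:x + half + 1]
    best, best_d = None, np.inf
    h, w = I.shape
    for qy in range(half, h - half):
        for qx in range(half, w - half):
            cand_mask = mask[qy - half:qy + half + 1, qx - half:qx + half + 1]
            if not cand_mask.all():        # candidate must lie wholly in Phi
                continue
            cand = I[qy - half:qy + half + 1, qx - half:qx + half + 1].astype(float)
            d = ((tgt - cand)[known] ** 2).sum()
            if d < best_d:
                best, best_d = (qy, qx), d
    return best

def fill_patch(I, mask, p, q, half):
    """S305: copy the pixels of the best match Psi_q into the unloaded
    part of Psi_p, and mark them as loaded."""
    y, x = p
    qy, qx = q
    for dy in range(-half, half + 1):
        for dx in range(-half, half + 1):
            if not mask[y + dy, x + dx]:
                I[y + dy, x + dx] = I[qy + dy, qx + dx]
                mask[y + dy, x + dx] = True

# Toy example: the columns form a smooth ramp; one pixel at (2, 3) is blank.
I = np.tile(np.arange(6) * 10.0, (6, 1))
mask = np.ones((6, 6), dtype=bool)
I[2, 3] = 0.0
mask[2, 3] = False
q = best_match(I, mask, (2, 3), half=1)
fill_patch(I, mask, (2, 3), q, half=1)
```

Because the ramp repeats down each column, the best match reproduces the missing value exactly, which is the effect the edge transition aims for: copied texture that continues the surrounding structure.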
Further, in S400, the method for adapting the second user interface according to the size of the display screen is: scale the second user interface to the size (screen width × height) of the display screen of the terminal device.
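A minimal sketch of the S400 adaptation, assuming simple nearest-neighbour scaling to the screen size (the disclosure does not specify the interpolation method, so this choice is an assumption):

```python
import numpy as np

def adapt_to_screen(ui, screen_h, screen_w):
    """S400: scale the second user interface to the terminal's display
    size (screen width x height) by nearest-neighbour sampling of the
    source rows and columns."""
    h, w = ui.shape[:2]
    rows = np.arange(screen_h) * h // screen_h   # source row for each screen row
    cols = np.arange(screen_w) * w // screen_w   # source column for each screen column
    return ui[rows][:, cols]

ui = np.arange(4, dtype=np.uint8).reshape(2, 2)  # a tiny 2x2 interface
scaled = adapt_to_screen(ui, 4, 4)               # a 4x4 "screen"
```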
Further, in S500, the method of performing the interface rendering and drawing the second user interface so as to display the second user interface on the display is: display and output the second user interface on the display of any one of a tablet computer, a mobile phone, a smart home appliance, a navigation locator, a food-ordering terminal, and a multimedia player.
The embodiment of the disclosure provides a loading system of a dynamic demonstration interface; fig. 2 shows a structural diagram of the loading system of the dynamic demonstration interface of the disclosure. The loading system of the dynamic demonstration interface of this embodiment comprises: a processor, a memory, and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the loading method of a dynamic presentation interface described above when executing the computer program.
The system comprises: a memory, a processor, and a computer program stored in the memory and executable on the processor, the processor executing the computer program to run the following system units:
the image reading unit is used for starting a computer program to read the image required to be loaded by the user interface according to the interface layout parameters by the processor;
the interface assembly unit is used for carrying out interface assembly to form a first user interface by the processor according to the relative position information of each image on the user interface in the interface layout parameters and the size of the display interval;
the edge transition unit is used for carrying out edge transition on the first user interface to obtain a second user interface;
a size adapting unit for adapting the second user interface according to the size of the display screen;
and the display output unit is used for carrying out interface rendering and drawing the second user interface so as to display the second user interface in the display.
The loading system of the dynamic demonstration interface can run on computing devices such as desktop computers, notebook computers, palmtop computers, and cloud servers. The running system may comprise, but is not limited to, a processor and a memory. It will be appreciated by those skilled in the art that this example is merely an example of a loading system of a dynamic presentation interface and does not limit it; the system may include more or fewer components than shown, may combine certain components, or may use different components; for example, the loading system of the dynamic presentation interface may further include input and output devices, network access devices, buses, and the like. The processor may be a central processing unit (Central Processing Unit, CPU), another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor; the processor is the control center of the operating system of the loading system of the dynamic presentation interface, and uses various interfaces and lines to connect the various parts of the entire loading system.
The memory may be used to store the computer program and/or modules, and the processor implements the various functions of the loading system of the dynamic presentation interface by running or executing the computer program and/or modules stored in the memory and invoking the data stored in the memory. The memory may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system and the application programs required for at least one function (such as a sound playing function or an image playing function), and the data storage area may store data created according to use (such as audio data or a phonebook). In addition, the memory may include high-speed random access memory, and may also include non-volatile memory such as a hard disk, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a Flash Card, at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
While the present disclosure has been described in considerable detail and with particularity with respect to several described embodiments, it is not intended to be limited to any such detail or embodiment or to any particular embodiment; rather, the appended claims are to be construed broadly, in view of the prior art, so as to effectively encompass the intended scope of the disclosure. Furthermore, the foregoing describes the disclosure in terms of embodiments foreseen by the inventor in order to provide an enabling description, even though insubstantial modifications of the disclosure, not presently foreseen, may nonetheless represent equivalents thereof.