CN111309427A - Interface rendering method and device, computer readable storage medium and terminal equipment - Google Patents

Interface rendering method and device, computer readable storage medium and terminal equipment

Info

Publication number
CN111309427A
CN111309427A
Authority
CN
China
Prior art keywords
rendering
display interface
mirror image
language family
rendering result
Prior art date
Legal status
Pending
Application number
CN202010112868.6A
Other languages
Chinese (zh)
Inventor
刘继祖
陈凌锋
熊友军
Current Assignee
Shenzhen Ubtech Technology Co ltd
Original Assignee
Shenzhen Ubtech Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Ubtech Technology Co ltd filed Critical Shenzhen Ubtech Technology Co ltd
Priority to CN202010112868.6A priority Critical patent/CN111309427A/en
Publication of CN111309427A publication Critical patent/CN111309427A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • G06F9/454Multi-language systems; Localisation; Internationalisation

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application belongs to the field of computer technologies, and in particular relates to an interface rendering method and apparatus, a computer-readable storage medium, and a terminal device. The method includes: obtaining a language family category from a preset configuration file; if the language family category is a preset first language family, rendering a display interface according to a preset reference rendering mode, wherein the first language family is a language family typeset from left to right; and if the language family category is a preset second language family, rendering the display interface according to a preset mirror image rendering mode, wherein the second language family is a language family typeset from right to left. With the embodiments of the present application, interface adaptation no longer requires building two sets of interfaces: through mirror image rendering, a single set of interfaces can be adapted to the display rules of different language families, which greatly reduces the workload of interface development, maintenance, modification and testing.

Description

Interface rendering method and device, computer readable storage medium and terminal equipment
Technical Field
The present application belongs to the field of computer technologies, and in particular, to an interface rendering method and apparatus, a computer-readable storage medium, and a terminal device.
Background
Among current languages, most languages, represented by Chinese and English, are typeset from left to right, while other languages, represented by Arabic, are typeset from right to left. In the prior art, when adapting each language family in a software system (including but not limited to Unity and Unreal Engine), two sets of interfaces have to be built because no official support scheme is provided, and the corresponding interface is selected for adaptation according to the language family. For a large-scale application, the development effort and maintenance cost are huge, and both sets of interfaces need to be modified every time the interface is changed. Meanwhile, this brings a huge testing workload and is not conducive to project development.
Disclosure of Invention
In view of this, embodiments of the present application provide an interface rendering method and apparatus, a computer-readable storage medium, and a terminal device, so as to solve the problem in the prior art that interface adaptation requires building two sets of interfaces, which makes the workload of development, maintenance, modification, and testing very large.
A first aspect of an embodiment of the present application provides an interface rendering method, which may include:
obtaining language family categories in a preset configuration file;
if the language family category is a preset first language family, rendering a display interface according to a preset reference rendering mode, wherein the first language family is a language family typeset according to a left-to-right sequence;
and if the language family category is a preset second language family, rendering the display interface according to a preset mirror image rendering mode, wherein the second language family is a language family typeset from right to left.
Further, the rendering the display interface according to a preset mirror image rendering mode includes:
rendering the display interface according to the reference rendering mode to obtain a first rendering result of the display interface;
carrying out integral mirror image processing on the first rendering result to obtain a second rendering result of the display interface;
performing mirror image processing on each character in the second rendering result respectively to obtain a third rendering result of the display interface;
performing mirror image processing on each icon which does not represent the direction in the third rendering result to obtain a fourth rendering result of the display interface;
and reordering the digital sequences in the fourth rendering result respectively to obtain a fifth rendering result of the display interface.
Further, the performing integral mirror image processing on the first rendering result to obtain a second rendering result of the display interface includes:
determining an initial position of a target pixel point in the first rendering result, wherein the target pixel point is any one pixel point in the display interface;
calculating the mirror image position of the target pixel point in the second rendering result according to the initial position and the display length of the display interface;
and displaying the target pixel point at the mirror image position.
Further, the calculating the mirror image position of the target pixel point in the second rendering result according to the initial position and the display length of the display interface includes:
calculating the mirror image position of the target pixel point in the second rendering result according to the following formula:
MirrorPos=(x2,y2)
OriginPos=(x1,y1)
x2=Len-x1
y2=y1
wherein OriginPos is the initial position, x1 is the abscissa of initial position, y1 is the ordinate of initial position, MirrorPos is the mirror image position, x2 is the abscissa of mirror image position, y2 is the ordinate of mirror image position, Len is the display length of display interface.
Further, the reordering the respective number sequences in the fourth rendering result to obtain a fifth rendering result of the display interface includes:
performing regular matching in the fourth rendering result by using a preset regular expression to obtain each digital sequence in the fourth rendering result;
and respectively carrying out reverse order processing on each digital sequence to obtain a fifth rendering result of the display interface.
A second aspect of an embodiment of the present application provides an interface rendering apparatus, which may include:
the language family category acquisition module is used for acquiring the language family categories in a preset configuration file;
the first rendering module is used for rendering the display interface according to a preset reference rendering mode if the language family category is a preset first language family, wherein the first language family is a language family typeset according to a left-to-right sequence;
and the second rendering module is used for rendering the display interface according to a preset mirror image rendering mode if the language family category is a preset second language family, wherein the second language family is a language family typeset from right to left.
Further, the second rendering module may include:
the first processing unit is used for rendering the display interface according to the reference rendering mode to obtain a first rendering result of the display interface;
the second processing unit is used for carrying out integral mirror image processing on the first rendering result to obtain a second rendering result of the display interface;
the third processing unit is used for respectively carrying out mirror image processing on each character in the second rendering result to obtain a third rendering result of the display interface;
a fourth processing unit, configured to perform mirror image processing on each icon that does not represent a direction in the third rendering result, to obtain a fourth rendering result of the display interface;
and the fifth processing unit is used for reordering the digital sequences in the fourth rendering result respectively to obtain a fifth rendering result of the display interface.
Further, the second processing unit may include:
an initial position determining subunit, configured to determine an initial position of a target pixel point in the first rendering result, where the target pixel point is any one pixel point in the display interface;
a mirror image position calculating subunit, configured to calculate, according to the initial position and the display length of the display interface, a mirror image position of the target pixel point in the second rendering result;
and the mirror image position display subunit is used for displaying the target pixel point at the mirror image position.
Further, the mirror image position calculation subunit is specifically configured to calculate a mirror image position of the target pixel point in the second rendering result according to the following formula:
MirrorPos=(x2,y2)
OriginPos=(x1,y1)
x2=Len-x1
y2=y1
wherein OriginPos is the initial position, x1 is the abscissa of initial position, y1 is the ordinate of initial position, MirrorPos is the mirror image position, x2 is the abscissa of mirror image position, y2 is the ordinate of mirror image position, Len is the display length of display interface.
Further, the fifth processing unit may include:
the regular matching subunit is configured to perform regular matching in the fourth rendering result by using a preset regular expression to obtain each digital sequence in the fourth rendering result;
and the reverse order processing subunit is configured to perform reverse order processing on each digital sequence, respectively, to obtain a fifth rendering result of the display interface.
A third aspect of embodiments of the present application provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the computer program implements the steps of any one of the above-mentioned interface rendering methods.
A fourth aspect of the embodiments of the present application provides a terminal device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of any one of the above interface rendering methods when executing the computer program.
A fifth aspect of the embodiments of the present application provides a computer program product, which, when running on a terminal device, causes the terminal device to execute the steps of any one of the interface rendering methods described above.
Compared with the prior art, the embodiments of the present application have the following advantages: a language family category is obtained from a preset configuration file; if the language family category is a preset first language family, a display interface is rendered according to a preset reference rendering mode, wherein the first language family is a language family typeset from left to right; and if the language family category is a preset second language family, the display interface is rendered according to a preset mirror image rendering mode, wherein the second language family is a language family typeset from right to left. With the embodiments of the present application, interface adaptation no longer requires building two sets of interfaces: through mirror image rendering, a single set of interfaces can be adapted to the display rules of different language families. Because the adaptation is performed only at rendering time, interface developers do not need to care about the adaptation rules of the various language families and can focus on developing specific functions, which greatly reduces the workload of interface development, maintenance, modification and testing.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the embodiments or the prior art descriptions will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise.
FIG. 1 is a flowchart of an embodiment of an interface rendering method according to an embodiment of the present application;
FIG. 2 is a diagram illustrating results of rendering a display interface according to a reference rendering manner;
FIG. 3 is a diagram illustrating results of rendering a display interface in a mirror rendering manner;
FIG. 4 is a schematic view of a process for rendering the display interface in a mirror rendering manner;
FIG. 5 is a schematic diagram of a first rendering result of a display interface;
FIG. 6 is a diagram illustrating a second rendering result of the interface;
FIG. 7 is a diagram illustrating a third rendering result of the interface;
FIG. 8 is a diagram illustrating a fourth rendering result of the interface;
FIG. 9 is a diagram illustrating a fifth rendering result of the display interface;
fig. 10 is a block diagram of an embodiment of an interface rendering apparatus according to an embodiment of the present application;
fig. 11 is a schematic block diagram of a terminal device in an embodiment of the present application.
Detailed Description
In order to make the objects, features and advantages of the present invention more apparent and understandable, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the embodiments described below are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon" or "in response to a determination" or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [ described condition or event ] is detected" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon detecting [ described condition or event ]" or "in response to detecting [ described condition or event ]".
In addition, in the description of the present application, the terms "first," "second," "third," and the like are used solely to distinguish one from another and are not to be construed as indicating or implying relative importance.
Referring to fig. 1, an embodiment of an interface rendering method in an embodiment of the present application may include:
step S101, obtaining language family categories in a preset configuration file.
In the embodiment of the application, a user can select a required language family category through the human-computer interaction interface of the terminal device, and the selection result is stored in a configuration file on the terminal device. When the interface needs to be displayed, the terminal device may first read the language family category from the configuration file and then determine its value: if the language family category is the preset first language family, step S102 is executed; if it is the preset second language family, step S103 is executed.
The first language family is a language family typeset from left to right, including but not limited to Chinese, English and the like; the second language family is a language family typeset from right to left, including but not limited to Arabic and the like.
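As a minimal, purely illustrative sketch of this dispatch (the JSON configuration format, the key name "language_family" and the mode identifiers below are assumptions for illustration and are not specified by the application), the selection of the rendering mode could be written in Python as follows:

    import json

    # Assumed identifiers for the two language family categories.
    FIRST_LANGUAGE_FAMILY = "ltr"    # e.g. Chinese, English: typeset left to right
    SECOND_LANGUAGE_FAMILY = "rtl"   # e.g. Arabic: typeset right to left

    def select_rendering_mode(config_path: str) -> str:
        """Steps S101-S103: read the language family category from the preset
        configuration file and decide which rendering mode to apply."""
        with open(config_path, encoding="utf-8") as f:
            config = json.load(f)
        family = config.get("language_family", FIRST_LANGUAGE_FAMILY)
        if family == FIRST_LANGUAGE_FAMILY:
            return "reference"   # step S102: reference (left-to-right) rendering
        if family == SECOND_LANGUAGE_FAMILY:
            return "mirror"      # step S103: mirror image rendering
        raise ValueError(f"unknown language family category: {family}")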
Step S102, rendering the display interface according to a preset reference rendering mode.
The reference rendering mode may include: the overall layout is typeset from left to right; characters are left-aligned; numbers are typeset from left to right; and icons that do not represent a direction are laid out in the default manner. Fig. 2 is a schematic diagram of a result of rendering the display interface according to the reference rendering mode.
Step S103, rendering the display interface according to a preset mirror image rendering mode.
The mirror image rendering mode may include: the overall layout is typeset from right to left, that is, left and right are mirrored with respect to the reference rendering mode; characters are right-aligned, likewise mirrored left and right with respect to the reference rendering mode; numbers are still typeset from left to right, consistent with the reference rendering mode; and icons that do not represent a direction are laid out in the default manner, consistent with the reference rendering mode. Fig. 3 is a schematic diagram of a result of rendering the display interface according to the mirror image rendering mode.
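The display rules of the two rendering modes described above can be summarised as data; the sketch below is purely illustrative (the LayoutRules structure and its field names are assumptions, not anything defined in this application):

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class LayoutRules:
        layout_direction: str     # direction of the overall layout
        text_alignment: str       # alignment of characters
        number_direction: str     # reading direction of number sequences
        neutral_icon_layout: str  # layout of icons that do not represent a direction

    # Reference rendering mode (first language family, e.g. Chinese, English).
    REFERENCE_MODE = LayoutRules("left-to-right", "left", "left-to-right", "default")

    # Mirror image rendering mode (second language family, e.g. Arabic): the layout
    # and character alignment are mirrored, while numbers and direction-neutral
    # icons remain consistent with the reference rendering mode.
    MIRROR_MODE = LayoutRules("right-to-left", "right", "left-to-right", "default")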
It should be noted that the reference rendering mode is the default rendering mode, and the mirror image rendering mode is performed on the basis of the reference rendering mode. As shown in Fig. 4, step S103 may specifically include the following processes:
and step S1031, rendering the display interface according to the reference rendering mode to obtain a first rendering result of the display interface.
Fig. 5 is a schematic diagram illustrating a first rendering result of the display interface.
Step S1032, integral mirror image processing is carried out on the first rendering result, and a second rendering result of the display interface is obtained.
Take any one pixel point in the display interface as an example and call it the target pixel point. The initial position of the target pixel point in the first rendering result is determined first, and then the mirror image position of the target pixel point in the second rendering result is calculated according to the initial position and the display length of the display interface. Specifically, the mirror image position of the target pixel point in the second rendering result may be calculated according to the following formula:
MirrorPos=(x2,y2)
OriginPos=(x1,y1)
x2=Len-x1
y2=y1
wherein OriginPos is the initial position, x1 is the abscissa of initial position, y1 is the ordinate of initial position, MirrorPos is the mirror image position, x2 is the abscissa of mirror image position, y2 is the ordinate of mirror image position, Len is the display length of display interface.
It should be noted that, in the embodiment of the present application, a plane where a display interface is located is taken as a coordinate plane, a horizontal direction of characters in the display interface is taken as a horizontal axis direction, and a vertical direction of characters in the display interface is taken as a vertical axis direction.
After the mirror image position is obtained, the target pixel point can be displayed at the mirror image position.
All the pixel points in the display interface are traversed through the above process to obtain the second rendering result shown in Fig. 6.
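A minimal sketch of this whole mirroring step, assuming the first rendering result is available as a simple row-major pixel buffer (the buffer representation, the Pixel alias and the function name are assumptions for illustration, not the application's actual data structures):

    from typing import List

    Pixel = int  # e.g. a packed RGBA value; the concrete pixel type is assumed here

    def mirror_whole_interface(first_result: List[List[Pixel]]) -> List[List[Pixel]]:
        """Step S1032: whole mirroring of the first rendering result.
        For a target pixel at OriginPos = (x1, y1) the mirror position is
        MirrorPos = (x2, y2) with x2 = Len - x1 and y2 = y1, where Len is the
        display length of the display interface."""
        height = len(first_result)
        width = len(first_result[0]) if height else 0      # Len
        second_result = [[0] * width for _ in range(height)]
        for y1 in range(height):
            for x1 in range(width):
                x2 = (width - 1) - x1   # Len - x1, written for 0-based pixel indices
                y2 = y1                 # the ordinate is unchanged
                second_result[y2][x2] = first_result[y1][x1]
        return second_result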
Step S1033, performing mirror image processing on each character in the second rendering result, respectively, to obtain a third rendering result of the display interface.
Take any character in the display interface as an example and call it the target character, and select any pixel point in the area where the target character is located as the target pixel point. The initial position of the target pixel point in the second rendering result is determined first, and then the mirror image position of the target pixel point in the third rendering result is calculated according to the initial position, the left boundary abscissa of the target character and the display length of the target character. Specifically, the mirror image position of the target pixel point in the third rendering result may be calculated according to the following formula:
MirrorPos=(x2,y2)
OriginPos=(x1,y1)
x2=2*xL+CLen-x1
y2=y1
wherein OriginPos is the initial position, x1 is the abscissa of the initial position, y1 is the ordinate of the initial position, MirrorPos is the mirror image position, x2 is the abscissa of the mirror image position, y2 is the ordinate of the mirror image position, xL is the left boundary abscissa of the target character, and CLen is the display length of the target character.
After the mirror image position is obtained, the target pixel point can be displayed at the mirror image position.
All pixel points in the target character are traversed through the above process to obtain the mirror image result of the target character, and the above mirror image processing is performed on each character in the second rendering result to obtain the third rendering result shown in Fig. 7.
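A corresponding sketch for the per-character mirroring (step S1034 applies the same formula to icons that do not represent a direction), again over an assumed row-major pixel buffer; the bounding-box parameters used to describe the character region are hypothetical and only serve the illustration:

    from typing import List

    Pixel = int  # assumed packed pixel value, as in the previous sketch

    def mirror_region(result: List[List[Pixel]], x_left: int, region_width: int,
                      y_top: int, region_height: int) -> List[List[Pixel]]:
        """Steps S1033/S1034: mirror one character (or direction-neutral icon)
        within its own bounding box. For a target pixel at OriginPos = (x1, y1)
        the mirror position is MirrorPos = (x2, y2) with x2 = 2*xL + CLen - x1
        and y2 = y1, where xL is the left boundary abscissa and CLen the display
        length of the character (or icon)."""
        mirrored = [row[:] for row in result]   # copy so the input stays intact
        for y1 in range(y_top, y_top + region_height):
            for x1 in range(x_left, x_left + region_width):
                # 2*xL + CLen - x1, written for 0-based pixel indices
                x2 = 2 * x_left + (region_width - 1) - x1
                mirrored[y1][x2] = result[y1][x1]
        return mirrored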
Step S1034 is to perform mirror image processing on each icon not representing a direction in the third rendering result, so as to obtain a fourth rendering result of the display interface.
Take any icon in the display interface that does not represent a direction as an example and call it the target icon, and select any pixel point in the area where the target icon is located as the target pixel point. The initial position of the target pixel point in the third rendering result is determined first, and then the mirror image position of the target pixel point in the fourth rendering result is calculated according to the initial position, the left boundary abscissa of the target icon and the display length of the target icon. Specifically, the mirror image position of the target pixel point in the fourth rendering result may be calculated according to the following formula:
MirrorPos=(x2,y2)
OriginPos=(x1,y1)
x2=2*xL+CLen-x1
y2=y1
wherein OriginPos is the initial position, x1 is the abscissa of the initial position, y1 is the ordinate of the initial position, MirrorPos is the mirror position, x2 is the abscissa of the mirror position, y2 is the ordinate of the mirror position, xL is the left boundary abscissa of the target icon, and CLen is the display length of the target icon.
After the mirror image position is obtained, the target pixel point can be displayed at the mirror image position.
All pixel points in the target icon are traversed through the above process to obtain the mirror image result of the target icon, and the above mirror image processing is performed on each icon that does not represent a direction in the third rendering result to obtain the fourth rendering result shown in Fig. 8.
Step S1035, re-ordering each number sequence in the fourth rendering result, respectively, to obtain a fifth rendering result of the display interface.
Specifically, a preset regular expression may first be used to perform regular matching on the fourth rendering result to obtain each number sequence in the fourth rendering result, and then each number sequence is subjected to reverse order processing. This yields the fifth rendering result shown in Fig. 9, which is the final result of rendering the display interface according to the mirror image rendering mode.
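A minimal sketch of this reordering step, assuming the text content of the fourth rendering result is available as plain strings (the regular expression used below, r"\d+", is an assumed example of the preset regular expression; the application does not specify it):

    import re

    # Assumed preset regular expression: one or more consecutive digits.
    NUMBER_SEQUENCE = re.compile(r"\d+")

    def reorder_number_sequences(text: str) -> str:
        """Step S1035: match each number sequence in the mirrored text and
        reverse it, so that digits are read from left to right again."""
        return NUMBER_SEQUENCE.sub(lambda m: m.group(0)[::-1], text)

    # Usage example: after the mirroring steps a sequence such as "2021" would
    # appear in reverse display order; the reverse order processing restores it.
    print(reorder_number_sequences("1202"))   # prints "2021"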
To sum up, in the embodiments of the present application, a language family category is obtained from a preset configuration file; if the language family category is a preset first language family, the display interface is rendered according to a preset reference rendering mode, wherein the first language family is a language family typeset from left to right; and if the language family category is a preset second language family, the display interface is rendered according to a preset mirror image rendering mode, wherein the second language family is a language family typeset from right to left. With the embodiments of the present application, interface adaptation no longer requires building two sets of interfaces: through mirror image rendering, a single set of interfaces can be adapted to the display rules of different language families. Because the adaptation is performed only at rendering time, interface developers do not need to care about the adaptation rules of the various language families and can focus on developing specific functions, which greatly reduces the workload of interface development, maintenance, modification and testing.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 10 shows a structure diagram of an embodiment of an interface rendering apparatus according to an embodiment of the present application, corresponding to an interface rendering method according to the foregoing embodiment.
In this embodiment, an interface rendering apparatus may include:
a language family category obtaining module 1001, configured to obtain a language family category in a preset configuration file;
a first rendering module 1002, configured to render the display interface according to a preset reference rendering manner if the language family category is a preset first language family, where the first language family is a language family typeset according to a left-to-right sequence;
a second rendering module 1003, configured to, if the language family category is a preset second language family, render the display interface according to a preset mirror rendering manner, where the second language family is a language family typeset according to a sequence from right to left.
Further, the second rendering module may include:
the first processing unit is used for rendering the display interface according to the reference rendering mode to obtain a first rendering result of the display interface;
the second processing unit is used for carrying out integral mirror image processing on the first rendering result to obtain a second rendering result of the display interface;
the third processing unit is used for respectively carrying out mirror image processing on each character in the second rendering result to obtain a third rendering result of the display interface;
a fourth processing unit, configured to perform mirror image processing on each icon that does not represent a direction in the third rendering result, to obtain a fourth rendering result of the display interface;
and the fifth processing unit is used for reordering the digital sequences in the fourth rendering result respectively to obtain a fifth rendering result of the display interface.
Further, the second processing unit may include:
an initial position determining subunit, configured to determine an initial position of a target pixel point in the first rendering result, where the target pixel point is any one pixel point in the display interface;
a mirror image position calculating subunit, configured to calculate, according to the initial position and the display length of the display interface, a mirror image position of the target pixel point in the second rendering result;
and the mirror image position display subunit is used for displaying the target pixel point at the mirror image position.
Further, the mirror image position calculation subunit is specifically configured to calculate a mirror image position of the target pixel point in the second rendering result according to the following formula:
MirrorPos=(x2,y2)
OriginPos=(x1,y1)
x2=Len-x1
y2=y1
wherein OriginPos is the initial position, x1 is the abscissa of initial position, y1 is the ordinate of initial position, MirrorPos is the mirror image position, x2 is the abscissa of mirror image position, y2 is the ordinate of mirror image position, Len is the display length of display interface.
Further, the fifth processing unit may include:
the regular matching subunit is configured to perform regular matching in the fourth rendering result by using a preset regular expression to obtain each digital sequence in the fourth rendering result;
and the reverse order processing subunit is configured to perform reverse order processing on each digital sequence, respectively, to obtain a fifth rendering result of the display interface.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses, modules and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Fig. 11 shows a schematic block diagram of a terminal device provided in an embodiment of the present application, and only shows a part related to the embodiment of the present application for convenience of description.
As shown in fig. 11, the terminal device 11 of this embodiment includes: a processor 110, a memory 111 and a computer program 112 stored in said memory 111 and executable on said processor 110. The processor 110 executes the computer program 112 to implement the steps in the above-mentioned interface rendering method embodiments, such as the steps S101 to S103 shown in fig. 1. Alternatively, the processor 110, when executing the computer program 112, implements the functions of each module/unit in each device embodiment, for example, the functions of the modules 1001 to 1003 shown in fig. 10.
Illustratively, the computer program 112 may be partitioned into one or more modules/units that are stored in the memory 111 and executed by the processor 110 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used for describing the execution process of the computer program 112 in the terminal device 11.
The terminal device 11 may be a computing device such as a smart phone, a tablet computer, a smart watch/bracelet, or smart glasses. Those skilled in the art will appreciate that Fig. 11 is only an example of the terminal device 11 and does not constitute a limitation on it; the terminal device 11 may include more or fewer components than those shown, or combine some components, or use different components. For example, the terminal device 11 may further include an input-output device, a network access device, a bus, and the like.
The Processor 110 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 111 may be an internal storage unit of the terminal device 11, such as a hard disk or a memory of the terminal device 11. The memory 111 may also be an external storage device of the terminal device 11, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which is provided on the terminal device 11. Further, the memory 111 may also include both an internal storage unit and an external storage device of the terminal device 11. The memory 111 is used for storing the computer program and other programs and data required by the terminal device 11. The memory 111 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow in the method of the embodiments described above can be realized by a computer program, which can be stored in a computer-readable storage medium and can realize the steps of the embodiments of the methods described above when the computer program is executed by a processor. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, recording medium, usb disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), electrical carrier wave signals, telecommunications signals, software distribution medium, etc. It should be noted that the computer readable medium may contain content that is subject to appropriate increase or decrease as required by legislation and patent practice in jurisdictions, for example, in some jurisdictions, computer readable media does not include electrical carrier signals and telecommunications signals as is required by legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. An interface rendering method, comprising:
obtaining language family categories in a preset configuration file;
if the language family category is a preset first language family, rendering a display interface according to a preset reference rendering mode, wherein the first language family is a language family typeset according to a left-to-right sequence;
and if the language family category is a preset second language family, rendering the display interface according to a preset mirror image rendering mode, wherein the second language family is a language family typeset from right to left.
2. The interface rendering method according to claim 1, wherein the rendering the display interface in a preset mirror image rendering manner comprises:
rendering the display interface according to the reference rendering mode to obtain a first rendering result of the display interface;
carrying out integral mirror image processing on the first rendering result to obtain a second rendering result of the display interface;
performing mirror image processing on each character in the second rendering result respectively to obtain a third rendering result of the display interface;
performing mirror image processing on each icon which does not represent the direction in the third rendering result to obtain a fourth rendering result of the display interface;
and reordering the digital sequences in the fourth rendering result respectively to obtain a fifth rendering result of the display interface.
3. The interface rendering method according to claim 2, wherein the performing the integral mirroring on the first rendering result to obtain the second rendering result of the display interface includes:
determining an initial position of a target pixel point in the first rendering result, wherein the target pixel point is any one pixel point in the display interface;
calculating the mirror image position of the target pixel point in the second rendering result according to the initial position and the display length of the display interface;
and displaying the target pixel point at the mirror image position.
4. The interface rendering method of claim 3, wherein the calculating the mirror position of the target pixel point in the second rendering result according to the initial position and the display length of the display interface comprises:
calculating the mirror image position of the target pixel point in the second rendering result according to the following formula:
MirrorPos=(x2,y2)
OriginPos=(x1,y1)
x2=Len-x1
y2=y1
wherein OriginPos is the initial position, x1 is the abscissa of initial position, y1 is the ordinate of initial position, MirrorPos is the mirror image position, x2 is the abscissa of mirror image position, y2 is the ordinate of mirror image position, Len is the display length of display interface.
5. The interface rendering method according to any one of claims 2 to 4, wherein the reordering the respective numerical sequences in the fourth rendering result to obtain a fifth rendering result of the display interface comprises:
performing regular matching in the fourth rendering result by using a preset regular expression to obtain each digital sequence in the fourth rendering result;
and respectively carrying out reverse order processing on each digital sequence to obtain a fifth rendering result of the display interface.
6. An interface rendering apparatus, comprising:
the language family category acquisition module is used for acquiring the language family categories in a preset configuration file;
the first rendering module is used for rendering the display interface according to a preset reference rendering mode if the language family category is a preset first language family, wherein the first language family is a language family typeset according to a left-to-right sequence;
and the second rendering module is used for rendering the display interface according to a preset mirror image rendering mode if the language family category is a preset second language family, wherein the second language family is a language family typeset from right to left.
7. The interface rendering apparatus of claim 6, wherein the second rendering module comprises:
the first processing unit is used for rendering the display interface according to the reference rendering mode to obtain a first rendering result of the display interface;
the second processing unit is used for carrying out integral mirror image processing on the first rendering result to obtain a second rendering result of the display interface;
the third processing unit is used for respectively carrying out mirror image processing on each character in the second rendering result to obtain a third rendering result of the display interface;
a fourth processing unit, configured to perform mirror image processing on each icon that does not represent a direction in the third rendering result, to obtain a fourth rendering result of the display interface;
and the fifth processing unit is used for reordering the digital sequences in the fourth rendering result respectively to obtain a fifth rendering result of the display interface.
8. The interface rendering apparatus of claim 7, wherein the second processing unit comprises:
an initial position determining subunit, configured to determine an initial position of a target pixel point in the first rendering result, where the target pixel point is any one pixel point in the display interface;
a mirror image position calculating subunit, configured to calculate, according to the initial position and the display length of the display interface, a mirror image position of the target pixel point in the second rendering result;
and the mirror image position display subunit is used for displaying the target pixel point at the mirror image position.
9. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the interface rendering method according to any one of claims 1 to 5.
10. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the interface rendering method according to any one of claims 1 to 5 when executing the computer program.
CN202010112868.6A 2020-02-24 2020-02-24 Interface rendering method and device, computer readable storage medium and terminal equipment Pending CN111309427A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010112868.6A CN111309427A (en) 2020-02-24 2020-02-24 Interface rendering method and device, computer readable storage medium and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010112868.6A CN111309427A (en) 2020-02-24 2020-02-24 Interface rendering method and device, computer readable storage medium and terminal equipment

Publications (1)

Publication Number Publication Date
CN111309427A (en) 2020-06-19

Family

ID=71149215

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010112868.6A Pending CN111309427A (en) 2020-02-24 2020-02-24 Interface rendering method and device, computer readable storage medium and terminal equipment

Country Status (1)

Country Link
CN (1) CN111309427A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020143825A1 (en) * 2001-03-27 2002-10-03 Microsoft Corporation Ensuring proper rendering order of bidirectionally rendered text
US20040225967A1 (en) * 2000-10-30 2004-11-11 Microsoft Corporation System and method for user interface mirroring utilizing a layout manager
US20080317347A1 (en) * 2007-06-20 2008-12-25 Chee Boon Lim Rendering engine test system
US7506255B1 (en) * 2004-02-17 2009-03-17 Microsoft Corporation Display of text in a multi-lingual environment
CN103024583A (en) * 2012-12-26 2013-04-03 新奥特(北京)视频技术有限公司 Data display method and data display device
CN104106079A (en) * 2011-09-09 2014-10-15 帕那莫夫公司 Image processing system and method
US20160378514A1 (en) * 2015-06-24 2016-12-29 International Business Machines Corporation Automated testing of gui mirroring
CN109413131A (en) * 2018-04-28 2019-03-01 武汉思普崚技术有限公司 A kind of method and device of log parsing

Similar Documents

Publication Publication Date Title
CN107451244B (en) Folder naming method, mobile terminal and computer readable storage medium
CN109933264B (en) Graphic data display method and device
CN112965645B (en) Page dragging method and device, computer equipment and storage medium
CN112181386B (en) Code construction method, device and terminal based on software continuous integration
CN114116441A (en) UI (user interface) testing method and device, electronic equipment and storage medium
CN111782758A (en) Drawing review result viewing method based on CAD and related device
CN113192639A (en) Training method, device and equipment of information prediction model and storage medium
CN109358927B (en) Application program display method and device and terminal equipment
CN108521460B (en) Information pushing method and device, mobile terminal and computer readable storage medium
CN107103010A (en) Visualize the processing method and processing device of data
CN112596725A (en) Grading method and grading device for programming works, terminal equipment and storage medium
CN112506503A (en) Programming method, device, terminal equipment and storage medium
CN109451347A (en) A kind of special effect making method, apparatus, terminal and computer readable storage medium
CN111309427A (en) Interface rendering method and device, computer readable storage medium and terminal equipment
CN111161789A (en) Analysis method and device for key region of model prediction
CN111931794B (en) Sketch-based image matching method
CN115438129A (en) Structured data classification method and device and terminal equipment
CN112506976B (en) Data flow display method and device, electronic equipment and storage medium
CN110018828B (en) Source code checking method and device and terminal equipment
CN111506185B (en) Method, device, electronic equipment and storage medium for operating document
CN108268347B (en) Physical equipment performance testing method and device
CN111079771A (en) Method and system for extracting characteristics of click-to-read image, terminal device and storage medium
CN108595569B (en) File path copying method, file path copying device and mobile terminal
CN111258695A (en) Dial plate recommendation method and device for telephone watch and terminal equipment
CN112001988A (en) Animation effect generation method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200619