CN115145436B - Icon processing method and electronic equipment - Google Patents

Icon processing method and electronic equipment

Info

Publication number
CN115145436B
Authority
CN
China
Prior art keywords
icon
vector diagram
text
word weight
weight
Prior art date
Legal status
Active
Application number
CN202110346370.0A
Other languages
Chinese (zh)
Other versions
CN115145436A (en)
Inventor
刘爱兵
罗义
陈翔
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202110346370.0A
Publication of CN115145436A
Application granted
Publication of CN115145436B
Legal status: Active


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance, using icons
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/20 Drawing from basic elements, e.g. lines or circles

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an icon processing method and an electronic device, which address the visually disjointed appearance that results when the line thickness of icons does not match the stroke thickness of text fonts in a display interface. In the method, when the electronic device detects that the font weight of the currently displayed text is a first word weight, the lines of a first icon are displayed with a corresponding first degree of thickness. The electronic device generates this displayed first icon from the first word weight, the version of the first icon with the thinnest lines, and the version with the thickest lines, so that the displayed line thickness of the first icon is positively correlated with the font thickness of the current text. With this method, the system can generate richer icon variations for different word weights, the icons are displayed more flexibly and attractively, the display effects of fonts and icons become more unified and harmonious, the overall interface appearance is improved, and the user's viewing experience is better.

Description

Icon processing method and electronic equipment
Technical Field
The present application relates to the field of computer technologies, and in particular, to an icon processing method and an electronic device.
Background
While a terminal device such as a mobile phone or a computer is running, its display interface shows various elements such as text, backgrounds and icons, where an icon may be a function identification control that indicates a corresponding function, such as a home icon, a back icon or a Bluetooth icon.
Generally, a user can configure some of the elements shown on the display interface as required, for example by personalizing the font size, weight, typeface or color, so that those elements present different display effects. The same icon, however, is usually displayed as a fixed image or shape and does not support variation. In particular, when text and icons are adjacent, the font can be adjusted while the icon has only one display effect, so the icon display is neither attractive nor flexible enough and the user experience suffers.
Disclosure of Invention
The application provides an icon processing method and a related electronic device, which address the visually disjointed appearance caused by a mismatch between the line thickness of icons and the stroke thickness of text fonts in the display interface of the electronic device.
The above and other objects are achieved by the features of the independent claims. Further implementations are presented in the dependent claims, the description and the figures.
In a first aspect, an embodiment of the present application provides an icon processing method, where the method may include:
The electronic device displays text whose word weight is a first word weight, where the word weight characterizes the degree of font thickness. The electronic device displays a first vector diagram of a first icon, in which the lines of the first icon present a first degree of thickness; the degree of thickness presented by the lines of the first icon in the vector diagram is positively correlated with the degree of font thickness characterized by the word weight.
The electronic device detects a first operation for changing the word weight. The electronic device then displays the text with a second word weight, and displays a second vector diagram of the first icon, in which the lines of the first icon present a second degree of thickness.
If the second word weight is less than the first word weight, the second degree of thickness is less than the first degree of thickness. If the second word weight is greater than the first word weight, the second degree of thickness is greater than the first degree of thickness.
By implementing the method of the first aspect, the electronic device can, for different word weights, automatically generate icon variants with different line thicknesses from the same line outline, so that the thickness of fonts and icons stays consistent, the display effects are more uniform and harmonious, the interface display is improved, and the user's viewing experience is better.
In combination with the first aspect, in some embodiments, the first icon is displayed as a first vector image when the word weight of the text is the first word weight. When the word weight of the text is the second word weight, the first icon is displayed as a second vector diagram.
In combination with the first aspect, in some embodiments, the first vector diagram is obtained by the electronic device according to the first word weight, the third vector diagram of the first icon, and the fourth vector diagram of the first icon, where a degree of thickness of the line of the first icon in the first vector diagram is greater than or equal to a degree of thickness of the line of the first icon in the third vector diagram, and less than or equal to a degree of thickness of the line of the first icon in the fourth vector diagram. The second vector diagram is obtained by the electronic equipment according to the second word weight, the third vector diagram of the first icon and the fourth vector diagram of the first icon, wherein the thickness degree of the lines of the first icon in the second vector diagram is larger than or equal to the thickness degree of the lines of the first icon in the third vector diagram, and smaller than or equal to the thickness degree of the lines of the first icon in the fourth vector diagram.
In combination with the first aspect, in some embodiments, the word weight of the text includes a third word weight and a fourth word weight, and the degree of font thickness represented by the first word weight or the second word weight is greater than or equal to the degree of thickness represented by the third word weight and less than or equal to the degree of thickness represented by the fourth word weight.
In combination with the first aspect, in some embodiments, the first icon is displayed as a third vector image when the word weight of the text is a third word weight. When the word weight of the text is the fourth word weight, the first icon is displayed as a fourth vector diagram.
With reference to the first aspect, in some embodiments, the first vector diagram includes a first path, the second vector diagram includes a second path, the third vector diagram includes a third path, the fourth vector diagram includes a fourth path, the first path, the second path, the third path, and the fourth path correspond to a same line in the first icon, the first path is calculated by the electronic device according to the first word weight, the third path, and the fourth path, and the second path is calculated by the electronic device according to the second word weight, the third path, and the fourth path.
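As an illustrative sketch only (not the implementation claimed in this application), the per-path calculation described above can be pictured as a linear interpolation of corresponding coordinates between the path of the thinnest-line vector diagram (the third path) and the path of the thickest-line vector diagram (the fourth path), driven by a normalized word weight. The function name, the assumed weight range of 100 to 900, and the simplistic parsing of the SVG "d" attribute below are illustrative assumptions.

    import re

    # Hypothetical sketch: both paths are assumed to contain the same commands in the
    # same order and to differ only in their coordinate values; the word-weight range
    # of 100..900 is an assumed convention, not taken from this application.
    NUM = re.compile(r'-?\d+(?:\.\d+)?')

    def interpolate_path(thin_d, thick_d, word_weight, min_weight=100, max_weight=900):
        t = (word_weight - min_weight) / (max_weight - min_weight)   # normalized 0.0 .. 1.0
        thin_nums = [float(n) for n in NUM.findall(thin_d)]
        thick_nums = [float(n) for n in NUM.findall(thick_d)]
        mixed = iter(a + (b - a) * t for a, b in zip(thin_nums, thick_nums))
        # Re-emit the thin path, replacing each coordinate with its interpolated value.
        return NUM.sub(lambda m: format(next(mixed), 'g'), thin_d)

    # One horizontal bar of an icon, drawn as a closed outline in two extreme versions.
    thin  = "M10 48 L90 48 L90 52 L10 52 Z"    # thinnest lines (third vector diagram)
    thick = "M10 40 L90 40 L90 60 L10 60 Z"    # thickest lines (fourth vector diagram)
    print(interpolate_path(thin, thick, word_weight=500))   # intermediate thickness

With word_weight=500 the bar outline is reconstructed half-way between the two extremes; a larger word weight moves the outline toward the thickest version, matching the positive correlation described above.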
In combination with the first aspect, in some embodiments, the electronic device displays a user interface that includes a status bar, where the status bar displays a first text indicating the mobile operator and a first vector diagram of a first icon indicating wireless communication signal strength; the word weight of the first text is the first word weight, and the lines of the first icon present the first degree of thickness in the first vector diagram. After the electronic device detects the first operation for changing the word weight, it displays another user interface that also includes a status bar, where the status bar displays a second text indicating the mobile operator and a second vector diagram of the first icon indicating wireless communication signal strength; the word weight of the second text is the second word weight, and the lines of the first icon present the second degree of thickness in the second vector diagram.
In a second aspect, an embodiment of the present application provides an electronic device, which may include: a communication apparatus, a display screen, a memory, a processor coupled to the memory, a plurality of application programs, and one or more programs. The memory stores computer-executable instructions which, when executed by the processor, cause the electronic device to perform any of the functions performed by the electronic device in the method of the first aspect.
In a third aspect, embodiments of the present application provide a computer storage medium having stored therein a computer program comprising executable instructions which, when executed by a processor, cause the processor to perform operations corresponding to the method provided in the first aspect.
In a fourth aspect, embodiments of the present application provide a computer program product for, when run on an electronic device, causing the electronic device to perform any one of the possible implementations as in the first aspect.
In a fifth aspect, embodiments of the present application provide a chip system applicable to an electronic device, the chip comprising one or more processors for invoking computer instructions to cause the electronic device to implement any of the possible implementations as in the first aspect.
By implementing the method provided by the application, the electronic device can, for different word weights, automatically generate icon variants with different line thicknesses from the same line outline. The icons are displayed more flexibly and attractively, the display effects of fonts and icons become more unified and harmonious, the interface display is improved, the result better matches the user's browsing habits, and the user's viewing experience is further improved.
Drawings
Fig. 1 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application;
fig. 2 is a schematic diagram of a software architecture according to an embodiment of the present application;
FIG. 3 is a schematic flow chart of a technical scheme provided by the embodiment of the application;
FIG. 4 is a schematic diagram of an icon provided in an embodiment of the present application;
FIG. 5 is a schematic diagram of a user interface provided by an embodiment of the present application;
FIG. 6 is a schematic diagram of a user interface provided by an embodiment of the present application;
FIG. 7 is a schematic diagram of a user interface provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of a user interface provided by an embodiment of the present application;
FIG. 9 is a schematic diagram of a user interface provided by an embodiment of the present application;
Fig. 10 is a flowchart of an icon processing method according to an embodiment of the present application.
Detailed Description
The technical solutions of the embodiments of the present application will be clearly and thoroughly described below with reference to the accompanying drawings. In the description of the embodiments of the present application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. The term "and/or" merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate: A exists alone, both A and B exist, or B exists alone.
The terms "first," "second," and the like are used below for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature qualified by "first" or "second" may explicitly or implicitly include one or more such features. In addition, in the description of the embodiments of the application, "a plurality" means two or more.
The term "User Interface (UI)" in the following embodiments of the present application is a media interface for interaction and information exchange between an Application (APP) or an Operating System (OS) and a user, which implements conversion between an internal form of information and a form acceptable to the user. The user interface is a source code written in a specific computer language such as java, extensible markup language (extensible markup language, XML) and the like, and the interface source code is analyzed and rendered on the electronic equipment to finally be presented as content which can be identified by a user. A commonly used presentation form of a user interface is a graphical user interface (graphic user interface, GUI), which refers to a graphically displayed user interface that is related to computer operations. It may be a visual interface element of text, icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, widgets, etc., displayed in a display of the electronic device.
The term "font (typeface)", in embodiments of the present application, refers to a set of one or more fonts (font), each of which is composed of fonts (glyph) having common design features, each of which contains common design elements.
A glyph refers to the shape of a single character (a letter, a Chinese character, a symbol, a digit, etc.).
A font refers to a set of glyphs with the same style, size, and so on, such as "12-point regular Song". Each glyph of a font may have a specific height, weight, width, style, slope, italicization, ornamentation, and the like, where the height describes how tall the glyph is, the word weight describes how thick its strokes are, and the width describes how wide it is.
Font design types may include dot-matrix (bitmap) fonts, vector (outline) fonts, and the like.
A variable font is a font that contains multiple font instances and supports automatically generating rich shape variations from the initial glyph outlines. A user can freely adjust the appearance of the font in one or more dimensions, such as weight, height and width, with stepless adjustment within each dimension, and the variations of several dimensions can be superimposed. Variable fonts therefore offer far more freedom of variation, can satisfy different users' requirements for display effects, and suit different users' browsing habits.
In one example, for a conventional standard font library, the common approach is to provide the different weights, such as light, thin, regular, semi-bold, bold and so on, as six independent font libraries. If each file organizes the fonts according to the GB18030-2000 standard, the font file for each weight exceeds 8 MB, and the total size of the files for the 6 weights approaches 50 MB. By contrast, a variable font supporting weight variation can provide 800 weights with a single font file of only about 20 MB while still conforming to the GB18030-2000 encoding standard. A variable font thus not only has a variation space far exceeding that of regular fonts, but also has a far smaller file size, allowing more font variation while saving memory.
The term "icon" in the embodiments of the application refers to a mark with an indication function, which is highly condensed, conveys information quickly, and is easy to recognize and remember; icons can also assist the display of text. An icon is typically presented in the display interface of an electronic device as an image or a graphic, may appear as a picture, graphic or other object, has a standard set of size and attribute formats, and is usually small. The display effect of an icon can be changed by changing its color, saturation, transparency, and so on.
An icon helps a user quickly identify an object, quickly access it, or quickly execute a command; it can indicate a file, a program, a state, a web page, a command, and so on. For example, when a user sees the Bluetooth icon in the toolbar of a mobile phone, the user immediately recognizes that it corresponds to the Bluetooth function and can tap it to quickly turn the Bluetooth function on or off. Icon attribute types may include color, size, transparency, shadow, and the like, and the corresponding attribute parameters are, respectively, color values (such as RGB), icon size values, transparency values, shadow effect identifiers, and so on.
Icon design types may include pixel (pixel) icons, vector (vector) icons, and the like.
A pixel icon, which may also be called a bitmap icon or a dot-matrix icon, uses a bitmap as the icon. A bitmap is a pixel-array image formed by an arrangement of many pixels; its color gradation is rich, but it becomes distorted when scaled. When a bitmap is edited, the objects operated on are the individual pixels. The storage format of a bitmap may be the graphics interchange format (GIF), the portable network graphics (PNG) format, and the like.
A vector icon uses a vector graphic as the icon. A vector graphic is an image generated by mathematical vector drawing, whose content is described by geometric primitives such as points, lines, curves, polygons and circles, and it does not distort when scaled. The storage format of a vector graphic may be the scalable vector graphics (SVG) format, and the like. SVG is an XML-based markup language for describing two-dimensional vector graphics.
A vector icon may be composed of one or more elements. The basic shape elements predefined in the SVG standard include: rectangle (rect), circle (circle), ellipse (ellipse), line (line), polyline (polyline), polygon (polygon), path (path), and so on.
To name a few examples:
Rectangle (rect): a rectangle is defined by specifying the coordinates of its upper-left corner (x, y), its width and height (width, height), and optionally the corner radii (rx, ry).
Line (line): a straight line is defined by specifying a start point (x1, y1), an end point (x2, y2), and a stroke. The syntax is exemplified as follows: <line x1="127" y1="65" x2="127" y2="200" style="stroke:rgb(0,0,0);stroke-width:2"/>. If a very thin line is desired, stroke-width may be assigned a value greater than 0 and less than 1.
Circle (circle): a circle is defined by specifying the coordinates of its center (cx, cy) and its radius r. The syntax is exemplified as follows: <circle cx="143" cy="163" r="84" style="fill:rgb(192,192,255);stroke:rgb(0,0,128);stroke-width:1"/>.
Ellipse (ellipse): an ellipse is defined much like a circle, by specifying its center (cx, cy) and its radii along the X and Y axes (rx, ry). The syntax is exemplified as follows: <ellipse cx="160" cy="163" rx="101" ry="81" style="fill:rgb(192,192,255);stroke:rgb(0,0,128);stroke-width:1"/>.
Polyline (polyline): a polyline is defined by specifying the coordinates of each point; the points are connected in sequence. The syntax is exemplified as follows: <polyline points="100,200 100,20 10,200 100,20" style="stroke:rgb(64,64,64);stroke-width:1"/>.
Polygon (polygon): a polygon is defined by specifying the coordinates of successive points, which are finally closed to form the polygon. The syntax is exemplified as follows: <polygon points="250,250 297,284 279,340" style="fill:rgb(126,14,83);stroke:rgb(0,0,128);stroke-width:1"/>.
Path (path): a path defines a more complex shape, which can be a closed or an open geometry; it is the most complex element in SVG and also the most useful and most frequently used drawing command. A path node may contain several sub-paths, each representing a geometry such as a line or a curve, and the path segments defined within one path node are essentially independent of each other. The syntax is exemplified as follows: <path d="M10 20 L110 20 L110 120 L10 120" style="fill:rgb(0,0,22)"/>. The d attribute of the <path> tag describes the path data: M10 20 moves the brush to the point (10, 20), L110 20 draws a straight line from the current point to the coordinates (110, 20), and so on.
Several commonly used path commands include: the M command (M x,y), which moves the brush to the specified coordinate position (x, y); the L command (L x,y), which draws a straight line to the specified coordinate position (x, y); the H command (H x), which draws a horizontal line to the specified x coordinate; the V command (V y), which draws a vertical line to the specified y coordinate; the C command (C x1,y1 x2,y2 endx,endy), a cubic Bezier curve for which (x1, y1) and (x2, y2) are the two control points and (endx, endy) is the end point; the Q command (Q x,y endx,endy), a quadratic Bezier curve for which (x, y) is the control point and (endx, endy) is the end point; the Z command, which closes the path by connecting the end point to the start point; and so on.
As another example, for the cubic Bezier curve <path d="M50,70 C50,20 200,20 200,70"/>, a curve is drawn from the start point (50, 70) to the end point (200, 70), with (50, 20) and (200, 20) as its two control points. The first control point (50, 20), called the starting control point, determines the direction in which the curve leaves the start point: the straight line from the start point defined by M to the first control point can be regarded as the tangent of the curve at its start, and the angle it forms with the horizontal determines how the curve sets off. Likewise, the direction at the end of the curve is determined by the straight line drawn from the end coordinates (200, 70) to the second control point (200, 20), which can be regarded as the tangent of the curve at its end.
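To make the role of the two control points concrete, the following small sketch (an illustration only, not part of this application) evaluates the standard cubic Bezier formula for the example above.

    # Evaluate B(t) = (1-t)^3*P0 + 3(1-t)^2*t*P1 + 3(1-t)*t^2*P2 + t^3*P3
    # for <path d="M50,70 C50,20 200,20 200,70"/> from the example above.
    def cubic_bezier(p0, p1, p2, p3, t):
        x = (1-t)**3*p0[0] + 3*(1-t)**2*t*p1[0] + 3*(1-t)*t**2*p2[0] + t**3*p3[0]
        y = (1-t)**3*p0[1] + 3*(1-t)**2*t*p1[1] + 3*(1-t)*t**2*p2[1] + t**3*p3[1]
        return (x, y)

    P0, P1, P2, P3 = (50, 70), (50, 20), (200, 20), (200, 70)
    for t in (0.0, 0.25, 0.5, 0.75, 1.0):
        print(t, cubic_bezier(P0, P1, P2, P3, t))
    # At t=0 the curve sits at the start point (50, 70) and at t=1 at the end point (200, 70);
    # near t=0 it heads toward the first control point (50, 20), i.e. along the tangent described above.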
Since the path tag can combine multiple commands, more complex shapes can be generated.
In addition, style sheet attributes may be used to define the appearance, such as the fill color (fill), the stroke color (stroke), the stroke width (stroke-width), the overall transparency (opacity), the fill transparency (fill-opacity) and the stroke transparency (stroke-opacity), which are not described in detail here.
On terminal devices such as mobile phones and computers, application scenarios for variable fonts are becoming increasingly rich, and a user can configure the fonts displayed on the terminal's display interface as required, adjusting their size, weight, type and so on, so that the fonts present different display effects. The same icon, however, is usually displayed as a fixed image or shape and does not support variation. Especially on an interface where text and icons are adjacent, if the icon has only one display effect, the icon is neither attractive nor flexible and the user experience is poor.
To address the above problem, a variable icon needs to be implemented. In one implementation, an iconFont solution may be used, which invokes icons in text form and supports adjusting the icon size but not the icon line thickness. In another implementation, the SF Symbols solution may be used, which stores 27 variants per icon: three size categories, each further divided into 9 thickness categories. An icon can only switch among these 9 thickness categories, so it cannot be adjusted steplessly, and storing the data of 27 variants for every icon occupies a relatively large amount of space.
The embodiments of the application provide an icon processing method and an electronic device, which address the visually disjointed appearance caused by a mismatch between the line thickness of icons and the stroke thickness of text fonts in the display interface of the electronic device. In the method provided by the application, when the electronic device detects that the font weight of the currently displayed text is a first word weight, the lines of a first icon are displayed with a corresponding first degree of thickness. The electronic device generates the displayed first icon, whose lines have the first degree of thickness, from the first word weight, the vector diagram data of the first icon with the thinnest lines, and the vector diagram data of the first icon with the thickest lines. The displayed line thickness of the first icon is positively correlated with the font thickness of the current text.
By implementing the solution provided by the embodiments of the application, the system can, for different word weights, automatically generate icon variants with different line thicknesses from the same line outline. The icons are displayed more flexibly and attractively, the display effects of fonts and icons become more unified and harmonious, the interface display is improved, the result better matches the user's browsing habits, and the user's viewing experience is further improved.
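As a high-level sketch of this flow only (the icon registry, the callback name and the per-icon thickness range below are illustrative assumptions, not the actual system interfaces), the positive correlation can be pictured as mapping the detected word weight to a normalized factor and refreshing every variable icon with a correspondingly thicker or thinner line; the path-level interpolation sketched earlier then produces the actual vector diagram.

    # Hypothetical update flow: when the word weight changes, each variable icon is
    # redrawn with a line thickness that grows with the word weight.
    ICON_THICKNESS_RANGE = {            # icon name -> (thinnest line, thickest line)
        "signal_strength": (1.0, 4.0),
        "bluetooth": (1.0, 4.0),
    }

    def thickness_for_weight(word_weight, thin, thick, lo=100, hi=900):
        """Positive correlation: a heavier word weight yields a thicker icon line."""
        t = (word_weight - lo) / (hi - lo)
        return thin + (thick - thin) * t

    def on_word_weight_changed(new_weight):
        """Called when the operation changing the word weight is detected."""
        for name, (thin, thick) in ICON_THICKNESS_RANGE.items():
            print(name, "line thickness ->", thickness_for_weight(new_weight, thin, thick))

    on_word_weight_changed(400)   # regular text: intermediate icon lines
    on_word_weight_changed(700)   # bolder text: noticeably thicker icon lines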
The structure of the electronic device provided in the embodiment of the present application is described below as an example.
The electronic device 100 provided by the embodiments of the present application may include, but is not limited to, a mobile phone, a notebook computer, a desktop computer, a tablet computer, or another type of electronic device such as a laptop computer, a handheld computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a smart screen, a vehicle-mounted device, a game console, a smartwatch, a smart bracelet or another smart wearable device, as well as an internet of things (IoT) device and/or a smart home device and/or a smart city device such as a smart water heater, a smart lamp, a smart air conditioner or a smart speaker. The embodiments of the present application do not impose any limitation on the specific type of the electronic device 100. In the embodiments, the terminal device may also be referred to simply as a terminal; a terminal device is typically an intelligent electronic device that can provide a user interface, interact with the user, and provide service functions for the user.
The electronic device 100 may run an operating system such as HarmonyOS (HOS) or another type of operating system, which is not limited in this application.
Fig. 1 is a schematic hardware structure of an electronic device 100 according to an embodiment of the present application.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and thus improves system efficiency.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc., respectively, through different I2C bus interfaces. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 communicates with the touch sensor 180K through an I2C bus interface to implement the touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through the bluetooth headset.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through a UART interface, to implement a function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as the display 194 and the camera 193. MIPI interfaces include the camera serial interface (CSI), the display serial interface (DSI), and the like. In some embodiments, the processor 110 and the camera 193 communicate through a CSI interface to implement the photographing function of the electronic device 100. The processor 110 and the display 194 communicate via a DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. And can also be used for connecting with a headset, and playing audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices, etc.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also employ different interfacing manners in the above embodiments, or a combination of multiple interfacing manners.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on the electromagnetic wave signals, and sends the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency-modulate and amplify it, and convert it into electromagnetic waves for radiation via the antenna 2.
In some embodiments, the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include the global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, among others. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS) and/or satellite based augmentation systems (SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to fourier transform the frequency bin energy, or the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record video in a variety of encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, and so on.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The internal memory 121 may include one or more random access memories (random access memory, RAM) and one or more non-volatile memories (NVM).
The random access memory has the characteristics of high reading/writing speed and volatility. Volatile refers to the fact that, once powered down, the data stored in RAM will disappear. In general, the static power consumption of the random access memory is extremely low, and the operation power consumption is relatively large.
The nonvolatile memory has characteristics of nonvolatile and stable storage data. The non-volatile means that the stored data will not disappear after power failure, and the data can be stored after power failure for a long time.
The random access memory may include static random-access memory (SRAM), dynamic random-access memory (dynamic random access memory, DRAM), synchronous dynamic random-access memory (synchronous dynamic random access memory, SDRAM), double data rate synchronous dynamic random-access memory (double data rate synchronous dynamic random access memory, DDR SDRAM, e.g., fifth generation DDR SDRAM is commonly referred to as DDR5 SDRAM), etc.
The nonvolatile memory may include a magnetic disk storage device, a flash memory, and the like.
The magnetic disk storage device is a memory using a magnetic disk as a storage medium, and has the characteristics of large storage capacity, high data transmission rate, long-term storage of stored data and the like.
According to its operating principle, the flash memory may include NOR flash, NAND flash, 3D NAND flash, and the like; according to the potential order of its memory cells, it may include single-level cells (SLC), multi-level cells (MLC), triple-level cells (TLC), quad-level cells (QLC), and the like; according to its storage specification, it may include universal flash storage (UFS), an embedded multimedia card (eMMC), and the like.
The random access memory may be read directly from and written to by the processor 110, may be used to store executable programs (e.g., machine instructions) for an operating system or other on-the-fly programs, may also be used to store data for users and applications, and the like.
The nonvolatile memory may store executable programs, store data of users and applications, and the like, and may be loaded into the random access memory in advance for the processor 110 to directly read and write.
The external memory interface 120 may be used to connect external non-volatile memory to enable expansion of the memory capabilities of the electronic device 100. The external nonvolatile memory communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music and video are stored in an external nonvolatile memory.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A.
A receiver 170B, also referred to as a "earpiece", is used to convert the audio electrical signal into a sound signal. When electronic device 100 is answering a telephone call or voice message, voice may be received by placing receiver 170B in close proximity to the human ear.
Microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can speak close to the microphone 170C, inputting a sound signal into it. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may also be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, implement directional recording functions, and so on.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensor 180A, such as resistive, inductive and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates with conductive material; when a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation through the pressure sensor 180A, and can also calculate the touch position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch position but with different intensities may correspond to different operation instructions. For example: when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction for viewing the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
The gyro sensor 180B may be used to determine the motion gesture of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance to be compensated by the lens module according to the angle, and makes the lens counteract the shake of the electronic device 100 through reverse motion, so as to realize anti-shake. The gyro sensor 180B may also be used in navigation and somatosensory gaming scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude from barometric pressure values measured by barometric pressure sensor 180C, aiding in positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip cover using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D. Features such as automatic unlocking upon flip opening can then be set according to the detected open or closed state of the holster or the flip.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary. It can also be used to recognize the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and other applications.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, the electronic device 100 may range using the distance sensor 180F to achieve quick focus.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode. The electronic device 100 detects infrared reflected light from nearby objects using a photodiode. When sufficient reflected light is detected, it may be determined that there is an object in the vicinity of the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object in the vicinity of the electronic device 100. The electronic device 100 can detect that the user holds the electronic device 100 close to the ear by using the proximity light sensor 180G, so as to automatically extinguish the screen for the purpose of saving power. The proximity light sensor 180G may also be used in holster mode, pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect whether electronic device 100 is in a pocket to prevent false touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may use the collected fingerprint features to implement fingerprint unlocking, application lock access, fingerprint-based photographing, fingerprint-based call answering, etc.
The temperature sensor 180J is for detecting temperature. In some embodiments, the electronic device 100 performs a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by temperature sensor 180J exceeds a threshold, electronic device 100 performs a reduction in the performance of a processor located in the vicinity of temperature sensor 180J in order to reduce power consumption to implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 142 to avoid the low temperature causing the electronic device 100 to be abnormally shut down. In other embodiments, when the temperature is below a further threshold, the electronic device 100 performs boosting of the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperatures.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation acting on or near it. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a different location than the display 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of a vibrating bone block of the human voice part. The bone conduction sensor 180M may also contact the pulse of the human body to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be provided in a headset, combined into a bone conduction headset. The audio module 170 may analyze the voice signal based on the vibration signal of the vibrating bone block of the voice part obtained by the bone conduction sensor 180M, so as to implement a voice function. The application processor may analyze the heart rate information based on the blood pressure pulsation signal acquired by the bone conduction sensor 180M, so as to implement a heart rate detection function.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also correspond to different vibration feedback effects by touching different areas of the display screen 194. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light, and may be used to indicate the charging status, changes in battery level, messages, missed calls, notifications, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195, or removed from the SIM card interface 195, to enable contact with and separation from the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support Nano SIM cards, Micro SIM cards, and the like. Multiple cards may be inserted into the same SIM card interface 195 simultaneously. The types of the multiple cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards. The SIM card interface 195 may also be compatible with external memory cards. The electronic device 100 interacts with the network through the SIM card to realize functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
The software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. Embodiments of the invention take an Android system with a layered architecture as an example to illustrate the software architecture of the electronic device 100.
Fig. 2 is a software configuration block diagram of the electronic device 100 according to the embodiment of the present invention.
The layered architecture divides the software into several layers, each with distinct roles and divisions of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, an application layer, an application framework layer, the Android runtime (Android Runtime) and system libraries, and a kernel layer.
The application layer may include a series of application packages.
As shown in fig. 2, the application packages may include applications such as Camera, Gallery, Calendar, Phone, Map, Navigation, WLAN, Bluetooth, Music, Video, and Settings. The font size, thickness, and the like can be set in the Settings application.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for the application of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. Such data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100, such as the management of call status (including connected, hung up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows the application to display notification information in the status bar. It can be used to convey notification-type messages that automatically disappear after a short dwell without requiring user interaction. For example, the notification manager is used to notify of download completion, message alerts, and the like. The notification manager may also provide notifications that appear in the system top status bar in the form of a chart or scroll bar text, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, a text message is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, an indicator light blinks, and so on.
The Android runtime includes a core library and virtual machines. The Android runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that need to be called by the Java language, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
Media libraries support a variety of commonly used audio, video format playback and recording, still image files, and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, and a sensor driver.
The workflow of the electronic device 100 software and hardware is illustrated below in connection with capturing a photo scene.
When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into an original input event (including information such as the touch coordinates and the time stamp of the touch operation). The original input event is stored at the kernel layer. The application framework layer acquires the original input event from the kernel layer and identifies the control corresponding to the input event. Taking the touch operation being a touch click operation and the control corresponding to the click operation being the control of the camera application icon as an example, the camera application calls an interface of the application framework layer to start the camera application, further starts the camera driver by calling the kernel layer, and captures a still image or video through the camera 193.
The above description of the software architecture of the electronic device 100 is merely an example, and it should be understood that the software architecture illustrated in the embodiments of the present application does not constitute a specific limitation on the present application. In other embodiments of the application, the software architecture of the electronic device 100 may include more or fewer modules than shown, or may combine certain modules, or split certain modules, or may be arranged in a different architecture. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The following describes a technical implementation scheme provided by the embodiment of the application.
This embodiment is illustrated by taking the electronic device 100 equipped with the Android system as an example. It should be understood that this embodiment does not limit other embodiments of the present application; in other embodiments of the present application, the electronic device 100 may also have other operating systems mounted thereon, and the implementation may vary accordingly. However, embodiments based on the same inventive concept, differing only in implementation details, should all be considered as falling within the scope of the present application. Refer to fig. 3 for the flow of steps of this scheme.
Step one: the electronic device 100 parses the vector diagram data with the thinnest lines and the vector diagram data with the thickest lines of the first icon.
In the present embodiment, vector graphics of two kinds of thickness degree can be designed for each icon. Here, a first icon is described as an example. The electronic device 100 may store two vector graphics that the developer/designer designs for the first icon, one with thinner lines and the other with thicker lines. The electronic device 100 regards this finer line vector diagram as the vector diagram corresponding to the first icon with the finest line, hereinafter abbreviated as the finest vector diagram, and this coarser line vector diagram as the vector diagram corresponding to the first icon with the coarsest line, hereinafter abbreviated as the coarsest vector diagram.
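For illustration only, a minimal Java sketch of one possible way to hold the two designed vector diagrams of an icon in memory is as follows; the class name and fields are assumptions made for this example and do not correspond to any actual platform interface:

import java.util.List;

// Holds the path data of the two designed variants of one icon: one set for the
// thinner-line design and one for the thicker-line design.
public final class IconVariants {
    private final List<String> thinnestPathData;   // paths of the thinner-line vector diagram
    private final List<String> thickestPathData;   // paths of the thicker-line vector diagram

    public IconVariants(List<String> thinnestPathData, List<String> thickestPathData) {
        this.thinnestPathData = thinnestPathData;
        this.thickestPathData = thickestPathData;
    }

    public List<String> thinnestPathData() { return thinnestPathData; }
    public List<String> thickestPathData() { return thickestPathData; }
}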
The file format of the finest and coarsest vector diagrams may be SVG format. The SVG file production tools may include Adobe Illustrator software, etc. Reference may be made to the foregoing description for SVG, and no further description is given here.
SVG is a set of syntax specifications used by the front end to describe vector data. In the Android system, in order to improve the loading efficiency of vector diagram data, the path data in the SVG syntax needs to be extracted, the data is then reconstructed and converted into the vector format, and specific tags are generated.
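As an illustration of this extraction step, a minimal Java sketch is as follows; it pulls the d attribute of each path element out of an SVG document with a simplified regular expression rather than a full XML parser, and the class and method names are assumptions made for this example:

import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Extracts the raw path data (the d attribute of every <path> element) from an
// SVG document so that it can be reconstructed into the vector format.
public class SvgPathExtractor {
    private static final Pattern PATH_DATA = Pattern.compile("<path[^>]*\\sd=\"([^\"]+)\"");

    public static List<String> extractPathData(String svgText) {
        List<String> paths = new ArrayList<>();
        Matcher m = PATH_DATA.matcher(svgText);
        while (m.find()) {
            paths.add(m.group(1));   // raw path commands, e.g. "M36,45L59,31"
        }
        return paths;
    }

    public static void main(String[] args) {
        String svg = "<svg><path id=\"thin\" d=\"M36,45L59,31\"/></svg>";
        System.out.println(extractPathData(svg));   // prints [M36,45L59,31]
    }
}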
For vector data, the vector diagram can be changed and scaled without distortion as long as the vector data is modified effectively. Therefore, in this embodiment, the vector data can be modified based on the data of the finest vector diagram and the coarsest vector diagram to generate an icon whose line thickness lies between the finest and coarsest lines represented by the finest vector diagram and the coarsest vector diagram.
In one example, as shown in fig. 4, for the add icon, (a) is the vector diagram with the thinnest lines and (b) is the vector diagram with the thickest lines. The data corresponding to the two vector diagrams can be placed in the same SVG file; for example, the data with the id "ic_add_thin" can be named as the vector diagram with the thinnest lines, and the data with the id "ic_add_bold" as the vector diagram with the thickest lines.
Example code one is an SVG file that contains the path data of the two vector diagrams under the ids "ic_add_thin" and "ic_add_bold".
After the SVG file is imported into the Android system, the Android system can extract the path data in the SVG file, reconstruct the data and convert it into the vector format, and add a different name attribute to each path, so as to facilitate identification by the system.
In one example, for the two paths included in the vector diagram with the thinnest lines in the above example code one, the name attributes may be added respectively: HUAWEI_VARIATION_ICON_START_PATH_0 and HUAWEI_VARIATION_ICON_START_PATH_1; name attributes are likewise added to the two paths included in the vector diagram with the thickest lines: HUAWEI_VARIATION_ICON_END_PATH_0 and HUAWEI_VARIATION_ICON_END_PATH_1.
Meanwhile, additional path nodes are added in the vector, used to carry the vector icon data corresponding to the current word weight obtained through subsequent calculation. The number of additionally added paths is the same as the number of paths in the thinnest/thickest vector diagram, and their name attributes are respectively: HUAWEI_VARIATION_ICON_CURRENT_PATH_0 and HUAWEI_VARIATION_ICON_CURRENT_PATH_1.
It should be noted that the color of the path data of the finest vector image and the coarsest vector image may be set to be transparent, and the color of the path data of the vector icon corresponding to the current word weight is opaque, so that when the file is loaded, the icons corresponding to the finest and coarsest lines are not displayed, but only the vector icon obtained after calculation according to the current word weight is displayed.
Example code two is a vector file generated after the above processing, containing the paths with the name attributes listed above.
In the example code, the path data in the SVG is extracted and placed in the vector pathData. The same path in different vector data represents the same line in the icon, and the thickness of the line differs because the path data differs. For example, "HUAWEI_VARIATION_ICON_START_PATH_0", "HUAWEI_VARIATION_ICON_END_PATH_0", and "HUAWEI_VARIATION_ICON_CURRENT_PATH_0" are the path data corresponding to different thicknesses of the same line: the curve types of the paths are the same, while the values of the data points on the paths are different.
The electronic device 100 may parse the vector diagram data with the thinnest lines and the vector diagram data with the thickest lines of the first icon from the vector pathData, so as to facilitate subsequent calculation of the vector icon data under the current word weight.
Step two: the electronic device 100 detects that the word weight of the current text is the first word weight.
In some embodiments, the system or application may automatically set the font weight of the variable font in the current display interface, or the user may manually set the font weight parameter value in the font settings in the system or application.
The word weight characterizes the thickness of the font and can be represented numerically. In some embodiments, the parameter value of the word weight may be any value greater than or equal to 0 and less than or equal to 1. For example, the system developer may set the word weight parameter value of the thinnest font to 0, the word weight parameter value of the coarsest font to 1, and the minimum variation to 0.1; the user may then adjust the word weight parameter value independently in the system settings, with an adjustment range of 0 to 1, for example adjusting the word weight parameter value to 0.5. A larger value indicates a thicker font, closer to the coarsest font.
The embodiment of the application does not limit the representation form and the value range of the word weight parameter value. For example, in another example, the system developer may set the value range of the word weight parameter to 0-100, with the word weight parameter value of the thinnest font being 0, that of the coarsest font being 100, and the minimum variation being 1, so that more precise word weight adjustment can be achieved.
Regardless of how the word weight parameter value is represented, for the convenience of subsequent calculation, the parameter value of each word weight can be converted in linear proportion into a parameter value greater than or equal to 0 and less than or equal to 1, with the first word weight corresponding to a first parameter. For example, in one example, the word weight parameter value range is set to 0-100 and the user sets the word weight parameter value to 65; dividing by the maximum parameter value of 100 gives 0.65, so the first parameter of the first word weight may be 0.65. In another example, the system developer may also set the value range of the word weight parameter to -50 to 50; similarly, it can be scaled to a value between 0 and 1 to represent the first parameter of the word weight, which is not described again here.
Thus, after the parameter value representing the first font weight is set by the user, the system or the application, the electronic device 100 may obtain the first parameter representing the font thickness, where the value of the first parameter ranges from 0 to 1, and the font thickness represented by the first parameter is between the finest font and the coarsest font designed by the system developer. In some embodiments, the larger the weight of the font, the larger the first parameter value, the thicker the font, the closer to the coarsest font, and the smaller the weight of the font, the smaller the first parameter value, the finer the font, the closer to the finest font. If the first parameter is equal to 0, the font of the display text is the finest, and if the first parameter is equal to 1, the font of the display text is the coarsest.
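As an illustration of this conversion, a minimal Java sketch is as follows; the class name, method name, and range bounds are assumptions made for this example:

// Converts a user-facing word weight value into the first parameter in [0, 1]
// by linear scaling, whatever range the developer chose for the weight.
public final class WordWeight {
    private WordWeight() {}

    public static float toParameter(float weight, float min, float max) {
        float clamped = Math.max(min, Math.min(max, weight));
        return (clamped - min) / (max - min);
    }

    public static void main(String[] args) {
        System.out.println(toParameter(65f, 0f, 100f));   // 0.65 on a 0-100 scale
        System.out.println(toParameter(0f, -50f, 50f));   // 0.5 on a -50..50 scale
    }
}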
Step three: the electronic device 100 calculates the first vector diagram data of the current first icon according to the first word weight, the vector diagram data with the thinnest line and the vector diagram data with the thickest line through an interpolation calculation method.
Considering the visual experience of the user and harmony and beautiful appearance of the display interface, the thickness degree of the fonts and the thickness degree of the icon lines can be set to be in one-to-one correspondence.
In some embodiments, the thickness degree presented by the lines of the first icon in the vector diagram is positively correlated with the thickness degree of the font characterized by the word weight. For example, when the word weight of the text is the minimum word weight, the first icon is displayed as the vector diagram with the thinnest lines; when the word weight of the text is the maximum word weight, the first icon is displayed as the vector diagram with the thickest lines. When the word weight of the text is adjusted to be greater than the minimum word weight and less than the maximum word weight, the thickness of the lines in the displayed first icon is thicker than the lines in the thinnest vector diagram and thinner than the lines in the thickest vector diagram.
In some embodiments, the electronic device 100 may adaptively adjust the thickness of the line of the displayed first icon based on the first parameter of the first word weight. For example, the electronic device 100 may calculate, based on the first parameter, the line finest vector image data and the line coarsest vector image data of the first icon, by using a linear interpolation method, to obtain current vector image data of the first icon, that is, first vector image data corresponding to the first word weight.
Linear interpolation is a simple interpolation method widely used in fields such as mathematics and computer graphics; it usually determines an unknown value on the straight line connecting two known quantities.
If the coordinates A(x1, y1) and B(x2, y2) are known, the value of a certain position C(x, y) on the straight line connecting point A and point B can be obtained according to the interpolation coefficient w. The interpolation coefficient is a value representing a proportion.
In this embodiment, the interpolation coefficient may be a first parameter converted by the first word weight, where the first parameter is greater than or equal to 0 and less than or equal to 1.
It can be seen that, in the path data parsed from the thinnest vector diagram and the thickest vector diagram, the curve types for the same line are the same, while the data points used to draw the line paths are different. Therefore, by performing linear interpolation between each pair of corresponding data points of the same line drawn in the thinnest vector diagram and the thickest vector diagram, the first vector diagram data of the first icon under the first word weight corresponding to the first parameter can be obtained. Interpolation may be performed between each pair of corresponding data points, e.g., between the start points, between the data points of the Bezier curves, between the end points, etc.
The calculation formula is exemplified as follows:
x = x1 + (x2 − x1) × weight, y = y1 + (y2 − y1) × weight, where weight represents the current word weight, 0 ≤ weight ≤ 1. (x, y) is the coordinate point corresponding to the current first word weight, (x1, y1) is the data point corresponding to the thinnest icon, and (x2, y2) is the data point corresponding to the thickest icon.
In one example, weight is 0.5. Based on the data in the example code above, the pathData of the "HUAWEI_VARIATION_ICON_CURRENT_PATH_0" path can be calculated as "M39,44C38,45 38,45 36.5,45.5 36,45L36.5,45", and the pathData of the "HUAWEI_VARIATION_ICON_CURRENT_PATH_1" path as "M59,31L79.5,48.5C80.5,49.5 81,51.5 81.5,52".
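As an illustration of the point-wise interpolation described above, a minimal Java sketch is as follows; data points are represented as {x, y} pairs, the class and method names are assumptions made for this example, and the sample values in main are illustrative rather than taken from the example code:

// Computes the data points of the current path from the corresponding data
// points of the thinnest and thickest paths, using the formula
// x = x1 + (x2 - x1) * weight, y = y1 + (y2 - y1) * weight.
public final class IconInterpolator {
    private IconInterpolator() {}

    public static float[][] interpolate(float[][] thin, float[][] thick, float weight) {
        float[][] current = new float[thin.length][2];
        for (int i = 0; i < thin.length; i++) {
            current[i][0] = thin[i][0] + (thick[i][0] - thin[i][0]) * weight;
            current[i][1] = thin[i][1] + (thick[i][1] - thin[i][1]) * weight;
        }
        return current;
    }

    public static void main(String[] args) {
        float[][] thin  = {{36f, 45f}};   // illustrative values only
        float[][] thick = {{42f, 45f}};
        float[][] mid = interpolate(thin, thick, 0.5f);
        System.out.println(mid[0][0] + "," + mid[0][1]);   // 39.0,45.0
    }
}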
The above linear interpolation calculation is merely illustrative; the calculation is not limited to the linear interpolation method, and other methods may also be used to calculate the first vector diagram data, which is not limited in any way in the embodiment of the present application.
Step four: the electronic device 100 reads the first vector diagram data and displays a first icon.
After obtaining the first vector diagram data of the first icon, the electronic device 100 may generate an icon resource file that can be read for display; for example, in the Android system, a vector-type file may be generated. The first vector diagram data may include not only line paths, but also rendering parameters such as line width, filling pattern, and coloring pattern, which may be consistent across the different images displayed for the first icon, with only the line thickness of the icon being changed.
After reading the first vector diagram data of the first icon, the electronic device 100 may generate a drawing or rendering command, draw or render the first icon through the CPU and/or GPU, and send it to the display screen to display the first icon, where the lines displayed by the first icon present the first thickness degree. Moreover, the line thickness displayed by the first icon is consistent with the thickness of the text, giving a more attractive visual experience.
The thickness degree of the lines of the first icon in the first vector diagram is larger than or equal to the thickness degree of the lines of the first icon in the line thinnest vector diagram, and smaller than or equal to the thickness degree of the lines of the first icon in the line thickest vector diagram.
In one example, the word weight is set to 0.5, and the displayed first icon is the target icon whose line thickness is intermediate between the thinnest line and the thickest line, as shown in fig. 4 (c).
It should be noted that, the examples of fig. 3 and fig. 4 described herein are only used to assist in describing the technical solutions provided by the embodiments of the present application, and are not limited to the embodiments of the present application.
In connection with the above description, the following exemplary illustrates some of the user interfaces that may be displayed on the electronic device 100 in order to more fully illustrate the application.
Fig. 5 shows an example user interface of the electronic device 100 with respect to setting fonts.
As shown in fig. 5 (a), the setting interface 500 may include one or more setting items, such as an airplane mode setting item, a Wi-Fi setting item, a Bluetooth setting item, a personal hotspot setting item, a mobile network setting item, a font setting item 501, a display and brightness setting item, a Huawei account setting item, and the like.
Entering the font setting item 501 may display a detailed font settings interface 502 as shown in fig. 5 (b). The font settings interface 502 may include a font demonstration area 503, a font size adjustment control 504, a font thickness adjustment control 505, and the like. The font demonstration area 503 is used to demonstrate the effect of font adjustment. The user may adjust the font size and thickness via the font size adjustment control 504 and the font thickness adjustment control 505. In the example interface of fig. 5 (b), the adjustment button in the font size adjustment control 504 can be slid left and right; sliding to the left makes the font smaller, and sliding to the right makes the font larger. Likewise, the adjustment button in the font thickness adjustment control 505 can be slid left and right; sliding to the left makes the font thinner, and sliding to the right makes the font thicker. In addition to the font size and thickness, font properties such as font spacing, width, and height may also be adjusted; these are not shown in the interface of this embodiment, but this does not constitute a limitation of this embodiment.
According to the scheme in the embodiment of the application, after the user adjusts the character weight, namely the font thickness degree, the line thickness degree of the icon in the user interface is changed. The line thickness of the icon is positively correlated with the font thickness. Some exemplary user interfaces are shown below to illustrate the effects that embodiments of the present application may achieve.
As shown in fig. 6, the font is thickened to some extent by pulling the adjustment button in the font thickness adjustment control 505 to the right. In some embodiments, the font thickness degree may also be represented by a parameter value; the larger the value, the thicker the font. In fig. 6, assuming that the font thickness adjustment range is 0 to 100, the thickness degree of 50 shown in fig. 6 (a) can be adjusted to the thickness degree of 80 shown in fig. 6 (b), and the font in the system becomes thicker. The font adjustment may be applied to a system application, or to a third-party application supporting variable fonts, which is not limited in this embodiment.
After the font is thickened, as can be seen from comparing (a) and (b) in fig. 6, the text fonts in the font setting interface 502 are thickened, and the thickness degree is consistent, including the text displayed in the font demonstration area 503, the font size adjusting control 504 and the font thickness adjusting control 505.
After the electronic device 100 detects the change in font thickness, the electronic device 100 can adaptively adjust the line thickness of the icons at the same time. Based on the word weight adjustment shown in fig. 6, the change in icon line thickness is described below with the lock screen interface, the setting interface, and the pull-down toolbar interface as examples. Assuming that the font shown in fig. 6 (a) is at a first word weight and the font shown in fig. 6 (b) is at a second word weight, in the case shown in fig. 6 (a) and (b) the second word weight is greater than the first word weight and the font becomes thicker, so the lines of the icons correspondingly change from a first thickness degree to a second thickness degree. When the second word weight is greater than the first word weight, the second thickness degree is greater than the first thickness degree.
Fig. 7 (a) and (b) show a lock screen interface, where the lock screen interface 701 shown in fig. 7 (a) corresponds to the user interface under the first character weight in fig. 6 (a), and the lock screen interface 703 shown in fig. 7 (b) corresponds to the user interface under the second character weight in fig. 6 (b).
Specifically, the lock screen interface 701 includes time, date, etc. indication information, a sliding unlock control, and a top status bar 702, where the top status bar 702 includes indication information of a mobile operator, a mobile signal strength indicator, a wireless network signal strength indicator, a battery status indicator, a time indicator, etc.
It can be seen that the font of the text in the lock screen interface 703 is thicker than the font of the text in the lock screen interface 701, and the line thickness of each icon in the lock screen interface 703 is thicker than the line thickness of each corresponding icon in the lock screen interface 701. The text of indication information such as the time, the date, and the sliding unlock control included in the lock screen interface 703, as well as the text of the mobile operator indication information and the time indicator included in the top status bar 704, are thicker in font thickness than the text correspondingly displayed in the lock screen interface 701 and the top status bar 702. Meanwhile, icons such as the sliding unlock control included in the lock screen interface 703, and icons such as the mobile signal strength indicator, the wireless network signal strength indicator, and the battery status indicator included in the top status bar 704, have a greater line thickness than the icons correspondingly displayed in the lock screen interface 701 and the top status bar 702.
Fig. 8 (a) and (b) show system setting interfaces, and the setting interface 801 shown in fig. 8 (a) corresponds to the user interface under the first word weight in fig. 6 (a), and the setting interface 805 shown in fig. 8 (b) corresponds to the user interface under the second word weight in fig. 6 (b).
Specifically, the setting interface 801 includes one or more setting items, such as an airplane mode setting item, a Wi-Fi setting item, a Bluetooth setting item, a personal hotspot setting item, a mobile network setting item, a font setting item, a display and brightness setting item, a Huawei account setting item, and the like. Each setting item includes an indication icon, indication text, and a setting control. The left column in the setting interface 801 contains the indication icons 803 of the setting items, the right column contains the corresponding setting control icons 804, and the middle contains the setting item indication text. Fig. 8 (a) also includes a top status bar 802, where the top status bar 802 includes indication information of the mobile operator, a mobile signal strength indicator, a wireless network signal strength indicator, a battery status indicator, a time indicator, etc.
It can be seen that the font of the text in the setting interface 805 is thicker than the font of the text in the setting interface 801, and the line thickness of each icon in the setting interface 805 is thicker than the line thickness of each corresponding icon in the setting interface 801. In fig. 8 (a) and (b), the text of the indication of each setting item included in the setting interface 805, the text of the indication information of the mobile operator, the text of the time indicator, and the like included in the top status bar 806 are thicker in font thickness than the text correspondingly displayed in the setting interface 801 and the top status bar 802. Meanwhile, the indication icons 807 of the respective setting items included in the setting interface 805, the corresponding setting control icons 808, and the icons of the mobile signal strength indicator, the wireless network signal strength indicator, the battery status indicator, and the like included in the top status bar 806 are thicker in line thickness than the indication icons 803 of the respective setting items in the setting interface 801, the corresponding setting control icons 804, and the icons correspondingly displayed in the top status bar 802.
Fig. 9 (a), (b) show drop-down toolbar interfaces, the interface shown in fig. 9 (a) corresponding to the user interface under the first word weight in fig. 6 (a), and the interface shown in fig. 9 (b) corresponding to the user interface under the second word weight in fig. 6 (b).
Specifically, the drop-down toolbar interface 901 includes one or more shortcut setting items, such as WLAN, auto-rotation, flashlight, Bluetooth, airplane mode, mobile data, location, screen capture, hotspot, screen recording, large screen projection, NFC, etc., and the user can quickly turn the corresponding function on or off by clicking the shortcut setting item icon in the drop-down toolbar. Each shortcut setting item comprises an indication icon and indication text. The drop-down toolbar interface 901 also includes a time indicator, a date indicator, a brightness adjustment control, and the like. The user interface in fig. 9 (a) further includes a top status bar 902, where the top status bar 902 includes indication information of the mobile operator, a mobile signal strength indicator, a wireless network signal strength indicator, a battery status indicator, a time indicator, and the like. The user interface in fig. 9 (a) further includes a Tab column 903 at the bottom for indication icons of commonly used applications; in the example of fig. 9 (a), the Tab column includes the icons and names of the Phone, Contacts, Browser, and Search applications.
It can be seen that the font of the text in the user interface shown in fig. 9 (b) is thicker than the font of the text in the user interface shown in fig. 9 (a), and the line thickness of each icon in fig. 9 (b) is thicker than the line thickness of each corresponding icon in fig. 9 (a). In the interfaces shown in fig. 9 (a) and (b), the shortcut setting items, the time, and the indication text of the date included in the drop-down toolbar interface 904, the indication information text of the mobile operator, the time indicator text, and the like included in the top status bar 905, and the APP icon names included in the bottom Tab 906 have the text with a font thickness thicker than the text correspondingly displayed in the drop-down toolbar interface 901, the top status bar 902, and the bottom Tab 903. Meanwhile, the icons of the shortcut setting items included in the drop-down toolbar interface 904, the APP indication icons in the bottom Tab 906, and the icons of the mobile signal strength indicator, the wireless network signal strength indicator, the battery status indicator, and the like included in the top status bar 905 are thicker in line thickness than the corresponding shortcut setting items in the toolbar interface 901, the APP indication icons in the bottom Tab 903, and the corresponding displayed icons in the top status bar 902.
The above description of user interfaces is illustrative only and is not intended to limit other embodiments of the present application. In other embodiments, the user interface may include more or fewer elements. It will be appreciated that the method provided by the application may be applied to other interfaces not shown based on the same inventive concept. The type of the variable icon is not limited to the system icon, and may be a third party APP icon.
The following describes an icon processing method provided by the embodiment of the application. The method may be applied to the electronic device 100. The examples provided by this embodiment do not set any limit to other embodiments of the application.
Fig. 10 is a flowchart of an icon processing method according to an embodiment of the present application, which specifically includes the following steps:
S101, the electronic device 100 displays a text, and the word weight of the text is the first word weight.
In some embodiments, the system or application may automatically set the font weight of the variable font in the current display interface, or the user may manually set the font weight parameter value in the font settings in the system or application.
The word weight characterizes the thickness of the font and can be represented numerically. In some embodiments, the parameter value of the word weight may be any value greater than or equal to 0 and less than or equal to 1. For example, the system developer may set the word weight parameter value of the thinnest font to 0, the word weight parameter value of the coarsest font to 1, and the minimum variation to 0.1; the user may then adjust the word weight parameter value independently in the system settings, with an adjustment range of 0 to 1, for example adjusting the word weight parameter value to 0.5. A larger value indicates a thicker font, closer to the coarsest font.
The embodiment of the application does not limit the representation form and the value range of the word weight parameter value. For example, in another example, the system developer may set the value range of the word weight parameter to 0-100, with the word weight parameter value of the thinnest font being 0, that of the coarsest font being 100, and the minimum variation being 1, so that more precise word weight adjustment can be achieved.
Regardless of how the word weight parameter value is represented, for the convenience of subsequent calculation, the parameter value of each word weight can be converted in linear proportion into a parameter value greater than or equal to 0 and less than or equal to 1, with the first word weight corresponding to a first parameter. For example, in one example, the word weight parameter value range is set to 0-100 and the user sets the word weight parameter value to 65; dividing by the maximum parameter value of 100 gives 0.65, so the first parameter of the first word weight may be 0.65. In another example, the system developer may also set the value range of the word weight parameter to -50 to 50; similarly, it can be scaled to a value between 0 and 1 to represent the first parameter of the word weight, which is not described again here.
Thus, after the parameter value representing the first font weight is set by the user, the system or the application, the electronic device 100 may obtain the first parameter representing the font thickness, where the value of the first parameter ranges from 0 to 1, and the font thickness represented by the first parameter is between the finest font and the coarsest font designed by the system developer. In some embodiments, the larger the weight of the font, the larger the first parameter value, the thicker the font, the closer to the coarsest font, and the smaller the weight of the font, the smaller the first parameter value, the finer the font, the closer to the finest font. If the first parameter is equal to 0, the font of the display text is the finest, and if the first parameter is equal to 1, the font of the display text is the coarsest.
S102, the electronic device 100 generates first vector diagram data of the first icon according to the first parameter of the first font weight, the third vector diagram data of the first icon and the fourth vector diagram data of the first icon.
The first vector diagram, the third vector diagram, and the fourth vector diagram are different vector diagram images of the first icon. A vector diagram is an XML-based image that does not provide specific pixels but only drawing instructions; its advantages are very small memory occupation, high performance, and arbitrary scaling without distortion, and its disadvantage is that it cannot express colors as rich as a bitmap. For the vector diagram and the bitmap, reference may be made to the foregoing description, which is not repeated here.
In the embodiment of the application, for the first icon indicating the same function or meaning, a plurality of images, namely different graphic styles, can exist, and the thickness degree of the lines of the first icon in each image is different. In this embodiment, for the same word weight, the first icon displays a graphic style corresponding to a thickness level.
In some embodiments, the third vector image is the image with the narrowest line in the first icon and the fourth vector image is the image with the thickest line in the first icon. The third and fourth vector diagrams may be provided by a developer/designer, the two vector diagrams being similar in pattern, having the same lines, indicating the same contours, but different in thickness, the lines described being the thinnest in the third vector diagram and the thickest in the fourth vector diagram.
Considering the visual experience of the user and harmony and beautiful appearance of the display interface, the thickness degree of the fonts and the thickness degree of the icon lines can be set to be in one-to-one correspondence.
In some embodiments, the thickness degree presented by the lines of the first icon in the vector diagram is positively correlated with the thickness degree of the font characterized by the word weight. For example, when the word weight of the text is the third word weight, the first icon is displayed as the third vector diagram, i.e., the image with the thinnest lines; when the word weight of the text is the fourth word weight, the first icon is displayed as the fourth vector diagram, i.e., the image with the thickest lines. The third word weight is the minimum word weight, i.e., the thinnest word weight, and the fourth word weight is the maximum word weight, i.e., the thickest word weight. When the adjusted word weight of the text is greater than the third word weight and less than the fourth word weight, the line thickness in the displayed first icon is greater than the line thickness in the third vector diagram and less than the line thickness in the fourth vector diagram.
In some embodiments, the electronic device 100 may adaptively adjust the thickness of the line of the displayed first icon based on the first parameter of the first word weight. For example, the electronic device 100 may calculate the first vector image data of the first icon through a linear interpolation method based on the first parameter, and the third vector image data of the first icon and the fourth vector image data of the first icon.
In particular, each vector diagram may be composed of various elements, such as straight lines, curves, circles, polygons, etc., consisting essentially of various types of line paths. Each vector graphics data may include one or more path paths indicating lines of the icon, each path being described by a line type and a plurality of data points.
For example, the first vector diagram includes a first path, the third vector diagram includes a third path, the fourth vector diagram includes a fourth path, the first path, the third path and the fourth path correspond to the same line in the first icon, the line types are the same, and the first path is calculated by the electronic device according to the first word weight. For example, each data point for describing the first path may be obtained by performing linear interpolation calculation on each corresponding data point on the third path and the fourth path based on the first parameter. For an example of a specific implementation, reference may be made to the foregoing description, which is not repeated here.
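As a further illustration, a minimal Java sketch that applies the same interpolation directly to two path data strings sharing the same command structure is as follows; it complements the point-wise sketch given earlier, the class and method names are assumptions made for this example, and the sample path data in main is illustrative only:

import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Interpolates every numeric value of the thinnest path data against the
// corresponding value of the thickest path data, keeping the command letters
// and separators of the thinnest path unchanged. Assumes both paths have the
// same command structure and contain only plain decimal numbers.
public final class PathDataInterpolator {
    private static final Pattern NUMBER = Pattern.compile("-?\\d+(?:\\.\\d+)?");

    public static String interpolate(String thinPath, String thickPath, float weight) {
        Matcher thin = NUMBER.matcher(thinPath);
        Matcher thick = NUMBER.matcher(thickPath);
        StringBuffer out = new StringBuffer();
        while (thin.find() && thick.find()) {
            float a = Float.parseFloat(thin.group());
            float b = Float.parseFloat(thick.group());
            float v = a + (b - a) * weight;   // linear interpolation per data value
            thin.appendReplacement(out, Matcher.quoteReplacement(formatNumber(v)));
        }
        thin.appendTail(out);
        return out.toString();
    }

    private static String formatNumber(float v) {
        return v == Math.floor(v) ? String.valueOf((int) v) : String.valueOf(v);
    }

    public static void main(String[] args) {
        String thin  = "M36,45L59,31";   // illustrative path data only
        String thick = "M42,45L59,37";
        System.out.println(interpolate(thin, thick, 0.5f));   // M39,45L59,34
    }
}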
S103, the electronic device 100 reads the first vector diagram data, displays a first icon, and the line of the first icon presents a first thickness degree.
The thickness degree of the lines of the first icons in the first vector diagram is larger than or equal to that of the lines of the first icons in the third vector diagram, and smaller than or equal to that of the lines of the first icons in the fourth vector diagram.
The electronic device 100 may generate a drawing or rendering command after reading the first vector diagram data of the first icon, draw or render the first icon through a CPU and/or GPU, and send the drawing or rendering command to a display screen to display the first icon, where a line displayed by the first icon presents a first thickness degree.
S104, the electronic device 100 detects a first operation of changing the word weight, which causes the text to be changed from the first word weight to the second word weight.
In some embodiments, the user may manually adjust the text word weight within the system settings, the user may also adjust the text word weight in the shortcut settings column, and the system may also automatically adjust the text word weight. The present application is not limited in any way with respect to the specific manner of the first operation.
S105, the electronic device 100 displays the text, and the word weight of the text is the second word weight.
The second word weight corresponds to a second parameter. The second word weight may be described with reference to the first word weight, and will not be described here.
S106, the electronic device 100 generates second vector diagram data of the first icon according to the second parameter of the second font weight, the third vector diagram data of the first icon and the fourth vector diagram data of the first icon.
In some embodiments, the electronic device 100 may adaptively adjust the thickness of the line of the displayed first icon based on the second parameter of the second word weight. For example, the electronic device 100 may calculate the second vector image data of the first icon by a linear interpolation method based on the second parameter, and the third vector image data of the first icon and the fourth vector image data of the first icon.
For example, the second vector diagram includes a second path, the third vector diagram includes a third path, the fourth vector diagram includes a fourth path, the second path, the third path and the fourth path correspond to the same line in the first icon, the line types are the same, and the second path is calculated by the electronic device according to the second word weight. For example, each data point for describing the second path may be obtained by performing linear interpolation calculation on each corresponding data point on the third path and the fourth path based on the second parameter. For an example of a specific implementation, reference may be made to the foregoing description, which is not repeated here.
S107, the electronic device 100 reads the second vector diagram data, displays the first icon, and the line of the first icon presents the second thickness degree.
The thickness degree of the lines of the first icon in the second vector diagram is larger than or equal to that of the lines of the first icon in the third vector diagram, and smaller than or equal to that of the lines of the first icon in the fourth vector diagram.
The electronic device 100 may generate a drawing command after reading the second vector image data of the first icon, draw or render the first icon through a CPU and/or GPU, and send the drawing command to a display screen to display the first icon, where a line displayed by the first icon presents a second thickness degree.
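As an illustration of steps S104 to S107 taken together, a minimal Java sketch of the update flow on a word weight change is as follows; it reuses the PathDataInterpolator sketch given earlier, and all types and method names are assumptions made for this example rather than platform interfaces:

import java.util.ArrayList;
import java.util.List;

// On a word weight change, normalizes the new weight to [0, 1], recomputes the
// current path data of every variable icon from its thinnest/thickest path
// data, and hands the result back so the icon can be redrawn.
public class VariableIconUpdater {

    public interface VariableIcon {
        List<String> thinnestPaths();
        List<String> thickestPaths();
        void setCurrentPaths(List<String> paths);   // triggers redraw of the icon
    }

    public void onWordWeightChanged(float rawWeight, float min, float max,
                                    List<VariableIcon> icons) {
        float parameter = (rawWeight - min) / (max - min);   // second parameter in [0, 1]
        for (VariableIcon icon : icons) {
            List<String> thin = icon.thinnestPaths();
            List<String> thick = icon.thickestPaths();
            List<String> current = new ArrayList<>();
            for (int i = 0; i < thin.size(); i++) {
                current.add(PathDataInterpolator.interpolate(thin.get(i), thick.get(i), parameter));
            }
            icon.setCurrentPaths(current);   // the icon is then displayed at the new thickness
        }
    }
}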
In some embodiments, the thickness degree presented by the lines of the first icon in the vector diagram is positively correlated with the thickness degree of the font characterized by the word weight. Then, if the second word weight is less than the first word weight, the second thickness degree is less than the first thickness degree; if the second word weight is greater than the first word weight, the second thickness degree is greater than the first thickness degree.
It is to be understood that the processing of the icon may not only adjust the thickness of the line in the icon based on the word weight, but also adjust the width, size, height, etc. of the icon based on the same inventive concept according to the width, size, height, etc. of the font. The above embodiments are described by way of example only and are not intended to limit other embodiments of the application.
By implementing the method provided by the embodiment, the icons displayed by the electronic equipment have richer changes, the display effect of the icons is more flexible and attractive, the display effects of fonts and the icons are unified, the interface display effect is improved, the better visual display effect is achieved, the browsing habit of a user is more met, and the impression experience of the user is improved.
The implementations described in the above embodiments are merely illustrative and do not limit the other embodiments of the present application in any way. The specific internal implementation may be different according to different types of electronic devices, different operating systems carried on the electronic devices, different programs used and different interfaces called, and the embodiment of the present application is not limited in any way, and may implement the feature functions described in the embodiment of the present application.
As used in the above embodiments, the term "when …" may be interpreted to mean "if …" or "after …" or "in response to determination …" or "in response to detection …" depending on the context. Similarly, the phrase "at the time of determination …" or "if detected (a stated condition or event)" may be interpreted to mean "if determined …" or "in response to determination …" or "at the time of detection (a stated condition or event)" or "in response to detection (a stated condition or event)" depending on the context.
In the above embodiments, it may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When loaded and executed on a computer, produces a flow or function in accordance with embodiments of the present application, in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium, for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by a wired (e.g., coaxial cable, fiber optic, digital subscriber line), or wireless (e.g., infrared, wireless, microwave, etc.). The computer readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server, data center, etc. that contains an integration of one or more available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk), etc.
Those of ordinary skill in the art will appreciate that all or part of the procedures in the above method embodiments may be implemented by a computer program instructing related hardware. The program may be stored in a computer-readable storage medium, and when executed, may include the procedures of the above method embodiments. The aforementioned storage medium includes: a ROM, a random access memory (RAM), a magnetic disk, an optical disc, or the like.
The above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents, and such modifications and substitutions do not depart from the spirit of the present application.

Claims (12)

1. An icon processing method, characterized in that the method comprises:
the electronic equipment displays a text, wherein the word weight of the text is a first word weight, and the word weight characterizes the degree of thickness of a font;
the electronic equipment displays a first vector diagram of a first icon, wherein a line of the first icon presents a first thickness degree in the first vector diagram, and the thickness degree presented by the line of the first icon in the vector diagram is positively correlated with the degree of thickness of the font characterized by the word weight;
the electronic equipment detects a first operation of changing the word weight;
the electronic equipment displays a text, and the word weight of the text is a second word weight;
the electronic equipment displays a second vector diagram of the first icon, and the line of the first icon presents a second thickness degree in the second vector diagram;
if the second word weight is less than the first word weight, the second thickness degree is thinner than the first thickness degree;
if the second word weight is greater than the first word weight, the second thickness degree is thicker than the first thickness degree.
2. The method of claim 1, wherein,
When the word weight of the text is the first word weight, displaying the first icon as the first vector diagram;
And when the word weight of the text is the second word weight, displaying the first icon as the second vector diagram.
3. The method of claim 1, wherein,
The first vector diagram is obtained by the electronic device according to the first word weight, a third vector diagram of the first icon and a fourth vector diagram of the first icon, wherein the thickness degree of the lines of the first icon in the first vector diagram is greater than or equal to the thickness degree of the lines of the first icon in the third vector diagram, and is less than or equal to the thickness degree of the lines of the first icon in the fourth vector diagram;
The second vector diagram is obtained by the electronic device according to the second word weight, the third vector diagram of the first icon and the fourth vector diagram of the first icon, wherein the thickness degree of the lines of the first icon in the second vector diagram is greater than or equal to the thickness degree of the lines of the first icon in the third vector diagram, and is smaller than or equal to the thickness degree of the lines of the first icon in the fourth vector diagram.
4. The method of claim 2, wherein,
The first vector diagram is obtained by the electronic device according to the first word weight, a third vector diagram of the first icon and a fourth vector diagram of the first icon, wherein the thickness degree of the lines of the first icon in the first vector diagram is greater than or equal to the thickness degree of the lines of the first icon in the third vector diagram, and is less than or equal to the thickness degree of the lines of the first icon in the fourth vector diagram;
The second vector diagram is obtained by the electronic device according to the second word weight, the third vector diagram of the first icon and the fourth vector diagram of the first icon, wherein the thickness degree of the lines of the first icon in the second vector diagram is greater than or equal to the thickness degree of the lines of the first icon in the third vector diagram, and is smaller than or equal to the thickness degree of the lines of the first icon in the fourth vector diagram.
5. The method of claim 3, wherein,
The word weight of the text comprises a third word weight and a fourth word weight, and the degree of thickness of the font characterized by the first word weight or the second word weight is greater than or equal to the degree of thickness characterized by the third word weight and less than or equal to the degree of thickness characterized by the fourth word weight.
6. The method of claim 5, wherein,
When the word weight of the text is the third word weight, displaying the first icon as the third vector diagram;
and when the word weight of the text is the fourth word weight, displaying the first icon as the fourth vector diagram.
7. The method according to any one of claims 3 to 6, wherein,
The first vector diagram comprises a first path, the second vector diagram comprises a second path, the third vector diagram comprises a third path, the fourth vector diagram comprises a fourth path, the first path, the second path, the third path and the fourth path correspond to the same line in the first icon, the first path is calculated by the electronic equipment according to the first word weight, the third path and the fourth path, and the second path is calculated by the electronic equipment according to the second word weight, the third path and the fourth path.
8. The method according to any one of claims 1 to 6, wherein,
The electronic equipment displays a text, the word weight of the text is a first word weight, the electronic equipment displays a first vector diagram of a first icon, and a line of the first icon presents a first thickness degree in the first vector diagram, and the method specifically comprises the following steps:
The electronic equipment displays a user interface, wherein the user interface comprises a status bar, a first text for indicating a mobile operator and a first vector diagram of a first icon for indicating wireless communication signal strength are displayed in the status bar, the word weight of the first text is a first word weight, and a line of the first icon presents a first thickness degree in the first vector diagram;
The electronic device displays a text, the word weight of the text is a second word weight, the electronic device displays a second vector diagram of the first icon, and the line of the first icon presents a second thickness degree in the second vector diagram, and the method specifically comprises the following steps:
The electronic equipment displays a user interface, wherein the user interface comprises a status bar, a second text used for indicating the mobile operator and a second vector diagram of the first icon used for indicating the wireless communication signal strength are displayed in the status bar, the word weight of the second text is a second word weight, and the line of the first icon presents a second thickness degree in the second vector diagram.
9. The method of claim 7, wherein,
The electronic equipment displays a text, the word weight of the text is a first word weight, the electronic equipment displays a first vector diagram of a first icon, and a line of the first icon presents a first thickness degree in the first vector diagram, and the method specifically comprises the following steps:
The electronic equipment displays a user interface, wherein the user interface comprises a status bar, a first text for indicating a mobile operator and a first vector diagram of a first icon for indicating wireless communication signal strength are displayed in the status bar, the word weight of the first text is a first word weight, and a line of the first icon presents a first thickness degree in the first vector diagram;
The electronic device displays a text, the word weight of the text is a second word weight, the electronic device displays a second vector diagram of the first icon, and the line of the first icon presents a second thickness degree in the second vector diagram, and the method specifically comprises the following steps:
The electronic equipment displays a user interface, wherein the user interface comprises a status bar, a second text used for indicating the mobile operator and a second vector diagram of the first icon used for indicating the wireless communication signal strength are displayed in the status bar, the word weight of the second text is a second word weight, and the line of the first icon presents a second thickness degree in the second vector diagram.
10. An electronic device, comprising: a display screen, a memory, a processor coupled to the memory, a plurality of application programs, and one or more programs, wherein the memory stores computer-executable instructions that, when executed by the processor, cause the electronic device to implement the method of any one of claims 1 to 9.
11. A computer readable storage medium comprising instructions which, when run on an electronic device, cause the electronic device to perform the method of any one of claims 1 to 9.
12. A computer program product comprising instructions which, when run on an electronic device, cause the electronic device to perform the method of any one of claims 1 to 9.
CN202110346370.0A 2021-03-31 2021-03-31 Icon processing method and electronic equipment Active CN115145436B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110346370.0A CN115145436B (en) 2021-03-31 2021-03-31 Icon processing method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110346370.0A CN115145436B (en) 2021-03-31 2021-03-31 Icon processing method and electronic equipment

Publications (2)

Publication Number Publication Date
CN115145436A CN115145436A (en) 2022-10-04
CN115145436B true CN115145436B (en) 2024-05-03

Family

ID=83403527

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110346370.0A Active CN115145436B (en) 2021-03-31 2021-03-31 Icon processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN115145436B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117519859A (en) * 2022-07-30 2024-02-06 荣耀终端有限公司 Interface adjustment method and electronic equipment
CN116702701A (en) * 2022-10-26 2023-09-05 荣耀终端有限公司 Word weight adjusting method, terminal and storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7730403B2 (en) * 2006-03-27 2010-06-01 Microsoft Corporation Fonts with feelings
US20070240057A1 (en) * 2006-04-11 2007-10-11 Microsoft Corporation User interface element for displaying contextual information
US10345996B2 (en) * 2008-10-22 2019-07-09 Merge Healthcare Solutions Inc. User interface systems and methods

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109144619A (en) * 2017-06-14 2019-01-04 阿里巴巴集团控股有限公司 Icon font information processing method, apparatus and system
CN110109531A (en) * 2019-05-07 2019-08-09 北京字节跳动网络技术有限公司 Display control method, device, electronic equipment and storage medium
CN110442313A (en) * 2019-06-27 2019-11-12 华为技术有限公司 A kind of display properties method of adjustment and relevant device
CN110908765A (en) * 2019-11-29 2020-03-24 五八有限公司 Interface display method and device, terminal equipment and storage medium
CN111258700A (en) * 2020-01-22 2020-06-09 华为技术有限公司 Icon management method and intelligent terminal

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Comparative Analysis of UI Design Between Apple and Huawei Mobile Phones; Zhu Di; 大众文艺 (05); pp. 60-61 *

Also Published As

Publication number Publication date
CN115145436A (en) 2022-10-04

Similar Documents

Publication Publication Date Title
CN109814766B (en) Application display method and electronic equipment
CN115473957B (en) Image processing method and electronic equipment
US20220107821A1 (en) User interface layout method and electronic device
CN111669459B (en) Keyboard display method, electronic device and computer readable storage medium
CN114115769A (en) Display method and electronic equipment
CN115145436B (en) Icon processing method and electronic equipment
CN116048243B (en) Display method and electronic equipment
CN114222187B (en) Video editing method and electronic equipment
WO2022022406A1 (en) Always-on display method and electronic device
CN115964231A (en) Load model-based assessment method and device
CN114004732A (en) Image editing prompting method and device, electronic equipment and readable storage medium
CN113448658A (en) Screen capture processing method, graphical user interface and terminal
CN115729427A (en) Message prompting method and electronic equipment
CN117769696A (en) Display method, electronic device, storage medium, and program product
CN115994006A (en) Animation effect display method and electronic equipment
CN113821130A (en) Method and related device for determining screenshot area
CN113495733A (en) Theme pack installation method and device, electronic equipment and computer readable storage medium
CN117764853B (en) Face image enhancement method and electronic equipment
CN116343247B (en) Form image correction method, device and equipment
CN113986406B (en) Method, device, electronic equipment and storage medium for generating doodle pattern
CN114942741B (en) Data transmission method and electronic equipment
CN116382825B (en) Interface layout method and device
WO2024067551A1 (en) Interface display method and electronic device
WO2024041180A1 (en) Path planning method and apparatus
CN117764853A (en) Face image enhancement method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant