CN115424569A - Display control method, electronic equipment and display system


Info

Publication number
CN115424569A
CN115424569A
Authority
CN
China
Prior art keywords
image
display
displayed
character
dot matrix
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211046940.5A
Other languages
Chinese (zh)
Inventor
孟效轲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
XFusion Digital Technologies Co Ltd
Original Assignee
XFusion Digital Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by XFusion Digital Technologies Co Ltd filed Critical XFusion Digital Technologies Co Ltd
Priority to CN202211046940.5A
Publication of CN115424569A
Legal status: Pending

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/22 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources
    • G09G3/30 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources using electroluminescent panels
    • G09G3/32 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources using electroluminescent panels semiconductive, e.g. using light-emitting diodes [LED]

Abstract

The invention provides a display control method, an electronic device, and a display system. The method, applied to the electronic device, includes the following steps: determining first data used to indicate an object to be displayed; generating a bitmap image of the object to be displayed based on the first data; and determining, based on the bitmap image, a first control image adapted to a first display size. The dot matrix display includes a dot matrix display screen formed by a plurality of light-emitting devices, and the first display size indicates the size of the area on the dot matrix display screen in which the object to be displayed is shown. The electronic device sends the data of the first control image to the dot matrix display and instructs the dot matrix display to control the display color values of the light-emitting devices based on that data, so that the object to be displayed is displayed. Because the control image of the object to be displayed is generated by an electronic device other than the dot matrix display, the dot matrix display does not need to carry a specific dot matrix font library and can display arbitrary content.

Description

Display control method, electronic equipment and display system
Technical Field
The present invention relates to the field of communications technologies, and in particular, to a display control method, an electronic device, and a display system.
Background
With the increasing application of LED (Light Emitting Diode) display devices, they have in recent years been applied more and more widely to electronic bus stop signs and billboards. The variability of their display content fits the trend toward electrification and intelligence, replaces traditional fixed electronic bus stop signs and billboards, and is popular with users. As the number of users grows, demands become more diverse.
Currently, an LED display includes an LED display screen formed by an LED dot matrix and a dot matrix controller, and a dot matrix font library of a specific language is integrated in the dot matrix controller. The dot matrix font library is a data file that stores the dot matrix (glyph) data of all characters. Taking a 16x16 dot matrix Chinese character library as an example, each Chinese character occupies 32 bytes: 16 rows from top to bottom, with two bytes (16 bits) per row. The storage of the glyph data of dot matrix Chinese characters is standardized and divided into horizontal mode extraction and vertical mode extraction: in horizontal extraction, the dot matrix data is stored row by row, from left to right and from top to bottom; in vertical extraction, the dot matrix data is stored column by column, first for the upper 8 rows from top to bottom and left to right, and then for the lower 8 rows from top to bottom and left to right.
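As an illustration of the horizontal extraction layout described above, the following Python sketch expands the 32 bytes of one 16x16 glyph into a 0/1 dot matrix; the function names are illustrative and the snippet is not part of the claimed method.

```python
def glyph_rows_from_horizontal_bytes(glyph_bytes):
    """Expand 32 bytes of a 16x16 horizontal-mode glyph into a 16x16 0/1 matrix.

    Each of the 16 rows is stored as two bytes (16 bits), left to right,
    top to bottom, as described above."""
    assert len(glyph_bytes) == 32
    rows = []
    for r in range(16):
        hi, lo = glyph_bytes[2 * r], glyph_bytes[2 * r + 1]
        bits16 = (hi << 8) | lo
        # The most significant bit corresponds to the leftmost dot of the row.
        rows.append([(bits16 >> (15 - c)) & 1 for c in range(16)])
    return rows

def print_glyph(glyph_bytes):
    """Print a glyph as '#' (dot on) and '.' (dot off) for quick inspection."""
    for row in glyph_rows_from_horizontal_bytes(glyph_bytes):
        print("".join("#" if bit else "." for bit in row))
```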
Further, after the user inputs characters, the dot matrix controller can query the positions of the characters in the dot matrix font library, read the glyph data of the characters based on those positions, and control the LED display screen to display the characters based on the glyph data.
However, common dot matrix font libraries have limited resources and offer few choices of fonts and glyphs, which makes them inconvenient to use.
The information disclosed in this background section is only for enhancement of understanding of the general background of the invention and should not be taken as an acknowledgement or any form of suggestion that this information forms the prior art already known to a person skilled in the art.
Disclosure of Invention
The embodiments of the invention provide a display control method, an electronic device, and a display system that rely on an electronic device other than the dot matrix display to generate a bitmap image of the object to be displayed and to obtain, based on that bitmap image, a control image for controlling the display color values of the light-emitting devices in the dot matrix display. The dot matrix display therefore does not need to carry a specific dot matrix font library and can quickly convert and output arbitrary characters and images.
In a first aspect, an embodiment of the present invention provides a display control method in which an electronic device receives first data used to indicate an object to be displayed; the electronic device generates a bitmap image of the object to be displayed based on the first data; and the electronic device determines, based on the bitmap image, a first control image adapted to a first display size. The dot matrix display includes a display screen composed of a plurality of light-emitting devices, the pixel value of each pixel in the first control image indicates the display color value of a light-emitting device in the dot matrix display, and the first display size indicates the size of the area on the dot matrix display screen in which the object to be displayed is shown. The electronic device sends the data of the first control image to the dot matrix display and instructs the dot matrix display to control the display color values of the light-emitting devices based on the data of the first control image, so that the object to be displayed is displayed.
In this scheme, a bitmap image of the object to be displayed is generated by an electronic device other than the dot matrix display, and the control image for controlling the display color values of the light-emitting devices is obtained from that bitmap image, so the dot matrix display does not need to carry a specific dot matrix font library and arbitrary characters and images can be quickly converted and output.
In a possible implementation manner, the first data is a first identifier of the object to be displayed.
Optionally, the object to be displayed is a character, and the first identifier is a character code and a font.
Optionally, the object to be displayed is an image, and the first identifier is an image storage address.
In a possible implementation manner, the first data is description data of an object to be displayed.
Optionally, the object to be displayed is a character, and the description data is a bitmap image or vector data representing the character.
Optionally, the object to be displayed is an image, and the description data is an image file, or data obtained by decoding the image file.
In one possible implementation, the bitmap image is a binary image, and when its size is the same as the first display size of the object to be displayed, the first control image is the bitmap image itself.
In one possible implementation, the method further includes: the electronic device acquires operation information of a user, and the electronic device determines the first data based on the operation information of the user.
In this scheme, the electronic device obtains the first data from the user's operations through a user interaction interface, so the first data obtained can meet the user's requirements.
In one possible implementation, the object to be displayed is a character string including a plurality of characters, the bitmap image includes a bitmap image corresponding to each character, and the determining, by the electronic device, of a first control image adapted to a first display size based on the bitmap image includes: the electronic device determines a character size, where the character size indicates the size of the area on the dot matrix display screen in which a character is displayed; the electronic device processes the bitmap image of each character based on the character size to obtain a character control image of that character at the character size; and the electronic device stitches the character control images of the characters and determines a first control image adapted to the first display size of the character string.
In this scheme, stitching the character control images allows the layout of the characters in the character string to be designed flexibly, so the requirements of various scenarios can be met; a minimal sketch of such stitching is shown below.
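As a non-authoritative illustration of how character control images might be stitched, the following sketch lays equally sized 0/1 character images out line by line; the function name, the NumPy representation, and the gap parameters are assumptions for illustration only.

```python
import numpy as np

def stitch_characters(char_images, columns, col_gap=0, row_gap=0):
    """Stitch per-character control images (equal-size 2-D 0/1 arrays) into one
    control image for the whole string: `columns` characters per line, left to
    right, wrapping to a new line as needed."""
    char_h, char_w = char_images[0].shape
    lines = [char_images[i:i + columns] for i in range(0, len(char_images), columns)]
    out_h = len(lines) * char_h + (len(lines) - 1) * row_gap
    out_w = columns * char_w + (columns - 1) * col_gap
    out = np.zeros((out_h, out_w), dtype=np.uint8)
    for li, line in enumerate(lines):
        y = li * (char_h + row_gap)
        for ci, img in enumerate(line):
            x = ci * (char_w + col_gap)
            out[y:y + char_h, x:x + char_w] = img
    return out
```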
In one possible implementation, when the bitmap image is not a binary image, the determining, by the electronic device, of a control image adapted to the first display size based on the bitmap image includes: the electronic device binarizes the bitmap image to obtain a binary image, and the electronic device scales the binary image based on the first display size to obtain a first control image adapted to the first display size.
In this scheme, scaling the bitmap image allows the electronic device to obtain a control image that can interface directly with the dot matrix display, so the dot matrix display renders content based on the control image sent by the electronic device and is not constrained by its own limited resources.
In one possible implementation, scaling the binary image to obtain a control image adapted to the first display size includes: the electronic device determines at least one decision tree of the object to be displayed, where the at least one decision tree is used to perform pixel aggregation on the first window data of a first window on the binary image; and the electronic device performs pixel aggregation on the first window data through the at least one decision tree while the first window traverses the binary image, to obtain a control image adapted to the first display size.
In this scheme, the pixel aggregation rule can be designed flexibly through the decision tree, so the requirements of various scenarios can be met; a sketch of such window traversal and aggregation follows.
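The embodiment leaves the exact decision logic open, so the sketch below uses a simple majority vote over each window as a stand-in for the decision tree; the function name, the NumPy usage, and the default rule are assumptions for illustration.

```python
import numpy as np

def aggregate_to_display_size(binary_img, rows, cols, rule=None):
    """Slide a window over a 0/1 binary image and aggregate each window's
    pixels into one output pixel, producing a rows x cols control image.

    `rule` plays the role of the per-object decision logic (e.g. a decision
    tree); by default a majority vote over the window is used."""
    if rule is None:
        rule = lambda window: 1 if window.mean() >= 0.5 else 0
    h, w = binary_img.shape
    out = np.zeros((rows, cols), dtype=np.uint8)
    for r in range(rows):
        for c in range(cols):
            # Window covering the part of the source image mapped to (r, c).
            y0, y1 = r * h // rows, (r + 1) * h // rows
            x0, x1 = c * w // cols, (c + 1) * w // cols
            out[r, c] = rule(binary_img[y0:max(y1, y0 + 1), x0:max(x1, x0 + 1)])
    return out
```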
In one possible implementation, the method further includes: when there are multiple objects to be displayed, the electronic device determines position information corresponding to the multiple objects to be displayed, where the position information indicates the area on the dot matrix display screen in which the first control image corresponding to each object is displayed; and the electronic device sends the position information and the first control images corresponding to the objects to the dot matrix display, and instructs the dot matrix display to display the objects based on the position information and the first control images.
In this scheme, the first control images of the multiple objects to be displayed are laid out according to their position information, so that the dot matrix display shows the multiple objects and the requirements of various scenarios can be met.
In one possible implementation, the method further includes: when there are multiple objects to be displayed, the electronic device determines position information corresponding to the multiple objects to be displayed, where the position information indicates the area on the dot matrix display screen in which the first control image corresponding to each object is displayed; the electronic device processes the first control images of the objects based on the position information and the second display size of the dot matrix display, and determines a second control image adapted to the second display size; and the electronic device sends the second control image to the dot matrix display and instructs the dot matrix display to display the multiple objects based on the second control image.
In this scheme, the first control images of the objects to be displayed are laid out according to their position information to obtain a second control image adapted to the display size of the dot matrix display, and the dot matrix display shows the multiple objects based on the second control image; a sketch of such composition is shown below.
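As an illustration of composing a second control image from several first control images and their position information, the following sketch pastes each first control image onto a full-screen array; the function name and the (top, left) placement format are assumptions.

```python
import numpy as np

def compose_second_control_image(screen_rows, screen_cols, placements):
    """Lay out several first control images on a full-screen control image.

    `placements` is a list of (top, left, control_image) tuples, where
    (top, left) is the position information for each object on the
    dot matrix display screen."""
    screen = np.zeros((screen_rows, screen_cols), dtype=np.uint8)
    for top, left, img in placements:
        h, w = img.shape
        # Clip so an image placed near the edge does not overflow the screen.
        h = min(h, screen_rows - top)
        w = min(w, screen_cols - left)
        if h <= 0 or w <= 0:
            continue
        screen[top:top + h, left:left + w] = img[:h, :w]
    return screen
```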
In a second aspect, an embodiment of the present invention provides a display control apparatus/electronic device that includes several modules, each configured to execute a step of the display control method provided in the first aspect; the division into modules is not limited here. For the specific functions executed by each module and the beneficial effects achieved, refer to the steps of the display control method provided in the first aspect of the embodiments of the present invention, which are not repeated here.
Illustratively, a display control apparatus/electronic device includes:
a data determination module, configured to determine first data, where the first data is used to indicate an object to be displayed;
the image generation module is used for generating a bitmap image of the object to be displayed based on the first data;
an adjustment module to determine a first control image adapted to a first display size based on the bitmap image; the pixel value of each pixel in the first control image indicates a display color value of a light emitting device in a dot matrix display, the dot matrix display comprises a dot matrix display screen formed by a plurality of light emitting devices, and the first display size indicates the size of an area on the dot matrix display screen, where an object to be displayed is displayed;
and the sending module is used for sending the data of the first control image to the dot matrix display, and instructing the dot matrix display to control the display color value of the light-emitting device based on the data of the first control image so as to realize the display of the object to be displayed.
In one possible implementation manner, the first data is a first identifier of an object to be displayed. Optionally, the object to be displayed is a character, and the first identifier is a character code and a font. Optionally, the object to be displayed is an image, and the first identifier is an image storage address.
In a possible implementation manner, the first data is description data of an object to be displayed. Optionally, the object to be displayed is a character, and the description data is a bitmap image or vector data representing the character. Optionally, the object to be displayed is an image, and the description data is an image file, or data obtained by decoding the image file.
In one possible implementation, the bitmap image is a binary image, and when its size is the same as the first display size of the object to be displayed, the first control image is the bitmap image itself.
In one possible implementation, the display control apparatus/electronic device further includes: the first data acquisition module is used for acquiring operation information of a user; the first data is determined based on the operation information of the user.
In a possible implementation manner, the object to be displayed is a character string, the character string includes a plurality of characters, the bitmap image includes bitmap images corresponding to the plurality of characters, and the adjusting module includes: a size determination unit for determining a character size; wherein the character size indicates a size of an area on the dot matrix display screen where the character is displayed; the first adjusting unit is used for respectively processing the bitmap images corresponding to the characters based on the character sizes to obtain character control images corresponding to the characters with the character sizes; and the splicing unit is used for splicing the character control images corresponding to the characters and determining a first control image which is adaptive to the first display size of the character string.
In one possible implementation, when the bitmap image is not a binary image, the adjusting module includes: the binarization unit is used for binarizing the bitmap image to obtain a binary image; and the second adjusting unit is used for scaling the binary image based on the first display size to obtain a first control image adaptive to the first display size.
In a possible implementation manner, the second adjusting unit is configured to perform the following: determining at least one decision tree of an object to be displayed; the at least one decision tree is used for carrying out pixel aggregation on first window data of a first window on the binary image; and performing pixel aggregation on first window data in the process that the first window traverses the binary image through at least one decision tree to obtain a control image which is adaptive to the first display size.
In one possible implementation, the apparatus further includes: a position determination module, configured to determine, when there are multiple objects to be displayed, position information corresponding to the multiple objects to be displayed, where the position information indicates the area on the dot matrix display screen in which the first control image corresponding to each object is displayed; and the sending module is configured to send the position information and the first control images corresponding to the objects to the dot matrix display, and to instruct the dot matrix display to display the objects based on the position information and the first control images.
In one possible implementation, the apparatus further includes: a position determination module, configured to determine, when there are multiple objects to be displayed, position information corresponding to the multiple objects to be displayed, where the position information indicates the area on the dot matrix display screen in which the first control image corresponding to each object is displayed; a layout adjustment module, configured to process the first control images of the objects based on the position information and the second display size of the dot matrix display, and to determine a second control image adapted to the second display size; and the sending module is configured to send the second control image to the dot matrix display and to instruct the dot matrix display to display the multiple objects based on the second control image.
In a third aspect, an embodiment of the present invention provides a display control apparatus/electronic device, including: at least one memory for storing a program; at least one processor configured to execute the memory-stored program, the processor configured to perform the method provided in the first aspect when the memory-stored program is executed.
In a fourth aspect, an embodiment of the present invention provides a display control apparatus/electronic device, where the apparatus/electronic device executes computer program instructions to perform the method provided in the first aspect.
In one example, the apparatus/electronic device may include a processor, which may be coupled with a memory, read instructions in the memory and execute the method provided in the first aspect according to the instructions. The memory may be integrated in the chip or the processor, or may be independent of the chip or the processor.
In a fifth aspect, an embodiment of the present invention provides a display control system, which includes an electronic device and a dot matrix display, where the electronic device is configured to implement the method provided in the first aspect.
In a sixth aspect, an embodiment of the present invention provides a computer storage medium, in which instructions are stored, and when the instructions are executed on a computer, the instructions cause the computer to execute the method provided in the first aspect.
In a seventh aspect, an embodiment of the present invention provides a computer program product containing instructions, which when executed on a computer, cause the computer to execute the method provided in the first aspect.
Drawings
FIG. 1 is a schematic diagram of a display scheme of a dot matrix display;
FIG. 2 is a schematic diagram of a display scheme provided by an embodiment of the present invention;
FIG. 3 is a schematic flowchart of a display control method according to an embodiment of the present invention;
FIG. 4 is a schematic flowchart of step 301 in FIG. 3;
FIG. 5a is a first schematic flowchart of step 303 in FIG. 3;
FIG. 5b is a second schematic flowchart of step 303 in FIG. 3;
FIG. 6a is a schematic diagram of a decision tree according to an embodiment of the present invention;
FIG. 6b is a diagram illustrating pixel aggregation for a character according to an embodiment of the present invention;
FIG. 7 is a flowchart illustrating another display control method according to an embodiment of the present invention;
FIG. 8 is a flowchart illustrating another display control method according to an embodiment of the present invention;
FIG. 9a is a schematic diagram of a first page display according to an embodiment of the present invention;
FIG. 9b is a schematic diagram of a second page display according to an embodiment of the present invention;
FIG. 10 is a flowchart illustrating a character string display control method according to an embodiment of the present invention;
FIG. 11 is a flowchart illustrating an image display control method according to an embodiment of the present invention;
FIG. 12 is a flowchart illustrating a video display control method according to an embodiment of the present invention;
FIG. 13 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be described below with reference to the accompanying drawings.
In the description of the embodiments of the present invention, words such as "exemplary", "for example", or "such as" are used to indicate examples, illustrations, or explanations. Any embodiment or design described as "exemplary", "for example", or "such as" in the embodiments of the invention should not be construed as preferred or advantageous over other embodiments or designs. Rather, these words are intended to present relevant concepts in a concrete fashion.
In the description of the embodiments of the present invention, the term "and/or" describes only an association relationship between associated objects and indicates that three relationships may exist. For example, "A and/or B" may mean: A exists alone, B exists alone, or both A and B exist. In addition, the term "plurality" means two or more unless otherwise specified; for example, a plurality of systems means two or more systems, and a plurality of terminals means two or more terminals.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and should not be construed as indicating or implying relative importance or implicitly indicating the number of the technical features referred to. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. The terms "comprising", "including", "having", and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
Hereinafter, some terms in the present embodiment will be explained. It should be noted that these explanations are for the convenience of those skilled in the art, and do not limit the scope of the present invention as claimed.
Light-emitting device: a device that converts electric energy into light energy. It can be produced from a compound semiconductor of group III and group V elements of the periodic table, and various colors can be produced by adjusting the composition ratio of the compound semiconductor.
Light-emitting diode (LED): a commonly used light-emitting device that efficiently converts electric energy into light energy by releasing energy through electron-hole recombination. It is widely used, for example in display screens and lighting.
Dot matrix: a square array of light-emitting devices. Common specifications are 64x64, 32x32, 16x16, etc. By controlling which light-emitting devices are on or off, information such as characters, symbols, figures and animation can be displayed on the dot matrix. When the light-emitting devices in the dot matrix are LEDs, the dot matrix may be referred to as an LED dot matrix.
Dot matrix display: a device that includes at least a dot matrix display screen, a dot matrix controller and a power supply. It is widely used in daily life, for example in bus stop announcers, highway electronic signposts, and displays of device running status. The power supply converts externally input AC power, usually 220 V AC, into the various DC supplies required by the circuits. The dot matrix display screen is formed by arranging dot matrices in a matrix, and the controller displays information by controlling which light-emitting devices in the dot matrices are on or off. When the dot matrices are LED dot matrices, the dot matrix display screen may be called an LED dot matrix display screen.
Client: a program installed on an electronic device and used for interaction between a user and a dot matrix display (which can be regarded as the server side), so that the dot matrix display screen in the dot matrix display shows the content the user wants to display. The electronic device and the dot matrix display may communicate through a network.
Operating System (OS): a computer program that manages the hardware and software resources of an electronic device. The operating system handles basic tasks such as managing and configuring memory, prioritizing system resources, controlling input and output devices, operating the network, and managing the file system. It also provides an interface for the user to interact with the system.
Canvas: a virtual rectangular area on which paths, rectangles, circles, characters, and images can be drawn and rendered. The canvas supports pixel-level operations, and higher-level image processing operations, such as animation and 3D graphics, can also be implemented on it.
File: information stored on the electronic device, with the device's hard disk as the carrier.
Character: the general term of characters and symbols includes characters, numbers, letters, punctuation marks, graphic symbols, etc.
Pixel: the smallest unit that is indivisible in the whole image has a well-defined position and color value. An image is composed of a plurality of pixels with different positions and color values, and on the premise that the image is not subjected to compression processing, the more pixels in a unit area, the higher the resolution is represented, and the displayed image is close to a real object.
Bitmap image (BMP for short): also called a dot matrix image or a raster image, is composed of individual dots called pixels (picture elements). The dots can be arranged and dyed differently to form a pattern. When the bitmap is enlarged, a myriad of individual squares upon which the entire image is built can be seen. The effect of enlarging the size of the bitmap is to increase the individual pixels, so that lines and shapes appear jagged. However, if it is viewed from a slightly distant position, the color and shape of the bitmap image again appear continuous. Photographs taken with a digital camera, images scanned by a scanner, computer screenshots, and the like all belong to bitmaps. The bitmap has the characteristics of representing color change and color subtle transition and generating vivid effect, and has the defects of recording the position and color value of each pixel when in storage and occupying larger storage space.
Binarization of the bitmap image: the gray value of the pixel point on the bitmap image is set to be 0 or 255, that is, the whole image has an obvious visual effect of only black and white, and at the moment, the obtained image can be called a black-and-white image or a binary image. The pixel is black when the gray value =0 and the pixel is white when the gray value = 255.
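A minimal binarization sketch follows; the threshold of 128 and the NumPy representation are illustrative assumptions, since the embodiment does not mandate a particular threshold.

```python
import numpy as np

def binarize(gray_img, threshold=128):
    """Set each pixel of a grayscale image (2-D uint8 array) to 0 or 255.

    Pixels at or above the (assumed) threshold become white (255), the rest
    become black (0), giving a binary image as described above."""
    return np.where(gray_img >= threshold, 255, 0).astype(np.uint8)
```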
Image file: a computer disk file describing one image. The format of the image file may be JPEG (Joint Photographic Experts Group), BMP, or another format; this is not specifically limited in the embodiments of the present invention and may be determined according to actual requirements. The BMP format is the image file storage format adopted by Windows and is supported by all image processing software running in the Windows environment. A BMP image is, by analogy with source code, the most raw form of the image: it stores exactly as many points of data as the image actually contains. In practice a raw image occupies a large amount of storage, so it is usually encoded, i.e., compressed, into an image file to save space; after the image file is read, it must be decoded to recover the real image.
Video file: a computer disk file describing animation and audio. Here, animation refers to an image sequence composed of several related still frames; played continuously, these frames form an animation, which is usually used for dynamic demonstrations. In practice a raw video occupies a large amount of storage, so it is usually encoded, i.e., compressed, into a video file to save space; after the video file is read, it must be decoded to recover the image sequence.
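As one way a client might decode a video file into the image sequence described above, the sketch below uses OpenCV; the choice of OpenCV, the function name, and the max_frames parameter are assumptions, not part of the embodiment.

```python
import cv2  # OpenCV, used here only as one possible decoder

def decode_video_frames(path, max_frames=None):
    """Decode a video file into a list of frames (BGR numpy arrays)."""
    cap = cv2.VideoCapture(path)
    frames = []
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        frames.append(frame)
        if max_frames is not None and len(frames) >= max_frames:
            break
    cap.release()
    return frames
```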
Font library: an image source for displaying characters; it stores the shape information of each character in a given font. A font library file stores a character file for each of a set of characters in a certain font. Every character displayed on a computer, regardless of language, is in fact a small pattern; the font library stores these small patterns in some image form and restores them to images when they need to be displayed. In the Windows operating system, font libraries are stored under the windows/fonts folder of the system disk; in the Linux operating system they are stored under usr/share/fonts/.
There are two types of font libraries: dot matrix font libraries and vector font libraries.
A dot matrix font library stores the dot matrix data, i.e., glyph data, of characters, from which bitmap images can be generated directly. For example, the dot matrix font library can be the HZK16 library, a 16x16 dot matrix Chinese character library conforming to the GB2312-80 standard of the People's Republic of China. HZK16 supports 6763 common Chinese characters, of which 3755 first-level characters are stored in phonetic order and 3008 second-level characters are stored by radical; it also supports 682 common symbols. Other languages have corresponding dot matrix font library files. In the HZK16 font file, the position index of each character is formed by two bytes: the first byte is the area code and the second byte is the position code.
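The commonly documented way of turning that two-byte index into a file offset is sketched below; the arithmetic (subtracting 0xA0 from each GB2312 byte, 94 characters per area, 32 bytes per glyph) reflects the usual HZK16 convention and is given as background, not as part of the claimed method.

```python
def hzk16_offset(gb2312_bytes):
    """Compute the byte offset of a character's 32-byte glyph in an HZK16 file.

    `gb2312_bytes` is the two-byte GB2312 encoding of the character: the area
    code is the first byte minus 0xA0 and the position code is the second byte
    minus 0xA0; each area holds 94 characters of 32 bytes each."""
    area = gb2312_bytes[0] - 0xA0
    position = gb2312_bytes[1] - 0xA0
    return ((area - 1) * 94 + (position - 1)) * 32

def read_hzk16_glyph(font_path, char):
    """Read the 32-byte glyph for `char` from an HZK16 font file."""
    gb = char.encode("gb2312")
    with open(font_path, "rb") as f:
        f.seek(hzk16_offset(gb))
        return f.read(32)
```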
A vector font library stores the vector data of characters, from which vector images can be generated. Vector graphics describe the geometric characteristics of a graphic or image region by means of specific graphic types (such as points, lines, circles, and cones) and drawing parameters (such as coordinates, line widths, and color values); a composite vector graphic, such as a vector glyph, is composed of several single vector graphics. For example, a two-dimensional ring can be drawn by describing its center, outer radius, inner radius, color, edge line type, line width, and edge color; a three-dimensional sphere under illumination can be drawn vividly by describing only its center coordinates, radius, color, reflectivity/transparency, light source coordinates, and light color; and a character can be displayed by giving only its character code, font size, and color, without transferring the pixel value at every coordinate point of the character. Correspondingly, the vector data represents the geometric characteristics of the character, and the character is drawn from those characteristics to obtain a vector image. Vector images have the following advantages over bitmap images: (1) small data size: a complex graphic can be described by only its graphic type and a few drawing parameters; (2) no distortion (jagged edges) when zooming: enlarging a bitmap enlarges every pixel, so jagged edges appear, whereas a vector image is recomputed and displayed at the enlarged scale and its edges remain smooth. For these reasons, and because a dot matrix font library occupies a large amount of storage, the vector font library is usually chosen in practice.
Character set: a set of character encodings. Character encoding is a unified standard for storing and transferring information between different computers. Character sets exist for different writing systems: for example, English uses the ASCII (American Standard Code for Information Interchange) character set, simplified Chinese uses the GBK character set, and traditional Chinese uses the Big5 character set; there is also the Unicode character set, which assigns a uniform and unique binary code to every character of every language to meet cross-language, cross-platform requirements.
Decision tree: a tree-shaped logic structure, which may be a binary tree or a non-binary tree. Each non-leaf node represents a test on a feature attribute, each branch represents the output of that attribute over a range of values, and each leaf node stores a category. To make a decision with a decision tree, start from the root node, test the corresponding feature attribute of the item to be classified, follow the branch matching the attribute value, and repeat until a leaf node is reached; the category stored in that leaf node is the decision result.
Fig. 1 is a schematic diagram of a display control method in the related art. As shown in Fig. 1, the dot matrix controller in an LED display loads a dot matrix font library, queries the position in that library of a character input by the user, reads the dot matrix data, i.e., glyph data, of the character based on the position, converts the dot matrix data into a rendering matrix, and controls the dot matrix display screen in the LED display to display the character.
This technical scheme has two drawbacks: on the one hand, it relies on the dot matrix font library, so special characters that do not exist in the library produce garbled output or blanks; on the other hand, graphics cannot be displayed.
Fig. 2 is a diagram illustrating the architecture of a display system according to an embodiment of the present invention. As shown in Fig. 2, the display system includes an electronic device 100 and a dot matrix display 200, which communicate through a network.
The network may be a wired network or a wireless network. For example, the wired network may be a cable network, a fiber network, a Digital Data Network (DDN), etc.; the wireless network may be a telecommunication network, an intranet, the Internet, a Local Area Network (LAN), a Wide Area Network (WAN), a Wireless Local Area Network (WLAN), a Metropolitan Area Network (MAN), a Public Switched Telephone Network (PSTN), a Bluetooth network, a ZigBee network, a Global System for Mobile Communications (GSM) network, a Code Division Multiple Access (CDMA) network, a General Packet Radio Service (GPRS) network, etc., or any combination thereof. The network may use any known network communication protocol to enable communication between the devices, such as various wired or wireless communication protocols including Ethernet, Universal Serial Bus (USB), FireWire, GSM, GPRS, CDMA, Wideband Code Division Multiple Access (WCDMA), Time Division-Synchronous Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), New Radio (NR), Bluetooth, and Wi-Fi.
The electronic device 100 may be understood as a device that has at least the ability to communicate with the dot matrix display, a rendering function, and an image processing function. It may be, but is not limited to, a personal computer, a notebook computer, a smartphone, a tablet computer, an in-vehicle unit, etc. Exemplary embodiments of the electronic device 100 referred to in this document include, but are not limited to, devices running iOS, Android, Windows, HarmonyOS, or other operating systems. In practice, the electronic device 100 is installed with a client 101 and an operating system 102; the client 101 communicates with the dot matrix display 200 through the electronic device 100 and implements data processing by calling functions of the operating system 102.
The dot matrix display 200 includes a dot matrix controller 201 and a dot matrix display screen 202. The dot matrix display screen 202 is formed by arranging a plurality of light-emitting devices in a matrix; specifically, the light-emitting devices may be arranged into dot matrices, and the dot matrices arranged in a matrix. The dot matrix display 200 provided in the embodiments of the present invention can be widely used in scenarios where the displayed content changes frequently, such as highway signboards, large-screen billboards and bus stop announcers, and the dot matrix display screen 202 can present rich content such as characters, images and video.
By relying on the rendering function and the image processing function of the electronic device 100, the embodiments of the present invention quickly convert arbitrary characters and images and output them to the dot matrix display for display, so that the dot matrix display can display arbitrary content without carrying a specific dot matrix font library.
Next, a display control method according to an embodiment of the present invention is described. The method may be performed by any apparatus, device, platform, or device cluster that has computing and processing capabilities, for example the electronic device 100 in Fig. 2. The following description takes the electronic device 100 as the execution subject.
Fig. 3 is a flowchart illustrating a display control method according to an embodiment of the present invention. The method comprises the following steps:
step 301, determining first data, wherein the first data is used for indicating an object to be displayed.
According to one possible implementation, the first data is used to indicate an object to be displayed. The first data may be, for example, a first identification of the object to be displayed, or alternatively, description data of the object to be displayed. The description data of the object to be displayed may be data describing the object to be displayed.
In practice, the electronic device 100 stores a character set, a font library, image files (referred to as local image files for ease of description and distinction), and video files (referred to as local video files). For ease of description, the image described by a local image file is called a local image, and the video described by a local video file is called a local video. In some cases, electronic devices other than the electronic device 100 may store the image file or video file of the object to be displayed; such a file is referred to as a remote image file, and the image it describes as a remote image, and likewise for video. Fig. 2 shows an electronic device 300 and an electronic device 400: the electronic device 300 stores a remote video file, and the electronic device 400 stores a remote video file and a remote image file.
In one example, the object to be displayed may be a character. Correspondingly, the first identifier may be a character identifier, which can be understood as a key with which the character's data in the character set of a font can be looked up; for instance, the character identifier can be a character code plus a font. The description data is then the character data of that character in the character set of the corresponding font, which may be the vector data or the dot matrix data, i.e., the glyph data. It should be noted that the electronic device 100 runs the operating system 102, and the fonts and character sets carried by the operating system 102 can meet the requirements of a wide range of characters, not limited to the characters that a dot matrix font library can represent.
In one example, the object to be displayed may be a character string, which can be understood as an array of characters, for example a word or a sentence formed by several characters. Correspondingly, the first identifier may be the character identifier of each character in the string, and the description data may be the description data of each character in the string.
In one example, the object to be displayed may be an image. Correspondingly, the first identifier may be an image identifier and, as shown in Fig. 2, the description data may be the stored data of the image; considering that images are usually encoded before being stored in order to save space, the description data may also be the decoded data obtained by decoding the stored data. The image identifier may be the storage address of the image, and the image may be a local image or a remote image. The storage address may be an absolute path, i.e., a path starting from a drive letter; when the object to be displayed is a remote image, the storage address may be a URL (Uniform Resource Locator).
In one example, the object to be displayed may be a video. Correspondingly, the first identifier may be a video identifier, and the description data may be the stored data of the video; considering that videos are usually encoded before being stored in order to save space, the description data may also be the decoded data obtained by decoding the stored data. The video identifier may be the storage address of the video, and the video may be a local video or a remote video. The storage address may be an absolute path, i.e., a path starting from a drive letter; when the object to be displayed is a remote video, the storage address may be a URL (Uniform Resource Locator).
It should be noted that, in practice, there may be multiple objects to be displayed, which may be any combination of the above characters, images, or videos.
Step 302, generating a bitmap image of the object to be displayed based on the first data.
According to one possible implementation, the electronic device 100 has a module (referred to as a rendering module for ease of description and distinction) that can create a canvas and perform rendering, and the rendering module is configured to generate a bitmap image of the object to be displayed based on the first data. In practice, the rendering module can be understood as a functional module of the operating system 102 installed in the electronic device 100.
It should be noted that the embodiments of the present invention use the rendering function of the operating system 102 itself to meet various image rendering requirements and are not limited to the characters that a dot matrix font library can represent. In practice, the electronic device 100 calls the rendering module through the operating system 102 to obtain the bitmap image of the object to be displayed.
In addition, to call the rendering module, a request (referred to as a rendering request for ease of description and distinction) is generated based on the parameter values of the input parameters required by the rendering module; the electronic device 100 responds to the rendering request and calls the rendering module to obtain the bitmap image of the object to be displayed. For the rendering module to be invoked, the input parameters include at least an identification of the rendering module, so that the operating system 102 can invoke the module based on that identification; typically the identification can be regarded as the module's name. The input parameters further include a canvas size and the first data determined in step 301. The canvas size may be a default or may be set by the user; this is not specifically limited in the embodiments of the present invention and may be determined according to actual requirements. The canvas size is typically much larger than the size of the dot matrix display screen 202. In practice, since a dot matrix is square, the length and width of the canvas are made equal for convenience of processing; in other words, the bitmap image has the same number of rows and columns. The content of the first data needs to be determined in combination with the input parameters of the rendering module.
The following describes the input parameters of the rendering module when the object to be displayed is a character, with reference to fig. 2.
Optionally, the first data included in the input parameters is a character identifier. The rendering module may create a canvas of the given canvas size, read the stored data of the character based on the character identifier, and draw the character on the canvas based on that stored data to obtain a bitmap image.
For the character identifier and the stored data, refer to the description of step 301 above; details are not repeated.
Further, the input parameters of the rendering module may also include a brush color. If the brush color is black, the resulting bitmap image is a black-and-white image, i.e., a binary image.
Optionally, the first data included in the input parameters is the description data of the character. In this case, the client 101 in the electronic device 100 reads the description data of the character based on the character's identifier, and the rendering module may create a canvas of the given canvas size and draw the character on the canvas based on the description data to generate a bitmap image.
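The embodiment uses the operating system's own rendering facility; purely as an illustration of the "create a canvas, draw with a black brush, obtain a binary bitmap" flow, the sketch below uses Pillow as a stand-in renderer. The font path, canvas size, and function name are assumptions and not part of the claimed method.

```python
import numpy as np
from PIL import Image, ImageDraw, ImageFont

def render_character(char, canvas_size=256,
                     font_path="/usr/share/fonts/truetype/dejavu/DejaVuSans.ttf"):
    """Draw one character on a square white canvas with a black brush and
    return a 0/1 array (1 where the character's strokes are)."""
    canvas = Image.new("L", (canvas_size, canvas_size), color=255)   # white canvas
    draw = ImageDraw.Draw(canvas)
    font = ImageFont.truetype(font_path, size=int(canvas_size * 0.9))
    draw.text((0, 0), char, fill=0, font=font)                       # black brush
    return (np.array(canvas) < 128).astype(np.uint8)                 # binarize
```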
The following describes the input parameters of the rendering module when the object to be displayed is a character string, with reference to fig. 2.
Optionally, the first data included in the input parameters is the character identifier of each character in the character string. The input parameters further include the character size (for ease of description and distinction, expressed as a number of rows and a number of columns), the number of columns of the character string, the column spacing, and the line spacing. The rendering module may create a canvas of the given canvas size, read the stored data of each character based on its character identifier, and draw the character string on the canvas based on the stored data of each character, the character size, the number of columns of the character string, the column spacing, and the line spacing to obtain a bitmap image. In practice, if the character string contains few characters, there is only one line and the line spacing is 0. In addition, the input parameters may omit the number of columns of the character string; in that case, when the electronic device 100 draws the character string on the canvas, the characters are drawn from left to right based on the character size, and a new line is started automatically whenever the canvas width would be exceeded, until the whole character string has been drawn.
Optionally, the first data included in the input parameters is the description data of each character in the character string, and the input parameters further include the number of columns of the character string, the column spacing, and the line spacing. In this case, the client 101 in the electronic device 100 reads the description data of each character based on its identifier, and the rendering module may create a canvas of the given canvas size and draw the character string on the canvas based on the description data of each character, the number of columns of the character string, the column spacing, and the line spacing to obtain a bitmap image.
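To illustrate the left-to-right drawing with automatic line breaks described above, the following sketch renders a string character by character on a square canvas; as before, Pillow, the font path, and the parameter defaults are illustrative assumptions.

```python
import numpy as np
from PIL import Image, ImageDraw, ImageFont

def render_string(text, canvas_size=256, char_size=32, col_gap=2, row_gap=2,
                  font_path="/usr/share/fonts/truetype/dejavu/DejaVuSans.ttf"):
    """Draw a character string on a square canvas, left to right, starting a
    new line whenever the next character would exceed the canvas width."""
    canvas = Image.new("L", (canvas_size, canvas_size), color=255)
    draw = ImageDraw.Draw(canvas)
    font = ImageFont.truetype(font_path, size=char_size)
    x = y = 0
    for ch in text:
        if x + char_size > canvas_size:            # automatic line break
            x, y = 0, y + char_size + row_gap
        draw.text((x, y), ch, fill=0, font=font)
        x += char_size + col_gap
    return (np.array(canvas) < 128).astype(np.uint8)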
The following describes the input parameters of the rendering module when the object to be displayed is an image or a video, with reference to Fig. 2. Images and videos are handled similarly; the only difference is that when the object to be displayed is a video, the rendering module generates an image sequence. The following description takes an image as an example.
Optionally, the first data included in the input parameter may be an image identifier. Correspondingly, the rendering module in the electronic device 100 reads the stored data of the image based on the image identification, and generates a bitmap image based on the stored data of the image drawn on the canvas with the size of the created canvas.
Alternatively, the first data included in the input parameter may be stored data of the image. Here, the client 101 in the electronic apparatus 100 reads the stored data of the image based on the identification of the image. Further, the rendering module draws the generated bitmap image on the created canvas with the size of the canvas adapted based on the stored data of the image.
Optionally, the first data included in the input parameters may be the decoded data of the stored data of the image. Here, the client 101 in the electronic device 100 reads and decodes the stored data of the image based on the identifier of the image to obtain the decoded data of the image. Further, the rendering module creates a canvas adapted to the canvas size and draws the image on it based on the decoded data to generate a bitmap image. Here, the decoded data may be a bitmap image or vector data.
It should be noted that the bitmap image returned by the rendering module may be a color image or a black-and-white image. In practical applications, the rendering module does not usually change the color of the image or video, but the color of the character is controllable by the brush color.
Step 303, determining a first control image adapted to a first display size based on the bitmap image; the pixel value of each pixel in the first control image indicates a display color value of a light emitting device in a dot matrix display, the dot matrix display includes a dot matrix display screen composed of a plurality of light emitting devices, and the first display size indicates a size of an area on the dot matrix display screen where an object to be displayed is displayed.
In the technical solution provided by the embodiment of the present invention, the bitmap image is generated by relying on the electronic device 100, and a control image adapted to the first display size is obtained through its image processing function, so that the dot matrix display does not need to carry a specific dot matrix font library and can display arbitrary content.
Here, the first display size is represented by the number of rows and the number of columns of the area of the dot matrix display screen 202 in the dot matrix display 200 where the object to be displayed is displayed, generally expressed as m (number of rows) × n (number of columns), and is equal to or smaller than the second display size of the dot matrix display. Here, the second display size is the size of the dot matrix display screen 202 in the dot matrix display 200, represented by the number of rows and columns of the dot matrix display screen 202. It should be noted that the second display size is also expressed as m × n, so both the first display size and the second display size comprise a number of rows and a number of columns.
Here, the first display size may be a default size or a manually preset size.
In one example, the first display size may be a default size. I.e. a first display size pre-stored in the electronic device 100.
For example, when the object to be displayed is a character, the first display size is a display size of the character, and the default size may be a size of a dot matrix of the dot matrix display screen 202, for example, the size of the dot matrix is 64 × 64, and the display size of the character is 64 × 64.
Illustratively, when the object to be displayed is an image or video, the default size may be the second display size of the dot matrix display screen 202.
In one example, the first display size may be a size input by the user via the electronic device 100 that is equal to or smaller than the second display size of the dot matrix display screen 202. For example, if the size of the dot matrix is 64 × 64 and the second display size of the dot matrix display screen 202 is 128 × 320, the first display size may be 64 × 128, meaning a display area of 64 rows and 128 columns, or 128 × 64, meaning a display area of 128 rows and 64 columns.
Step 304, sending the data of the first control image to the dot matrix display, and instructing the dot matrix display to control the display color values of the light-emitting devices based on the data of the first control image so as to realize the display of the object to be displayed.
At present, the dot matrix display is mainly implemented as follows: a first control image is input to the dot matrix controller 201, where the pixel value of each pixel in the image describes the display color value of one light emitting device; the dot matrix controller 201 associates each pixel with a different light emitting device in the dot matrix display screen 202 and controls the corresponding light emitting device to emit light according to the pixel value of that pixel, so that the object to be displayed is displayed on the dot matrix display screen 202. It is noted that the first control image is formed by pixels arranged in a matrix and is essentially a binary matrix: the pixels take only two pixel values, representing two different colors. Alternatively, the two pixel values represent on and off: one pixel value indicates that the light emitting device is turned on (for convenience of description and distinction, referred to as the on pixel value), and the other indicates that the light emitting device is turned off (referred to as the off pixel value). In one example, the on pixel value is a pixel value representing black and the off pixel value is a pixel value representing white; the following description uses this example.
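As a hedged illustration of the mapping just described, the sketch below associates each pixel of a binary control image with one light emitting device and drives it on or off; ON_VALUE, OFF_VALUE and drive_led are placeholders assumed for this sketch, not the dot matrix controller's actual interface.

```python
# Sketch of a dot matrix controller consuming a first control image; the pixel
# values and the per-LED drive routine are assumptions.
import numpy as np

ON_VALUE = 0      # assumed pixel value representing black / LED on
OFF_VALUE = 255   # assumed pixel value representing white / LED off

def drive_led(row: int, col: int, on: bool) -> None:
    """Placeholder for the controller's per-device drive routine."""
    print(f"LED({row},{col}) -> {'on' if on else 'off'}")

def apply_control_image(control_image: np.ndarray) -> None:
    """Associate each pixel with one light emitting device and drive it."""
    for r in range(control_image.shape[0]):
        for c in range(control_image.shape[1]):
            drive_led(r, c, bool(control_image[r, c] == ON_VALUE))
```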
It is noted that if there is only one object to be displayed, the dot matrix display controls the light emitting devices that are not associated with pixels in the first control image using the pixel value indicating off. In addition, the dot matrix controller displays the object to be displayed at the middle position of the dot matrix display screen. Of course, the electronic device 100 may also determine a display position of the object to be displayed, indicating the area of the dot matrix display screen 202 in which the object is displayed, and send the position to the dot matrix display 200, so that the dot matrix display 200 displays the object to be displayed at that position.
In the technical solution provided by the embodiment of the present invention, arbitrary characters and images can be quickly converted into control images by relying on the bitmap generation and image processing functions of the electronic device, so that the dot matrix display can display the control images provided by the electronic device and can display arbitrary content without carrying a specific dot matrix font library.
Fig. 4 is a flowchart illustrating a display control method according to an embodiment of the present invention.
As shown in fig. 4, step 301 shown in fig. 3 at least includes the following steps:
step 3011, obtain the operation information of the user.
In the embodiment of the invention, the operation information indicates the operation of the user. Specifically, the electronic apparatus 100 may acquire operation information of the user when the user operates. Illustratively, the electronic device 100 has a touch screen, and the operation indicated by the operation information may be an operation of the touch screen by the user; illustratively, the electronic device 100 has a keyboard, and the operation indicated by the operation information may be an operation of the keyboard by the user; illustratively, the electronic device 100 has a mouse, and the operation indicated by the operation information is that the user inputs information by mouse operation, for example, selecting information in a list, copying and pasting information, and the like.
In one example, the operation information of the user may be a key code value of a character, a storage address of an image, a storage address of a video, and the like.
Step 3012, determine the first data based on the operation information of the user.
In practical applications, the electronic device 100 may know the object to be displayed that the user wants to display based on the operation information of the user, thereby determining the first data. The detailed content of the first data is referred to above and is not repeated.
In the embodiment of the invention, the content which the user wants to display is determined based on the interaction between the electronic equipment and the user, so that the user requirement is met.
Fig. 5a shows a schematic flowchart of a display control method according to an embodiment of the present invention.
As shown in fig. 5a, if the bitmap image is not a binary image, step 303 shown in fig. 3 at least includes the following steps:
step 3031, carrying out binarization on the bitmap image to obtain a binary image.
The electronic device 100 scans each pixel of the bitmap image row by row and column by column to obtain the RGB color value of each pixel, converts the RGB color value into a 256-level grayscale value, and stores it at the corresponding position of a grayscale matrix. Then, the grayscale matrix is traversed by a window of a preset size according to a preset order, a step size in the row direction, and a step size in the column direction to perform pixel aggregation, completing the binarization processing and obtaining a binary image.
Here, for each sliding, the way the pixels of the grayscale matrix are aggregated is as follows:
The window data of the window on the grayscale matrix is determined, the window data is fused to determine a grayscale fusion value, and the aggregation result is determined based on the grayscale fusion value. For example, when the grayscale fusion value is smaller than a preset threshold, the window data is aggregated into the on pixel value; otherwise, it is aggregated into the off pixel value; correspondingly, the aggregation result is the on pixel value or the off pixel value. The fusion may be averaging, in which case the grayscale fusion value is the average of the grayscale values in the window data.
Further, after the grayscale matrix has been traversed, the aggregation results of the successive slides are spliced into a matrix according to the window sliding order; this matrix is the binary image.
Here, the preset size may be determined according to actual requirements: too large a window may distort the object to be displayed, and too small a window increases the amount of calculation. For example, the preset size may be 1 × 1 or 2 × 2. The preset order is usually from left to right and then from top to bottom, or from top to bottom and then from left to right, and the step sizes in the row direction and the column direction may each be 2. Here, when the window size is 2 × 2, the preset order is from left to right and then from top to bottom, and the step sizes in both directions are 2, the number of rows and the number of columns of the binary matrix are each 1/2 of those of the grayscale matrix.
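A minimal sketch of the binarization in step 3031 follows, assuming the bitmap image is an RGB numpy array of shape (H, W, 3); the simple averaging fusion, the 2 × 2 window, the stride of 2 and the threshold of 128 are illustrative defaults, not prescribed values.

```python
# Binarization sketch: RGB -> grayscale matrix -> windowed aggregation.
import numpy as np

ON, OFF = 0, 255   # assumed on (black) and off (white) pixel values

def binarize(bitmap: np.ndarray, window=(2, 2), step=(2, 2), threshold=128.0) -> np.ndarray:
    gray = bitmap.astype(np.float32).mean(axis=2)          # 256-level grayscale matrix
    wh, ww = window
    sh, sw = step
    out_rows = (gray.shape[0] - wh) // sh + 1
    out_cols = (gray.shape[1] - ww) // sw + 1
    binary = np.empty((out_rows, out_cols), dtype=np.uint8)
    for i in range(out_rows):
        for j in range(out_cols):
            block = gray[i * sh:i * sh + wh, j * sw:j * sw + ww]
            fused = block.mean()                            # grayscale fusion value (average)
            binary[i, j] = ON if fused < threshold else OFF # darker window -> device on
    return binary
```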
Step 3032, based on the first display size, scaling the binary image to obtain a first control image adapted to the first display size.
The number of rows and columns of the binary image is generally larger than that of the first display size, and the resolution of the dot matrix display screen 202 is relatively low; therefore, the embodiment of the present invention performs pixel aggregation on the binary image so that the first control image stays as close as possible to the real appearance of the object to be displayed.
According to a possible implementation, the pixel aggregation of the binary image can be implemented in particular as follows.
(1) Determine the window size, the preset order, the step size in the row direction, and the step size in the column direction based on the size of the binary matrix and the size of the display rectangle.
(2) Traverse the binary matrix with the window of the determined size according to the preset order and the row- and column-direction step sizes to perform pixel aggregation, obtaining a binary matrix of smaller size (for convenience of description and distinction, referred to as the aggregated binary matrix).
(3) Judge whether the size of the aggregated binary matrix equals the size of the display rectangle; if so, execute (4); if not, execute (5).
(4) Determine the aggregated binary matrix as the image to be rendered.
(5) Replace the binary matrix with the aggregated binary matrix and execute (1).
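The loop (1)-(5) above might look like the following sketch; the window and step sizes are fixed at 2 × 2 with stride 2 for simplicity, and a plain majority rule stands in for the per-window decision logic described next, so this is an illustrative reading rather than the prescribed procedure.

```python
# Sketch of the iterative pixel-aggregation loop; target_size is the display
# rectangle (rows, columns); 0 = on/black and 255 = off/white are assumptions.
import numpy as np

def aggregate_once(binary: np.ndarray, window=(2, 2), step=(2, 2)) -> np.ndarray:
    """One traversal of the binary matrix; a majority rule is used as a placeholder."""
    wh, ww = window
    sh, sw = step
    rows = (binary.shape[0] - wh) // sh + 1
    cols = (binary.shape[1] - ww) // sw + 1
    out = np.empty((max(rows, 0), max(cols, 0)), dtype=binary.dtype)
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            block = binary[i * sh:i * sh + wh, j * sw:j * sw + ww]
            out[i, j] = 0 if (block == 0).sum() * 2 >= block.size else 255
    return out

def aggregate_to_size(binary: np.ndarray, target_size) -> np.ndarray:
    while binary.shape != tuple(target_size):
        binary = aggregate_once(binary)                 # steps (1)-(2), window choice simplified
        if binary.size == 0 or min(binary.shape) < min(target_size):
            break                                       # guard: target not exactly reachable here
    return binary                                       # steps (3)-(5)
```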
In one example, for each sliding, the pixel aggregation of the window may be specifically implemented as follows.
Determine the window data of the window on the binary matrix, and perform pixel aggregation through a decision tree to obtain an aggregation result; here, the aggregation result is the on pixel value or the off pixel value, i.e., the window data is aggregated into the on pixel value or the off pixel value.
It is worth noting that, in the embodiment of the present invention, the requirement on pixel aggregation is relatively high; the effect of a single decision tree is limited and may not guarantee that the aggregated result reflects the real appearance of the object to be displayed. Therefore, in practical applications, the window data may be aggregated through a plurality of decision trees to obtain the aggregation result.
In a specific implementation, the result of each decision tree among the plurality of decision trees may be determined (for convenience of description and distinction, referred to as a decision result; each decision result is the on pixel value or the off pixel value). The numbers of the different decision results are counted, and the decision result with the largest count is taken as the aggregation result. It should be noted that, in practical applications, different decision trees can be set for different character types, for example, text characters, symbols, and figures; and different decision trees can be set for different image types, for example, person images, traffic signs, and animal images.
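A minimal sketch of the majority vote over several decision trees described above; each tree is assumed to be a callable that maps the window data to an on or off pixel value.

```python
# Majority vote over multiple decision trees; the trees themselves are assumed
# to be supplied by the caller.
from collections import Counter

def vote(window_data, trees):
    results = [tree(window_data) for tree in trees]     # decision result of each tree
    return Counter(results).most_common(1)[0][0]        # most frequent result wins
```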
Here, the decision logics of the decision trees are different, so pixel aggregation can be performed from different perspectives, ensuring the aggregation effect. The decision logic of one decision tree is explained below.
The decision logic of a decision tree may be: when the number of pixel values representing black in the window data is large, the window data is aggregated into an on pixel, where an on pixel represents one pixel whose pixel value is the on pixel value; or, when the number of pixel values representing black in the window data is small, if the number of pixel values representing black in the data adjacent to a certain side of the window in the binary matrix is large, the window data is aggregated into an on pixel; otherwise, it is aggregated into an off pixel, where an off pixel represents one pixel whose pixel value is the off pixel value.
Here, the preset size may be determined in combination with actual requirements, and too large may cause distortion of an object to be displayed, and too small may increase the amount of calculation. For example, the preset size may be 2 × 2. The preset sequence is usually from left to right and then from top to bottom, or from top to bottom and then from left to right, the step size in the row direction may be 2, and the step size in the column direction may be 2.
Illustratively, when the size of the window is 2 × 2, an on pixel represents a black pixel, and an off pixel represents a white pixel; then, as shown in fig. 6a, the decision logic of the decision tree may be:
It is determined whether the window data contains 4 black pixels (all black). If yes, display: the window data is aggregated into a black pixel. As shown in fig. 6b, rectangular frame 2 is aggregated into 1 black pixel.
If not, determine whether the window data contains 3 black pixels. If yes, display: the window data is aggregated into a black pixel. As shown in fig. 6b, rectangular frame 8 is aggregated into 1 black pixel.
If not, determine whether the window data contains 2 black pixels. If not, do not display: the window data is aggregated into a white pixel. As shown in fig. 6b, rectangular frames 5 and 3 are each aggregated into 1 white pixel.
If so, determine whether the two pixels adjacent to the right side are both black. If yes, display: the window data is aggregated into a black pixel. As shown in fig. 6b, rectangular frame 9 is aggregated into 1 black pixel.
If not, determine whether the two pixels adjacent to the lower side are both black. If yes, display: the window data is aggregated into a black pixel. As shown in fig. 6b, rectangular frame 4 is aggregated into 1 black pixel.
If not, do not display: the window data is aggregated into a white pixel. As shown in fig. 6b, rectangular frames 6 and 7 are each aggregated into 1 white pixel.
To sum up, when the number of black pixels in the window data is greater than or equal to 3, the window data is aggregated into a black pixel; when the number of black pixels in the window data is less than 3, the window data is aggregated into a black pixel if the 2 pixels adjacent to the right side of the window are black or the 2 pixels adjacent to the lower side of the window are black, and into a white pixel otherwise.
Specifically, first information of the window data is counted as a first vector [a1, a2, a3, a4, a5, a6, a7, a8, a9], where a1 denotes the number of black pixels in the window itself, a2 denotes the number of black pixels adjacent to the left side, a3 denotes the number of black pixels adjacent to the right side, a4 denotes the number of black pixels adjacent to the lower side, a5 denotes the number of black pixels adjacent to the upper side, a6 denotes whether the two pixels adjacent to the left side are both black, a7 denotes whether the two pixels adjacent to the right side are both black, a8 denotes whether the two pixels adjacent to the upper side are both black, and a9 denotes whether the two pixels adjacent to the lower side are both black. When a1 is greater than or equal to 3, or when a1 is less than 3 and a7 or a9 is yes, the window data is aggregated into a black pixel; otherwise, it is aggregated into a white pixel.
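The 2 × 2 decision logic summarized above might read as in the following sketch, assuming "right side" and "lower side" refer to the two pixels adjacent to the window in the binary matrix and that out-of-range neighbours count as white; this is one illustrative reading of the decision tree, not a definitive implementation.

```python
# One reading of the 2x2-window decision logic; BLACK/WHITE values are assumed.
import numpy as np

BLACK, WHITE = 0, 255

def _all_black(binary: np.ndarray, coords) -> bool:
    h, w = binary.shape
    return all(0 <= r < h and 0 <= c < w and binary[r, c] == BLACK for r, c in coords)

def decide_window(binary: np.ndarray, top: int, left: int) -> int:
    """Aggregate the 2x2 window whose top-left corner is (top, left)."""
    window = binary[top:top + 2, left:left + 2]
    black_count = int((window == BLACK).sum())
    if black_count >= 3:
        return BLACK                                   # 3 or 4 black pixels: display
    right_pair = [(top, left + 2), (top + 1, left + 2)]
    lower_pair = [(top + 2, left), (top + 2, left + 1)]
    if black_count == 2 and (_all_black(binary, right_pair) or _all_black(binary, lower_pair)):
        return BLACK                                   # 2 black pixels with a fully black side
    return WHITE                                       # otherwise do not display
```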
In addition, in some possible cases, the electronic device 100 may display the first control image, and the user may operate on the first control image to change the pixel values of its pixels; for example, clicking a black pixel changes it into a white pixel, and clicking a white pixel changes it into a black pixel, resulting in the final first control image.
According to the embodiment of the invention, the bitmap image is subjected to binarization and pixel aggregation, and the first control image which is adaptive to the size of the area of the object to be displayed on the dot matrix display screen is output.
Fig. 5b shows a schematic flowchart of a display control method according to an embodiment of the present invention.
As shown in fig. 5b, the object to be displayed is a character string, the character string includes a plurality of characters, the bitmap image includes bitmap images corresponding to the plurality of characters, and step 303 shown in fig. 3 at least includes the following steps:
step 3033, determining the size of the character; wherein the character size indicates the size of the area of the dot matrix display screen where the character is displayed.
Here, the character size may be a default size or a size set by the user through the electronic device 100. The character size can be understood as the first display size for the case where the object to be displayed is a character; for details, refer to the description of the first display size when the object to be displayed is a character.
Step 3034, based on the character size, processing the bitmap image corresponding to each of the plurality of characters respectively to obtain the character control image corresponding to each of the plurality of characters with the character size.
Specifically, for each character in the character string, based on the character size, the bitmap image corresponding to the character is processed to obtain a character control image adapted to the character size. When the bitmap image is not a binary image, replacing the first display size in the step 3031 and the step 3032 with a character size, and processing the bitmap image corresponding to the character to obtain a character control image; and when the bitmap image is a binary image, replacing the first display size in the step 3032 with a character size, and processing the bitmap image corresponding to the character to obtain a character control image.
Step 3035, splicing the character control images corresponding to the characters respectively, and determining a first control image which is adapted to the first display size of the character string.
Specifically, based on the number of characters in the character string (for convenience of description and distinction, referred to as the number of characters), the number of columns of the character string, the line space, and the column space, character control images corresponding to the plurality of characters are spliced, and a first control image adapted to the first display size of the character string can be obtained.
Here, the number of columns of the first display size is equal to or smaller than the number of columns of the second display size.
The number of columns of the first display size is the total number of columns of the character string: the product of the number of columns of the character size and the number of character string columns (i.e., the number of characters in one line), plus the column spacing between characters in different columns.
The number of rows of the first display size is the total number of rows of the character string: the product of the number of rows of the character size and the number of character rows, plus the line spacing between characters in different rows. Here, the number of character rows is determined by dividing the number of characters in the character string by the number of character string columns; if the result is an integer, the result is the number of character rows; if not, the integer part of the result plus 1 is the number of character rows.
It should be noted that the number of character string columns, the line spacing, and the column spacing may be preset in the electronic device 100 or set by the user through the electronic device 100. It is also noted that, if the user does not set the number of character string columns, the electronic device 100 may use the number of columns of the second display size divided by the sum of the number of columns of the character size and the column spacing as the number of character string columns.
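The row and column formulas above might be computed as in the sketch below; counting the added spacing as (number of character string columns − 1) column spacings and (number of character rows − 1) line spacings is an assumption of this sketch.

```python
# Illustrative computation of the first display size of a character string.
import math

def string_display_size(char_rows: int, char_cols: int, num_chars: int,
                        string_columns: int, line_spacing: int, column_spacing: int):
    char_lines = math.ceil(num_chars / string_columns)                       # number of character rows
    total_cols = string_columns * char_cols + (string_columns - 1) * column_spacing
    total_rows = char_lines * char_rows + (char_lines - 1) * line_spacing
    return total_rows, total_cols

# Example: 16x16 characters, 5 characters, 3 per line, spacing 2 in both directions
# -> rows = 2*16 + 1*2 = 34, columns = 3*16 + 2*2 = 52
```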
It should be noted that, when the object to be displayed is a character and the bitmap image of the character is not a binary image, the first display size in step 3031 and step 3032 is replaced with the character size, and the bitmap image corresponding to the character is processed to obtain a first control image; when the bitmap image is a binary image, the first display size in step 3032 is replaced by a character size, and the bitmap image corresponding to the character is processed to obtain a first control image.
In this solution, the character control images are spliced, so that the layout of the plurality of characters in the character string can be designed flexibly to meet the requirements of various scenarios.
Fig. 7 is a flowchart illustrating a display control method according to an embodiment of the present invention.
As shown in fig. 7, after step 303 shown in fig. 3, at least the following steps are included:
step 701, when a plurality of objects to be displayed exist, determining position information corresponding to the plurality of objects to be displayed; the position information indicates an area where the first control image corresponding to each of the plurality of objects to be displayed is displayed on the dot-matrix display screen.
For example, if the plurality of objects to be displayed are a plurality of characters, the position information may be a character number, and the character number indicates a sequence between the plurality of characters.
For example, assuming that the plurality of objects to be displayed are a character string and an image, the position information may be the position coordinates obtained by mapping the first control images of the character string and the image into the same coordinate system, for example, the coordinates of the 4 corner points of the character string and the 4 corner points of the image. Here, the coordinate system may be the coordinate system of the dot matrix display screen 202.
For example, assuming that the plurality of objects to be displayed are a character string and a video, the position information may be the position coordinates obtained by mapping the first control images of the character string and the video into the same coordinate system, for example, the coordinates of the 4 corner points of the character string and the 4 corner points of the video. Here, the coordinate system may be the coordinate system of the dot matrix display screen 202, or may be the coordinate system of a display screen of the electronic device.
Step 702, sending the position information and the first control images corresponding to the plurality of objects to be displayed to the dot matrix display, and instructing the dot matrix display to display the plurality of objects to be displayed based on the position information and the first control images corresponding to the plurality of objects to be displayed.
Specifically, the dot matrix display may establish a correspondence between the first control image and the different light emitting devices for each of the plurality of objects to be displayed based on the position information of the plurality of objects to be displayed, and based on this, control the light emitting devices to emit colors to display the plurality of objects to be displayed.
In the embodiment of the present invention, the electronic device 100 generates the first control image adapted to the display size of the object to be displayed, and when there are a plurality of objects to be displayed, may determine respective position information of the plurality of first control images, and output the first control image and the position information to the dot matrix display to display the plurality of objects to be displayed.
Fig. 8 is a schematic flow chart illustrating a display control method according to an embodiment of the present invention.
As shown in fig. 8, after step 303 shown in fig. 3, at least the following steps are included:
step 801, when a plurality of objects to be displayed exist, determining position information corresponding to the plurality of objects to be displayed; the position information indicates an area where the first control image corresponding to each of the plurality of objects to be displayed is displayed on the dot-matrix display screen.
The details are described above with reference to step 701 and will not be repeated.
Step 802, based on the position information and the second display size of the dot matrix display, processing a first control image of each of the plurality of objects to be displayed, and determining a second control image adapted to the second display size of the dot matrix display.
Here, the second display size of the dot matrix display is the size of the dot matrix display screen 202 in the dot matrix display 200, which is represented by the number of rows and columns of the dot matrix display screen 202.
Specifically, a blank image of the second display size of the dot matrix display is determined, and the first control image corresponding to each of the plurality of objects to be displayed is filled into the blank image based on the position information corresponding to the plurality of objects to be displayed, so that a second control image is obtained.
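A minimal sketch of this composition follows, assuming the position information is the top-left coordinate of each first control image on the dot matrix display screen and that the images are numpy arrays; the names are illustrative.

```python
# Compose a second control image from several first control images.
import numpy as np

OFF = 255   # assumed pixel value for "light emitting device off"

def compose_second_control_image(second_size, items):
    """second_size: (rows, cols); items: iterable of (first_control_image, (top, left))."""
    second = np.full(second_size, OFF, dtype=np.uint8)       # blank image of the second display size
    for image, (top, left) in items:
        h, w = image.shape
        second[top:top + h, left:left + w] = image           # fill at the indicated area
    return second
```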
Step 803, sending the second control image to the dot matrix display, and instructing the dot matrix display to display the plurality of objects to be displayed based on the second control image.
Specifically, the dot matrix display may establish a correspondence between the second control image and different light emitting devices, based on which the light emitting devices are controlled to emit colors, displaying a plurality of objects to be displayed.
In the embodiment of the present invention, the electronic device 100 generates a first control image adapted to the display size of the object to be displayed, and when there are a plurality of objects to be displayed, may determine respective position information of the plurality of objects to be displayed, and convert the plurality of first control images into a second control image adapted to the display size of the dot matrix display based on the position information, and output the second control image to the dot matrix display to display the plurality of objects to be displayed.
The following describes an actual application scenario of the display control method provided by the embodiment of the present invention.
In a specific implementation, the electronic device 100 displays a first page, and the user may operate on the first page; the electronic device 100 thereby obtains the operation information of the user and, based on it, may obtain the first data, and may also obtain the first display size, the position information of a plurality of objects to be displayed, the first control image, and the second control image.
As shown in fig. 9a, the first page includes a dot matrix display selection area 901, a scheme area 902, an image address input box 903, an image submit button 904, an image thumbnail area 905, an image size input box 906, a video address input box 907, a video submit button 908, a video thumbnail area 909, a video size input box 910, a character string input box 911, a font 912, a character size input box 913, a canvas size input box 914, a preview button 915, a submit button 916, a stitching button 917, a preview list 918, and a preview area 919.
Optionally, the dot matrix display selection area 901 has an input box, and a user may input an address of the dot matrix display in the input box, or may click the input box to present a drop-down list of the dot matrix display for selecting the dot matrix display in the list.
Optionally, the scheme area 902 displays a plurality of schemes for processing the bitmap image to obtain the first control image. Fig. 9a shows scheme 1, scheme 2, scheme 3, and so on. Scheme 1 may be the decision-tree pixel aggregation scheme described above, and scheme 2 may be a scheme in which the first control image is operated manually. Although the embodiment of the present invention only shows two schemes, in practical applications different schemes can be designed according to different requirements. In addition, these schemes can be classified into text schemes, image schemes, and video schemes.
Alternatively, the user may input the address of the image in the input box 903 of the image address. The image address can be a local image address or a remote image address. In this way, the electronic device 100 can obtain an identification of the image.
Alternatively, the user may click on the image submit button 904, jump to the page for image selection, and select an image. Thereafter, the electronic device 100 may display a thumbnail of the uploaded picture in the image thumbnail area 905.
Alternatively, the user may enter the image size in the image size input box 906; if the user does not enter the image size, the image size is a default size, which is the size of the dot matrix display screen 202 in the dot matrix display 200 selected by the user.
Alternatively, the user may input the address of the video in the video address input box 907. The video address may be a local video address or a remote video address. In this way, the electronic device 100 may obtain the identifier of the video.
Alternatively, the user may click on the video submit button 908, jump to the video selection page, and select the video. Thereafter, the electronic apparatus 100 may display a thumbnail of the uploaded video in the video thumbnail area 909.
Alternatively, the user may enter the video size in the video size entry box 910, and if the user does not enter the video size, the video size is the default size, which is the size of the dot matrix display screen 202 in the dot matrix display selected by the user. Here, the video size may be understood as the size of images in the video, and the size of each image in the video is the same.
Alternatively, the user may input a character string at the input box 911 of the character string. In this way, the electronic device 100 may obtain an identification of the character string, or alternatively, a character identification of each character in the character string. Here, the identification of the character string is composed of a character identification of each character in the character string.
Alternatively, the user can click on the font 912 to display a list of fonts and select a font in the list. If the user does not select a font, a default font is used, for example, Microsoft YaHei.
Alternatively, the user may input a character size in the character size input box 913; if the user does not input the character size, a default size is used, which is the size of the dot matrix of the dot matrix display screen 202 in the dot matrix display selected by the user. In practical applications, an input box for the number of character string columns, an input box for the column spacing, and an input box for the line spacing may also be provided, so that the number of character string columns, the column spacing, and the line spacing are obtained. Further, an input box for the number of character rows may likewise be provided, so that the number of character rows is obtained.
Optionally, the user may enter the canvas size in the canvas size input box 914; if the user does not enter the canvas size, a default size is used, which may be determined according to actual requirements. The canvas size is typically larger than the size of the dot matrix display screen 202 in the dot matrix display 200 selected by the user.
Alternatively, the user may click the preview button 915 to display, in the preview area 919, the first control image corresponding to any one of the image indicated by the image address input by the user, the uploaded image, the video indicated by the video address, the uploaded video, and the character string. The preview area 919 displays a single first control image, or a video formed of a plurality of first control images. It is to be noted that, when the user previews several of these contents, the preview list 918 displays a thumbnail of each first control image that has been displayed in the preview area 919.
It is to be noted that, optionally, when the first control image represents a character string, the first control image may be treated as a text box: the user may adjust the length and width of the first control image and the position of each line of characters in the preview area 919, so that the layout of the characters in the character string is adjusted automatically without changing the character size or the character order.
Further, when the submit button 916 is clicked, the first control image currently displayed in the preview area 919 may be uploaded to the dot matrix display selected by the user. Note that a thumbnail may be a thumbnail of an image or a thumbnail of a video. The following describes how the first control image sequence of a video can be modified; the specific steps are as follows:
When a video is displayed, the preview area 919 may provide a modification button. If the user wishes to modify a first control image of the video, the user may click the video to pause playback and then click the modification button; the preview area then displays a left-shift button and a right-shift button. Clicking the left-shift button displays the first control image of the frame preceding the currently displayed one, and clicking the right-shift button displays the first control image of the frame following it. The user can click on the currently displayed first control image in the preview area to change its pixel values. After the modification is complete, the user may click the modification button again and click the first control image to resume video playback.
Optionally, the preview list 918 is used to display thumbnails of the previewed first control images. The user may click on a thumbnail in the preview list 918 and display a first control image corresponding to the thumbnail in the preview area 919. The user may also delete thumbnails in the preview list 918.
Optionally, the user may click the stitching button 917 to jump to a second page as shown in fig. 9b. The second page is provided with a combination area 920, a preview button 921, a preview area 922, and a submit button 923. The user may drag thumbnails from the preview list 918 into the combination area 920 to combine them; the user may then click the preview button 921, and an image corresponding to the combined thumbnails is displayed in the preview area 922. The image may be the second control image, or the first control images corresponding to the plurality of objects to be displayed. In addition, the user may operate on the image displayed in the preview area 922 to change the pixel values of its pixels. When the user clicks the submit button 923, the image displayed in the preview area 922 may be uploaded to the dot matrix display selected by the user. Note that, optionally, the preview area 922 is adapted to the size of the dot matrix display screen 202; in that case, the coordinate information of the first control image of each of the plurality of objects to be displayed in the preview area 922 is the above-mentioned position information. Optionally, the size of the preview area 922 is N times the size of the dot matrix display screen 202, for example, the number of rows of the preview area 922 is N times the number of rows of the dot matrix display screen 202 and the number of columns of the preview area 922 is N times the number of columns of the dot matrix display screen 202; in that case, the coordinates of the first control image of each of the plurality of objects to be displayed in the preview area 922, divided by N, are the above-mentioned position information.
It should be noted that the first page and the second page are only examples; in practical applications, the content and layout of the first page and the second page may be designed reasonably according to actual requirements. For example, an input selection box is displayed on the first page, and the user clicks the input selection box to display a list including character string, image, and video. If the user clicks character string, the character string input box 911, the font 912, and the character size input box 913 are displayed; if the user clicks image, the image address input box 903, the image submit button 904, the image thumbnail area 905, and the image size input box 906 are displayed; if the user clicks video, the video address input box 907, the video submit button 908, the video thumbnail area 909, and the video size input box 910 are displayed.
Based on this, in one possible scenario, the user may display a character string input by the user through the electronic device 100, and the electronic device 100 generates a bitmap image of the character string in the process of displaying it. In order to reduce the amount of data processing, the bitmap image of the object to be displayed in the embodiment of the present invention may be the bitmap image generated by the electronic device 100 during the display of the character string.
In one possible scenario, a user may display a user input string via electronic device 100, and electronic device 100 generates a bitmap image of the string during display of the string. Considering that the size of the bitmap image generated during the display of the character string by the electronic device 100 may be too large or too small, the electronic device 100 may invoke the rendering module described in step 302 above to regenerate the bitmap image of the character string based on the relevant information (generally, the identifier of the character string, and the stored data) obtained during the display of the character string.
In yet another possible scenario, a user may input an image address, a video address, an uploaded image, or an uploaded video through the electronic device 100, and accordingly, the electronic device 100 may obtain an identification of the image or the video. The bitmap image of the object to be displayed according to the embodiment of the present invention may be the bitmap image generated by the electronic device 100 invoking the rendering module based on the identifier of the image or the video.
Next, a detailed example description is given of a display control method provided by an embodiment of the present invention, with reference to fig. 2, fig. 9a, and fig. 9 b. The embodiment of the invention mainly provides a display control method of character strings, images and videos.
Example 1, the object to be displayed is a character string, and fig. 10 is a flowchart illustrating a display control method for a character string according to an embodiment of the present invention. As shown in fig. 10, the display control method for a character string includes the following steps.
Step 1001, the electronic device 100 displays a first page.
Details of the first page are provided with reference to fig. 9a and the above description of fig. 9 a.
According to one possible implementation, the client 101 is installed on the electronic device 100, and the user accesses the client 101 through the electronic device 100 to display the first page.
According to a possible implementation manner, the electronic device 100 is equipped with a browser, and a user can access the client 101 through the browser, and the client 101 displays a first page; for example, a web address is entered in a browser to access the client 101.
For example, when accessing the client 101 for the first time, the user first enters a login page of the client, operates the login page to register an account, and manually sets an account password to obtain an account and an account password that can access the client 101; then, the user inputs an account and an account password on the login page, and can log in the client 101 to access and use various services provided by the client 101. Of course, the user may repeatedly log in to the client 101 through an account and an account password.
In step 1002, the electronic device 100 determines, based on the operation of the user on the first page, a character string identifier, a character size, a character string column number, a column pitch, and a line pitch corresponding to the information input by the user on the first page.
Here, the operation information in step 3011 indicates the operation of the first page by the user.
The character string identifier includes a character identifier (i.e., the first identifier described in step 301) of each character in the character string, and the electronic device 100 may specifically determine the character string identifier through the following two implementation manners.
According to one possible implementation, the user may input the character string desired to be displayed through a mechanical keyboard or virtual keys of the electronic device 100, thereby causing the electronic device 100 to obtain the character string identification.
According to another possible implementation, the microphone of the electronic device 100 collects the voice signal of the user and recognizes the voice signal to obtain the recognized character string identifier.
In some possible implementations, a new-page function is provided in the electronic device 100; for example, the user's operation triggering the new-page function is obtained by providing a button, receiving voice, receiving a gesture instruction, or the like, without being limited to the illustrated examples. In response to the user triggering the new-page function, the electronic device 100 displays a page for acquiring the object to be displayed, for example, in the form of a pop-up window. An input box can be provided in this page, or user input can be acquired in the form of voice or the like, so that the character string identifier can be obtained.
The electronic device 100 may specifically determine the character size in two implementations as follows.
According to one possible implementation, a user may enter a desired character size through a mechanical keyboard or virtual keys of the electronic device 100, thereby causing the electronic device 100 to obtain the character size.
According to one possible implementation, the electronic device 100 may use a default character size as the character size without the user entering the desired character size through the electronic device 100.
In practical applications, input boxes for the number of character string columns, the column spacing, and the line spacing may be used; alternatively, the default number of character string columns, column spacing, and line spacing of the electronic device 100 are used directly. Illustratively, the column spacing and the line spacing may be 0.
In step 1003, the electronic device 100 displays the character string input by the user on the first page based on the character string identifier.
The character string is displayed in the input box of the character string in the first page shown in fig. 9 a.
Step 1004, the electronic device 100 generates and displays a first control image corresponding to the character string identifier based on the user's preview operation on the first page, the character size, the number of character string columns, the column spacing, and the line spacing.
Here, the preview operation may be that the user clicks a preview button 915 on the first page shown in fig. 9 a.
Specifically, after the preview operation of the user is acquired, firstly, the character string identifier is decomposed to obtain a character identifier array.
Specifically, in the embodiment of the present invention, when the character string identifier m input by the user is composed of several character identifiers, m.split('') may be used to decompose m into individual character identifiers; all the character identifiers obtained by the decomposition constitute a character identifier array, which corresponds to the character string identifier. For example, when the character string represented by the character string identifier to be displayed is "please turn right", the character array represented by the decomposed character identifier array is ['please', 'right', 'turn'] (one entry per character). It should be noted that a character identifier consists of a character code and a font.
Next, the electronic device 100 generates a rendering request by using each character identifier in the character identifier array, the brush color, the identifier of the rendering module, the character size, the number of character string columns, the column spacing, and the line spacing as parameter values of the input parameters, and invokes the rendering module of the operating system based on the rendering request to generate and display the first control image. Illustratively, the preview area 919 on the first page shown in fig. 9a displays the first control image. Specifically, a character control image of each character in the character string may be generated based on the character size, and then the character control images are spliced according to the number of character string columns, the column spacing, and the line spacing to obtain the first control image. Here, the detailed process of generating the first control image is as described above in steps 3033 to 3035.
In step 1005, the electronic device 100 obtains the modification operation on the first control image to obtain the updated first control image.
Here, the user clicks on the first control image displayed in the preview area 919 on the first page shown in fig. 9a, thereby changing black pixels to white pixels and white pixels to black pixels.
In addition, the number of character rows and columns of the first control image can be adjusted; further, the user can change the distance between rows and the alignment of each row of characters, such as centered, left-aligned, or right-aligned.
In step 1006, the electronic device 100 converts the updated first control image into a second control image based on the submission operation of the user on the first page.
Specifically, the electronic device 100 determines a blank image of the second display size of the dot matrix display, and fills the updated first control image to a middle position of the blank image to obtain a second control image.
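Under the same numpy-array assumption as the earlier sketches, the centering fill of step 1006 might look as follows.

```python
# Fill the first control image into the middle of a blank second-size image.
import numpy as np

OFF = 255   # assumed "off" pixel value

def center_into_second(first: np.ndarray, second_size) -> np.ndarray:
    second = np.full(second_size, OFF, dtype=np.uint8)
    top = (second_size[0] - first.shape[0]) // 2
    left = (second_size[1] - first.shape[1]) // 2
    second[top:top + first.shape[0], left:left + first.shape[1]] = first
    return second
```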
Step 1007, the electronic device 100 sends a second control image to the dot matrix display 200.
At step 1008, the dot matrix display 200 displays the character string input by the user based on the second control image.
Here, the submit operation may be for the user to click the submit button 916 on the first page shown in fig. 9 a.
It should be noted that step 1003, step 1005, and the conversion into the second control image in step 1006 are optional steps.
In this solution, the electronic device generates the bitmap image of the character string, so the dot matrix controller does not need to carry a specific dot matrix font library, and the characters can be quickly converted and output.
Fig. 11 is a flowchart illustrating a display control of an image according to an embodiment of the present invention. As shown in fig. 11, the display control scheme of an image includes the following steps.
In step 1101, the electronic device 100 displays a first page.
Step 1102, the electronic device 100 determines an image identifier and an image size based on a user operation on the first page.
Here, the image identifier is a storage address of the local image or a storage address of the remote image.
Specifically, the user inputs the storage address of the remote image in the input box 903 of the image address on the first page shown in fig. 9a, thereby causing the electronic apparatus 100 to acquire the image identification. Alternatively, the user clicks the image upload button 904 on the first page shown in fig. 9a to upload the local image, thereby causing the electronic apparatus 100 to acquire the image identification.
Specifically, the user inputs the image size in the image size input box 906 on the first page shown in fig. 9a, thereby causing the electronic device 100 to acquire the image size. If the user does not input the image size, the image size is a default size, such as the second display size of the dot matrix display 200.
Step 1103, the electronic device 100 displays the image thumbnail on the first page based on the image identifier.
Specifically, the image thumbnail area 905 on the first page shown in fig. 9a displays the image thumbnail.
In step 1104, the electronic device 100 generates a bitmap image corresponding to the image identifier based on a preview operation of the user on the first page.
Here, the user clicks the preview button 915 on the first page shown in fig. 9a, thereby triggering the interaction with the rendering module.
Specifically, the electronic device 100 generates a rendering request by using the image identifier, the identifier of the rendering module, and the canvas size as parameter values of the input parameters, so as to invoke the rendering module to generate the bitmap image.
In step 1105, the electronic device 100 determines and displays a first control image based on the bitmap image and the image size.
For the details of obtaining the first control image with the adaptive image size from the bitmap image, see the description of step 3032 above, the first display size may be replaced by the image size, and details are not repeated.
Here, the preview area 919 on the first page shown in fig. 9a displays a first control image adapted to the image size.
In step 1106, the electronic device 100 obtains a modification operation on the first control image to obtain an updated first control image.
Here, the user clicks on the control image displayed in the preview area 919 on the first page shown in fig. 9a, thereby changing black pixels to white pixels and white pixels to black pixels.
In step 1107, the electronic device 100 converts the updated first control image into a second control image based on the submission operation of the user on the first page.
At step 1108, the electronic device 100 sends a second control image to the dot matrix display 200.
In step 1109, the dot matrix display 200 displays an image based on the second control image.
It is noted that step 1103, step 1106, and the conversion into the second control image in step 1107 are optional steps.
In this solution, the client relies on the operating system to generate the bitmap image, so the image can be quickly converted and output without carrying a specific dot matrix font library.
Fig. 12 is a schematic flowchart of video display control according to an embodiment of the present invention. As shown in fig. 12, the display control scheme of the video includes the following steps.
In step 1201, the electronic device 100 displays a first page.
Step 1202, the electronic device 100 determines a video identifier and a video size based on the operation of the user on the first page.
Here, the video identifier is a storage address of the local video or a storage address of the remote video.
Specifically, the user inputs the storage address of the remote video in the video address input box 907 on the first page shown in fig. 9a, so that the electronic device 100 acquires the video identifier. Alternatively, the user clicks the video submit button 908 on the first page shown in fig. 9a to upload a local video, so that the electronic device 100 acquires the video identifier.
Specifically, the user inputs the video size in the video size input box 910 on the first page shown in fig. 9a, thereby causing the electronic device 100 to acquire the video size.
Step 1203, the electronic device 100 generates a bitmap image sequence corresponding to the video identifier based on a submission operation of the user on the first page.
Specifically, the client 101 generates a rendering request by using the video identifier and the identifier of the rendering module as parameter values of the input parameters, so as to invoke the rendering module to generate the bitmap image sequence.
At step 1204, the electronic device 100 determines a first sequence of control images based on the sequence of bitmap images and the video size.
For each bitmap image in the bitmap image sequence, a first control image adapted to the video size is obtained based on that bitmap image; for details, refer to the description of step 3032 above, with the first display size replaced by the video size, which is not repeated here. After all bitmap images in the sequence have been processed, a first control image is obtained for each bitmap image, giving the first control image sequence.
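As a trivial illustration, the per-frame conversion can be expressed as a loop over the bitmap image sequence; to_first_control_image stands for the binarization and aggregation of steps 3031-3032 applied with the video size, and is an assumed callable rather than a defined interface.

```python
# Convert a bitmap image sequence into a first control image sequence.
def to_first_control_sequence(bitmaps, video_size, to_first_control_image):
    return [to_first_control_image(frame, video_size) for frame in bitmaps]
```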
Step 1205, the electronic device 100 obtains a modification operation on the first control image in the first control image sequence to obtain an updated first control image sequence.
In step 1206, the electronic device 100 converts the updated first control image sequence into a second control image sequence based on the submission operation of the user on the first page.
Step 1207, the electronic device 100 sends a second sequence of control images to the dot matrix display 200.
In step 1208, the dot matrix display 200 displays the video based on the second control image sequence.
It should be noted that the above steps 1205 and 1206, which relate to the conversion into the second control image, are optional steps.
In this scheme, the client relies on the operating system to generate the bitmap image sequence, so the video can be quickly converted and output without carrying a dedicated dot matrix font library.
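As a hedged sketch of the per-frame processing, assuming the operating system has already rendered the video into a list of bitmap frames and reusing the same illustrative binarise-and-scale logic as above (neither assumption is mandated by the embodiment), the first control image sequence can be built frame by frame:

from PIL import Image

def to_control_image_sequence(frames: list, video_size: tuple, threshold: int = 128) -> list:
    # Convert a sequence of bitmap frames into a first control image sequence.
    # Every output frame maps one pixel to one light-emitting device, so the
    # dot matrix display can play the frames back as a video.
    sequence = []
    for frame in frames:
        grey = frame.convert("L")                                    # greyscale
        binary = grey.point(lambda p: 255 if p >= threshold else 0)  # binarise
        sequence.append(binary.resize(video_size, Image.NEAREST))    # adapt to video size
    return sequence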
It is to be noted that, when there are a plurality of objects to be displayed, if the method shown in fig. 7 is adopted, the plurality of objects to be displayed may be composed of characters and/or images, such as a character, a character string, and an image.
For example, when the plurality of objects to be displayed are formed by a character string and an image, the first control images corresponding to the character string and the image may be determined in the manners shown in fig. 10 and fig. 11. The electronic device 100 then displays preview thumbnails of these first control images, and the user operates the preview thumbnails to determine the position information of the areas in which the first control images of the character string and the image are displayed on the dot matrix display screen 202. Then, the electronic device 100 splices the first control images of the character string and the image into a second control image based on those first control images and the position information, and sends the second control image to the dot matrix display 200, so that the dot matrix display 200 displays the character string and the image.
It should be noted that, when the character string and the image are combined, the character size of each character in the character string is not changed, but the area occupied by the character string may be changed, and the first control image is adapted accordingly. For example, the length and width of the preview thumbnail of the character string may be adjusted to change the area occupied by the character string. Specifically, if the character string is arranged beside the image, the length and width of the character string can be adjusted so that the area occupied by the character string changes and the layout of the character string and the image becomes more reasonable; the adjustment of the character string is similar to the adjustment of a text box in the prior art.
For example, when the plurality of objects to be displayed are formed by a plurality of character strings, the first control images corresponding to the plurality of character strings may be determined in the manner shown in fig. 10. The electronic device 100 then displays preview thumbnails of these first control images, and the user operates the preview thumbnails to update the first control images corresponding to the plurality of character strings and to determine their position information. Then, the electronic device 100 splices the first control images of the plurality of character strings into a second control image based on those first control images and the position information, and sends the second control image to the dot matrix display 200, so that the dot matrix display 200 displays the plurality of character strings.
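A minimal sketch of this splicing step, assuming the position information is the top-left coordinate of each object's area on the dot matrix display screen and that the second control image covers the full second display size (both assumptions are made only for illustration), could look as follows in Python with Pillow:

from PIL import Image

def splice_control_images(parts: list, second_display_size: tuple) -> Image.Image:
    # parts holds (first_control_image, (x, y)) pairs, where (x, y) is the
    # assumed top-left corner of the area the object occupies on the dot
    # matrix display screen.
    canvas = Image.new("L", second_display_size, 0)   # start with all LEDs off
    for control_image, position in parts:
        canvas.paste(control_image, position)         # place each object in its area
    return canvas

Under these assumptions, the resulting canvas plays the role of the second control image that is sent to the dot matrix display 200.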
Corresponding to the description of fig. 8, the plurality of objects to be displayed may be composed of characters, images, videos, and/or character strings, for example, a plurality of character strings; a character string and an image; a character string and a video; or an image and a video.
For example, when the plurality of objects to be displayed are composed of a character string, an image, and a video, the first control images corresponding to the plurality of objects to be displayed may be determined in the manners shown in fig. 10, fig. 11, and fig. 12. The electronic device 100 then displays preview thumbnails of these first control images, and the user operates the preview thumbnails to determine the position information of the first control images corresponding to the plurality of objects to be displayed. After that, the electronic device 100 sends the first control images and the position information corresponding to the plurality of objects to be displayed to the dot matrix display 200, so that the dot matrix display 200 displays the character string, the image, and the video. It should be noted that, if the image size, character size, or video size set by the user is not appropriate, the preview thumbnail may be adaptively adjusted after being operated, so as to fit the second display size of the dot matrix display screen 202.
In practical applications, as shown in fig. 9b, a plurality of preview thumbnails are displayed in the preview list, and a user may drag the preview thumbnails into the combination area 920 for combination, and then click the preview button 921, so as to display a second control image or a first control image corresponding to each of a plurality of objects to be displayed in the preview area 922.
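When the combined control image is finally sent to the dot matrix display 200, its pixels must be serialized into a byte stream. The following sketch is purely illustrative; the embodiment does not specify the transport format, so the packing order, the bit-per-pixel framing, and the function name are all assumptions.

def pack_control_image(control_image) -> bytes:
    # Pack a binary Pillow control image into a byte stream, 8 pixels per byte,
    # row by row (most significant bit first); the framing is an assumption,
    # not part of the embodiment.
    width, height = control_image.size
    out = bytearray()
    for y in range(height):
        byte, bits = 0, 0
        for x in range(width):
            byte = (byte << 1) | (1 if control_image.getpixel((x, y)) else 0)
            bits += 1
            if bits == 8:
                out.append(byte)
                byte, bits = 0, 0
        if bits:                       # pad the last byte of the row with zeros
            out.append(byte << (8 - bits))
    return bytes(out)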
Fig. 13 is a schematic structural diagram of an embodiment of an electronic device provided in the present invention. As shown in fig. 13, the electronic device 100 in this embodiment includes: a processor 111, a memory 112, a nonvolatile memory 1121, a random access memory 1122, an antenna 1, an antenna 2, a communication module 113, an audio module 114, a speaker 114A, a receiver 114B, a microphone 114C, an earphone interface 114D, a display screen 115, a sensor module 116, keys 117, and a mouse 118.
It is to be understood that the illustrated structure of the embodiment of the present invention does not specifically limit the electronic device 100. In other embodiments of the invention, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Some constituent components of the electronic device 100 will be described in detail below with reference to fig. 13.
The processor 111 may include one or more processors; for example, the processor 111 may include one or more of an application processor (AP), a modem, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), among others. The different processors may be separate devices or may be integrated into one or more processors. For example, the processor 111 may process the content that needs to be displayed in the display window of an application program on the electronic device 100. For example, the controller may generate operation control signals based on instruction operation codes and timing signals, so as to control instruction fetching and instruction execution.
In one example, a memory may also be provided in the processor 111 for storing instructions and data. In some examples, the memory in the processor 111 is a cache memory. The memory may hold instructions or data that the processor 111 has just used or uses cyclically. If the processor 111 needs to use the instructions or data again, they can be fetched directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor 111, and improves system efficiency.
The memory 112 may include an internal memory for storing computer-executable program code, and the executable program code includes instructions. The processor 111 implements the various functional applications and data processing of the electronic device 100 by executing the instructions stored in the internal memory. The internal memory may include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), a client, and the like. The data storage area may store data created during use of the electronic device 100 (such as audio signals and a phone book). Further, the internal memory may include the nonvolatile memory 1121, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS), and the random access memory 1122, which is commonly referred to as memory. In the embodiment of the present invention, the nonvolatile memory 1121 may store the image file, the character set, the font library, the operating system, and the client; during operation, the image file, the character set, the dot matrix or vector data in the font library, the operating system, and the client may be loaded into the random access memory 1122. In addition, the client may provide the first page shown in fig. 9a and the second page shown in fig. 9b.
In one example, the memory 112 further includes an external memory card, such as a Micro SD card, connected via an external memory interface to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 111 through an external memory interface to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the communication module 113, a modem, a baseband processor, and the like. The communication module comprises a wireless communication module and a wired communication module.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other examples, the antenna may be used in conjunction with a tuning switch.
The mobile communication module may provide a solution for wireless communication including 2G/3G/4G/5G and the like applied to the electronic device 100. The mobile communication module may include at least one filter, a switch, a power amplifier, a low-noise amplifier (LNA), and the like. The mobile communication module may receive electromagnetic waves through at least two antennas including the antenna 1, filter and amplify the received electromagnetic waves, and transmit the result to the modem for demodulation. The mobile communication module may also amplify the signal modulated by the modem and convert it into electromagnetic waves radiated through the antenna 1. In some examples, at least some of the functional modules of the mobile communication module may be provided in the processor 111. In some examples, at least some of the functional modules of the mobile communication module may be provided in the same device as at least some of the modules of the processor 111.
The wireless communication module may provide a solution for wireless communication applied to the electronic device 100, including a wireless local area network (WLAN) (for example, a wireless fidelity (Wi-Fi) network), Bluetooth (BT), a global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module may be one or more devices integrating at least one communication processing module. The wireless communication module receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on the electromagnetic wave signals, and transmits the processed signals to the processor 111. The wireless communication module may also receive a signal to be transmitted from the processor 111, frequency-modulate and amplify the signal, and convert it into electromagnetic waves radiated via the antenna 2.
The electronic device 100 may implement audio functions via the audio module 114, the speaker 114A, the receiver 114B, the microphone 114C, the headphone interface 114D, and the application processor. Such as music playing, recording, etc.
The audio module 114 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 114 may also be used to encode and decode audio signals. In some examples, the audio module 114 may be disposed in the processor 111, or some functional modules of the audio module 114 may be disposed in the processor 111.
The speaker 114A, also called a "horn", is used to convert electrical audio signals into sound signals. The electronic apparatus 100 can listen to music through the speaker 114A or listen to a handsfree call.
The receiver 114B, also called "earpiece", is used to convert the electrical audio signal into a sound signal. When the electronic device 100 receives a call or voice information, it can receive the voice by placing the receiver 114B close to the ear of the person.
The microphone 114C, also referred to as a "mic", is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can input a sound signal into the microphone 114C by speaking with his or her mouth close to the microphone 114C. The electronic device 100 may be provided with at least one microphone 114C. In other examples, the electronic device 100 may be provided with two microphones 114C to implement a noise reduction function in addition to collecting sound signals. In other examples, the electronic device 100 may further be provided with three, four, or more microphones 114C to collect sound signals, reduce noise, identify sound sources, implement directional recording, and so on.
The earphone interface 114D is used to connect a wired earphone. The earphone interface 114D may be a USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The electronic device 100 implements a display function through the GPU, the display screen 115, the memory 112, a digital-to-analog converter, and the application processor. The GPU is a microprocessor for image processing and connects the display screen 115 with the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 111 may include one or more GPUs that execute program instructions to generate or change display information. The memory 112 includes a video memory for storing data processed by the GPU or by the application processor, that is, information for each pixel that can be output to the display screen 115. The video memory may be an independent memory that does not occupy the random access memory 1122; a portion of the random access memory 1122 may be shared as the video memory; or the video memory may consist of an independent video memory together with a shared portion of the random access memory 1122. The digital-to-analog converter is used to convert the data in the video memory into analog signals, and the display screen 115 implements display based on these analog signals. Notably, the display screen 115 may display the first page shown in fig. 9a and the second page shown in fig. 9b, both provided by the client.
Specifically, the display screen 115 is used to display images, videos, and the like. The display screen 115 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini LED, a Micro OLED, a quantum dot light-emitting diode (QLED), or the like. In some examples, the electronic device 100 may include one or more display screens 115. In one example, the display screen 115 may be used to display an interface of an application, a display window of an application, and the like.
The sensor module 116 may include a camera, a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like. Here, the touch sensor is also referred to as a "touch panel". The touch sensor may be disposed on the display screen 115, and the touch sensor and the display screen 115 together form a touchscreen. The touch sensor is used to detect touch operation data acting on or near it. The touch sensor may pass the detected touch operation data to the application processor to determine the type of touch event. Visual output related to the touch operation data may be provided through the display screen 115.
The electronic device 100 may further include external input devices such as the keys 117 and the mouse 118. The keys 117 include a power key, volume keys, an input keyboard, and the like. The keys may be mechanical keys or touch keys. The electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100.
It is noted that the electronic device 100 here has the display screen 115. In one case, the display screen 115 may be operated directly; specifically, the electronic device may be a mobile phone, a tablet, or the like. In another case, external operation devices such as the keys 117 (for example, a keyboard) and the mouse 118 are connected to the electronic device 100, data is input through the keyboard, and the content displayed on the display screen 115 is operated through the mouse 118. Specifically, the electronic device 100 may be a terminal device such as a notebook or a desktop computer.
Notably, the display screen 115 of the electronic device 100 and the dot matrix display screen 202 differ greatly. For the same matrix size, the resolution of the display screen 115 of the electronic device 100 is much higher than that of the display units in the dot matrix display screen 202, and the higher the resolution, the clearer the displayed content. Therefore, the display principle of the electronic device 100 differs greatly from that of the dot matrix display 200. The embodiment of the present invention uses the image processing function and the rendering function of the electronic device 100 to obtain a control image adapted to the dot matrix display 200, so that the display content of the dot matrix display screen 202 in the dot matrix display 200 can be flexibly controlled without being limited by the resources of the dot matrix display 200.
In addition to the above method and electronic device, embodiments of the present invention may also provide a computer program product comprising computer program instructions which, when executed by a processor, cause the processor to perform the steps of the display control method according to various embodiments of the present invention described in the "method" section of this specification. The computer program code for carrying out operations of embodiments of the present invention may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The computer program code may be in source code form, object code form, an executable file, or some intermediate form. The computer program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, embodiments of the present invention may also provide a computer-readable storage medium having stored thereon computer program instructions which, when executed by a processor, cause the processor to perform the steps of the display control method according to various embodiments of the present invention described in the "method" section of this specification. The computer-readable storage medium may take any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. It should be noted that the scope of the computer-readable medium may be appropriately expanded or restricted according to the requirements of legislation and patent practice in the relevant jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunication signals.
In the above embodiments, the description of each embodiment has its own emphasis, and reference may be made to the related description of other embodiments for parts that are not described or recited in any embodiment.
The basic principles of the present invention have been described above with reference to specific embodiments. However, it should be noted that the advantages, effects, and the like mentioned in the present invention are merely examples and are not limiting, and these advantages and effects should not be regarded as necessarily possessed by every embodiment of the present invention. Furthermore, the specific details disclosed above are provided only for the purpose of illustration and ease of understanding, and are not limiting; the invention is not limited to the specific details described above.
The block diagrams of devices, apparatuses, and systems involved in the present invention are given only as illustrative examples and are not intended to require or imply that the connections, arrangements, and configurations must be made in the manner shown in the block diagrams. As will be appreciated by those skilled in the art, these devices, apparatuses, and systems may be connected, arranged, and configured in any manner. Words such as "including", "comprising", and "having" are open-ended words that mean "including, but not limited to", and may be used interchangeably therewith. The words "or" and "and" as used herein mean, and are used interchangeably with, the word "and/or", unless the context clearly dictates otherwise. The word "such as" is used herein to mean, and is used interchangeably with, the phrase "such as but not limited to".
It should also be noted that in the apparatus, devices and methods of the present invention, the components or steps may be broken down and/or re-combined. These decompositions and/or recombinations are to be regarded as equivalents of the present invention.
The foregoing description has been presented for purposes of illustration and description. Furthermore, the description is not intended to limit embodiments of the invention to the form disclosed herein. While a number of example aspects and embodiments have been discussed above, those of skill in the art will recognize certain variations, modifications, alterations, additions and sub-combinations thereof.
It is to be understood that various numerical references referred to in the embodiments of the present invention are only for convenience of description and distinction, and are not intended to limit the scope of the embodiments of the present invention.

Claims (10)

1. A display control method, characterized in that the method comprises:
the electronic equipment determines first data, wherein the first data is used for indicating an object to be displayed;
the electronic equipment generates a bitmap image of the object to be displayed based on the first data;
the electronic device determining, based on the bitmap image, a first control image adapted for a first display size; the pixel value of each pixel in the first control image indicates the display color value of a light-emitting device in a dot matrix display, the dot matrix display comprises a dot matrix display screen formed by a plurality of light-emitting devices, and the first display size indicates the size of an area on the dot matrix display screen, wherein the area is used for displaying the object to be displayed;
and the electronic equipment sends the data of the first control image to the dot matrix display, and instructs the dot matrix display to control the display color value of the light-emitting device based on the data of the first control image, so that the display of the object to be displayed is realized.
2. The method according to claim 1, wherein the first data is a first identifier of the object to be displayed, or the first data is description data of the object to be displayed.
3. The method according to claim 2, wherein the object to be displayed is a character, the first identifier is a character code and a font, and the description data is a bitmap image or vector data representing the character; or,
the object to be displayed is an image, the first identifier is an image storage address, and the description data is an image file or data obtained by decoding the image file.
4. The method of claim 1, further comprising:
the electronic equipment acquires operation information of a user;
the electronic equipment determines the first data based on the operation information of the user.
5. The method of claim 1, wherein the object to be displayed is a character string, the character string comprises a plurality of characters, the bitmap image comprises a bitmap image corresponding to each of the plurality of characters, and the electronic device determines, based on the bitmap image, a first control image adapted to a first display size, comprising:
the electronic equipment determines the character size; wherein the character size indicates a size of an area on the dot matrix display screen where the character is displayed;
the electronic equipment respectively processes the bitmap images corresponding to the characters based on the character sizes to obtain character control images corresponding to the characters with the character sizes;
and the electronic equipment splices the character control images corresponding to the respective characters, and determines the first control image of the character string that is adapted to the first display size.
6. The method of claim 1, wherein when the bitmap image is not a binary image, the electronic device determines, based on the bitmap image, a control image adapted for a first display size, comprising:
the electronic equipment binarizes the bitmap image to obtain a binary image;
and the electronic equipment scales the binary image based on a first display size to obtain a first control image adaptive to the first display size.
7. The method of claim 6, wherein the electronic device scaling the binary image to obtain the control image adapted to the first display size comprises:
the electronic equipment determines at least one decision tree of the object to be displayed; wherein the at least one decision tree is used for performing pixel aggregation on first window data of a first window on the binary image;
and the electronic equipment carries out pixel aggregation on first window data in the process that the first window traverses the binary image through the at least one decision tree to obtain the control image which is adaptive to the first display size.
8. The method of claim 1, further comprising:
when a plurality of objects to be displayed exist, the electronic equipment determines position information corresponding to the plurality of objects to be displayed; the position information indicates the area of the first control image corresponding to each of the plurality of objects to be displayed, which is displayed on the dot matrix display screen;
the electronic equipment sends the position information and the first control images corresponding to the plurality of objects to be displayed to the dot matrix display, and instructs the dot matrix display to display the plurality of objects to be displayed based on the position information and the first control images corresponding to the plurality of objects to be displayed; or,
the electronic equipment processes the first control images corresponding to the objects to be displayed respectively based on the position information and a second display size of the dot matrix display, and determines a second control image matched with the second display size;
the electronic device sends the second control image to the dot matrix display, instructing the dot matrix display to display the plurality of objects to be displayed based on the second control image.
9. An electronic device, comprising:
at least one memory for storing a program;
at least one processor for executing the memory-stored program, the processor being configured to perform the method of any of claims 1-8 when the memory-stored program is executed.
10. A display system, characterized in that the system comprises an electronic device and a dot matrix display, wherein the electronic device is adapted to perform the method according to any of claims 1-8.
CN202211046940.5A 2022-08-30 2022-08-30 Display control method, electronic equipment and display system Pending CN115424569A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211046940.5A CN115424569A (en) 2022-08-30 2022-08-30 Display control method, electronic equipment and display system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211046940.5A CN115424569A (en) 2022-08-30 2022-08-30 Display control method, electronic equipment and display system

Publications (1)

Publication Number Publication Date
CN115424569A true CN115424569A (en) 2022-12-02

Family

ID=84200268

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211046940.5A Pending CN115424569A (en) 2022-08-30 2022-08-30 Display control method, electronic equipment and display system

Country Status (1)

Country Link
CN (1) CN115424569A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116033014A (en) * 2023-03-28 2023-04-28 深圳市智岩科技有限公司 Light-emitting data transmission method, light-emitting control device, medium and product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination