CN108228074B - Display control method, display system, electronic device, and computer-readable medium - Google Patents

Display control method, display system, electronic device, and computer-readable medium

Info

Publication number
CN108228074B
Authority
CN
China
Prior art keywords
image data
display
displayed
image
display parameters
Prior art date
Legal status
Active
Application number
CN201810109012.6A
Other languages
Chinese (zh)
Other versions
CN108228074A (en)
Inventor
周大凯
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201810109012.6A priority Critical patent/CN108228074B/en
Publication of CN108228074A publication Critical patent/CN108228074A/en
Application granted granted Critical
Publication of CN108228074B publication Critical patent/CN108228074B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The display control method includes: acquiring original image data of an image to be displayed as first image data; acquiring display parameters of the image to be displayed; processing the first image data based on the display parameters to obtain processed image data as second image data; and controlling at least one display device to perform display based on the second image data. The present disclosure also provides an electronic device, a display system, and a computer-readable medium.

Description

Display control method, display system, electronic device, and computer-readable medium
Technical Field
The present disclosure relates to a presentation control method, a presentation system, an electronic device, and a computer-readable medium.
Background
With the development of electronic technology, electronic devices of all kinds have become increasingly widespread; in particular, devices with a display function can present rich content to users and are therefore well liked. However, in the course of implementing the inventive concept, the inventors found at least the following problem in the prior art: in an electronic device that has a display or can be connected to an external display device, the display function must be managed by software designed by the original device manufacturer. If the display function is to be actively managed, very complicated interaction with this black box is required, so the display function is essentially beyond the user's control.
Disclosure of Invention
One aspect of the disclosure provides a display control method, which includes acquiring original image data of an image to be displayed as first image data, acquiring display parameters of the image to be displayed, processing the first image data based on the display parameters to obtain processed image data as second image data, and controlling at least one display device to perform display based on the second image data.
Optionally, the display parameter includes at least one of a display device for displaying the image to be displayed, a size and/or a position of the image to be displayed on the display area, or a shape of the image to be displayed on the display area.
Optionally, the method further includes acquiring touch operation data, processing the touch operation data based on the display parameter, and generating a touch instruction based on the processed touch operation data.
Optionally, in a case where there are multiple images to be displayed, acquiring the display parameters of the images to be displayed includes acquiring display parameters corresponding to each of some or all of the images to be displayed; and processing the first image data based on the display parameters to obtain the processed image data as the second image data includes processing, based on the display parameters, the first image data of the images to be displayed corresponding to those display parameters to obtain at least one processing result, and generating the processed image data as the second image data based on the at least one processing result.
Another aspect of the disclosure provides a display system including a first obtaining module, a second obtaining module, a first processing module, and a control module. The first obtaining module is configured to obtain original image data of an image to be displayed as first image data. The second obtaining module is configured to obtain display parameters of the image to be displayed. The first processing module is configured to process the first image data based on the display parameters to obtain processed image data as second image data. The control module is configured to control at least one display device to perform display based on the second image data.
Optionally, the display parameter includes at least one of a display device for displaying the image to be displayed, a size and/or a position of the image to be displayed on the display area, or a shape of the image to be displayed on the display area.
Optionally, the system further includes a third obtaining module, a second processing module, and a generating module. The third obtaining module is configured to obtain touch operation data. The second processing module is configured to process the touch operation data based on the display parameters. The generating module is configured to generate a touch instruction based on the processed touch operation data.
Optionally, the second obtaining module includes an obtaining sub-module configured to obtain display parameters corresponding to each of some or all of the images to be displayed. The first processing module includes a processing sub-module and a generating sub-module. The processing sub-module is configured to process, based on the display parameters, the first image data of the image to be displayed corresponding to the display parameters to obtain at least one processing result. The generating sub-module is configured to generate the processed image data as second image data based on the at least one processing result.
Another aspect of the present disclosure provides an electronic device including a processor and a memory having machine-readable instructions stored thereon which, when executed by the processor, cause the processor to: acquire original image data of an image to be displayed as first image data; acquire display parameters of the image to be displayed; process the first image data based on the display parameters to obtain processed image data as second image data; and control at least one display device to perform display based on the second image data.
Optionally, the display parameter includes at least one of a display device for displaying the image to be displayed, a size and/or a position of the image to be displayed on the display area, or a shape of the image to be displayed on the display area.
Optionally, the processor further performs acquiring touch operation data, processing the touch operation data based on the display parameter, and generating a touch instruction based on the processed touch operation data.
Optionally, in a case where there are multiple images to be displayed, acquiring the display parameters of the images to be displayed includes acquiring display parameters corresponding to each of some or all of the images to be displayed; and processing the first image data based on the display parameters to obtain the processed image data as the second image data includes processing, based on the display parameters, the first image data of the images to be displayed corresponding to those display parameters to obtain at least one processing result, and generating the processed image data as the second image data based on the at least one processing result.
Another aspect of the disclosure provides a non-volatile storage medium storing computer-executable instructions for implementing the method as described above when executed.
Another aspect of the disclosure provides a computer program comprising computer executable instructions for implementing the method as described above when executed.
Drawings
For a more complete understanding of the present disclosure and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
fig. 1A and 1B schematically illustrate an application scenario of a presentation control method according to an embodiment of the present disclosure;
FIG. 2A schematically illustrates a flow chart of a presentation control method according to an embodiment of the present disclosure;
FIG. 2B schematically shows a structural diagram for implementing a presentation control method according to an embodiment of the present disclosure;
FIG. 3 schematically illustrates a flow chart of a presentation control method according to another embodiment of the present disclosure;
FIG. 4 schematically illustrates a flow chart of a presentation control method according to another embodiment of the present disclosure;
FIG. 5 schematically illustrates a block diagram of a presentation system according to an embodiment of the present disclosure;
FIG. 6 schematically illustrates a block diagram of a presentation system according to another embodiment of the present disclosure;
FIG. 7 schematically illustrates a block diagram of a presentation system according to another embodiment of the present disclosure; and
fig. 8 schematically illustrates a block diagram of an electronic device of an application presentation control method or system according to an embodiment of the present disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is illustrative only and is not intended to limit the scope of the present disclosure. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Where a convention analogous to "at least one of A, B, and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B, and C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). Where a convention analogous to "at least one of A, B, or C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B, or C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). It will be further understood by those within the art that virtually any disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms. For example, the phrase "A or B" should be understood to include the possibility of "A", "B", or "A and B".
Some block diagrams and/or flow diagrams are shown in the figures. It will be understood that some blocks of the block diagrams and/or flowchart illustrations, or combinations thereof, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the instructions, which execute via the processor, create means for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks.
Accordingly, the techniques of this disclosure may be implemented in hardware and/or software (including firmware, microcode, etc.). In addition, the techniques of this disclosure may take the form of a computer program product on a computer-readable medium having instructions stored thereon for use by or in connection with an instruction execution system. In the context of this disclosure, a computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the instructions. For example, the computer readable medium can include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. Specific examples of the computer readable medium include: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and/or wired/wireless communication links.
The embodiment of the disclosure provides a display control method, which includes acquiring original image data of an image to be displayed as first image data, acquiring display parameters of the image to be displayed, processing the first image data based on the display parameters to obtain processed image data as second image data, and controlling at least one display device to perform display based on the second image data, so that the display of the image to be displayed can be flexibly controlled.
Fig. 1A and 1B schematically illustrate application scenarios of a presentation control method according to an embodiment of the present disclosure. It should be noted that fig. 1A and 1B are only examples of scenarios in which the embodiments of the present disclosure may be applied to help those skilled in the art understand the technical content of the present disclosure, but do not mean that the embodiments of the present disclosure may not be used in other devices, systems, environments or scenarios.
As shown in fig. 1A, an existing presentation system 50 will by default use the entire presentation area 51 for presenting the image to be presented. When there are two or more images to be presented, an existing presentation system that wants to present them simultaneously has to go through a very complicated process of interacting with the operating system; since the operating system is essentially a black box, that process is largely beyond its control.
As shown in fig. 1B, the display system 100 provided by the embodiment of the present disclosure can freely display an image to be displayed on a portion 120 of the display area 110. In this way, when there are a plurality of images to be displayed, the images to be displayed may be displayed at different positions of the display area 110 as needed.
Fig. 2A schematically illustrates a flow chart of a presentation control method according to an embodiment of the present disclosure.
As shown in fig. 2A, the method includes operations S210 to S240.
In operation S210, original image data of an image to be displayed is acquired as first image data.
In operation S220, display parameters of the image to be displayed are acquired.
In operation S230, the first image data is processed based on the presentation parameter, and the processed image data is obtained as second image data.
In operation S240, at least one presentation device is controlled to present based on the second image data.
According to this method, the display parameters of the image to be displayed are acquired, the original image data is processed based on those parameters, and the second image data that is finally displayed is generated, so that the display process of the image can be flexibly controlled.
According to the embodiments of the present disclosure, the display driving function can be designed as three parts: the first part performs the conventional display driving function and stores the data of the image to be displayed as first image data; the second part is used to configure the display parameters; and the third part reprocesses the first image data into second image data according to the first image data and the display parameters, and stores the second image data into a cache of the physical display device.
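As a reading aid rather than part of the disclosure, the three-part split can be pictured with a short sketch. The Python fragment below is a minimal sketch under assumed conventions: the class name, the dictionary-based buffer groups, and the list-of-rows frame representation are all assumptions made for this illustration, not elements of the patent.

```python
from dataclasses import dataclass, field


@dataclass
class DisplayDriver:
    """Sketch of the three-part design: raw frames, parameters, composed output."""
    first_buffers: dict = field(default_factory=dict)   # part 1: raw image data, keyed by image id
    second_buffers: dict = field(default_factory=dict)  # part 2: display parameters per image id
    third_buffer: list = field(default_factory=list)    # part 3: composed frame for the device cache

    def store_raw(self, image_id: str, frame: list) -> None:
        # Part 1: behave like a conventional display driver, but keep the
        # original image data (first image data) instead of writing it to the screen.
        self.first_buffers[image_id] = frame

    def store_params(self, image_id: str, params: dict) -> None:
        # Part 2: parameters configured by management software,
        # e.g. {"x": 100, "y": 50} for a paste position (assumed format).
        self.second_buffers[image_id] = params

    def compose(self, width: int, height: int) -> list:
        # Part 3: reprocess every raw frame according to its parameters and
        # write the result (second image data) into the third buffer.
        self.third_buffer = [[(0, 0, 0)] * width for _ in range(height)]
        for image_id, frame in self.first_buffers.items():
            params = self.second_buffers.get(image_id, {"x": 0, "y": 0})
            for row, line in enumerate(frame):
                for col, pixel in enumerate(line):
                    y, x = params["y"] + row, params["x"] + col
                    if 0 <= y < height and 0 <= x < width:
                        self.third_buffer[y][x] = pixel
        return self.third_buffer
```

A caller would invoke store_raw and store_params for each image and then copy the result of compose into the cache of the physical display device.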
According to an embodiment of the present disclosure, the display parameters may include the display device used to display the image to be displayed. For example, when the system is connected to multiple display devices, the display parameters can specify which device displays a given image, which improves the flexibility of display control.
According to an embodiment of the present disclosure, the display parameters may include the size and/or position of the image to be displayed on the display area. Different display parameters can place the image to be displayed at different positions and sizes in the display area: the image may sit at an edge or in the center, and may occupy part of the area or the full screen. This makes control convenient and makes it easy to display multiple images on the same display device.
According to an embodiment of the present disclosure, the display parameters may include the shape of the image to be displayed on the display area. For example, the original image data may be cropped or deformed so that images are displayed in the same or different shapes in the display area, which enriches the display.
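The three kinds of display parameters discussed above can be pictured as one record per image to be displayed. This is again only an illustration: the field names below are assumptions rather than terminology from the disclosure, and the record is a richer alternative to the plain x/y dictionary used in the previous sketch.

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class DisplayParams:
    """One assumed set of display parameters for one image to be displayed."""
    device_id: int = 0                      # which connected display device shows the image
    position: Tuple[int, int] = (0, 0)      # top-left corner inside the display area
    size: Optional[Tuple[int, int]] = None  # target width and height; None keeps the original size
    shape: str = "rectangle"                # e.g. "rectangle", "circle", or a crop/deform rule


# Example: show the image on device 1 at half size, centered in a 1920 x 1080 area,
# cropped to a circle.
params = DisplayParams(device_id=1, position=(480, 270), size=(960, 540), shape="circle")
```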
The structure for implementing the presentation control method according to the embodiment of the present disclosure is described below with reference to fig. 2B.
As shown in fig. 2B, the structure includes caches 231, 232, and 233. The cache 231 belongs to the first part of the display driving function described above; it receives and stores the original image data of the image to be displayed, provided by, for example, the operating system 210. The cache 232 belongs to the second part; it receives and stores the display parameters provided by, for example, the management software 220. The cache 233 belongs to the third part: the image to be displayed in the cache 231 is processed according to the display parameters in the cache 232 to generate second image data, and the cache 233 stores that second image data.
In a typical structure, the operating system 210 writes the image to be displayed directly into the cache 233. Therefore, if the way the image is displayed needs to be changed, the operating system has to be manipulated in a complicated manner; and since the operating system is essentially a black box, such control is largely out of reach.
In the embodiment of the present disclosure, a multi-cache scheme is adopted and combined with image processing such as mosaic (image stitching) techniques, so that the display process of the image to be displayed can be flexibly controlled.
According to the embodiment of the present disclosure, each of the caches 231, 232, and 233 may be a cache group composed of a plurality of caches that store multiple pieces of information independently of one another. For example, the caches 231, 232, and 233 may be a first cache group, a second cache group, and a third cache group, respectively.
According to the embodiment of the present disclosure, in operation S210, the original image data of the image to be displayed is obtained as the first image data, for example, the original image data of the image to be displayed may be obtained from the operating system 210 by the cache 231 as the first image data.
According to the embodiment of the present disclosure, in operation S220, the display parameters of the image to be displayed are acquired; for example, the display parameters provided by the management software 220 may be received and stored by the cache 232. The display parameters stored in the cache 232 correspond to the image data stored in the cache 231, and when there are multiple images to be displayed, there may be multiple sets of display parameters, each corresponding to one image to be displayed.
According to the embodiment of the present disclosure, in operation S230, the first image data is processed based on the display parameters, and the processed image data is obtained as second image data, which is stored in, for example, the cache 233.
According to the embodiment of the present disclosure, in operation S240, at least one display device is controlled to perform display based on the second image data; for example, the second image data in the cache 233 may be transferred into the display cache of one or more display devices so that those devices display the image.
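Continuing the illustrative sketches above (and again not the patent's implementation), transferring the composed second image data to the selected devices amounts to one copy per target device; the per-device cache mapping is an assumption of this sketch.

```python
def push_to_devices(third_buffer: list, device_caches: dict, target_ids: list) -> None:
    """Copy the composed frame into the display cache of each target device.

    `device_caches` maps a device id to that device's frame cache (a list of
    pixel rows); both the mapping and the cache format are assumptions.
    """
    for device_id in target_ids:
        cache = device_caches.get(device_id)
        if cache is None:
            continue  # device not connected; skip it
        # Row-by-row copy so the existing cache object is reused in place.
        cache[:] = [row[:] for row in third_buffer]


# Example: send the same composed frame to the display caches of devices 0 and 1.
# push_to_devices(driver.third_buffer, device_caches, target_ids=[0, 1])
```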
Fig. 3 schematically shows a flow chart of a presentation control method according to another embodiment of the present disclosure.
As shown in fig. 3, the method further includes operations S310 to S330 based on the foregoing embodiment.
In operation S310, touch operation data is acquired.
In operation S320, the touch operation data is processed based on the presentation parameter.
In operation S330, a touch instruction is generated according to the processed touch operation data.
For example, take a rectangular display area whose pixel coordinates are (x, y), where x ∈ [0, a] and y ∈ [0, b]. Suppose the display parameters of an image to be displayed are x' = 0.5x and y' = 0.5y. Then, when the user operates at a point (m, n) (where m ≤ 0.5a and n ≤ 0.5b), the method can determine from the display parameters that the operation acts on that image to be displayed, process the touch operation data (m, n) to obtain processed touch operation data (2m, 2n), and generate a touch instruction on the image to be displayed based on the processed touch operation data.
According to this method, when the image to be displayed is shown according to the display parameters, the touch operation data can be restored according to those same parameters, so that a correct touch instruction is generated.
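The coordinate arithmetic in the example above can be written out directly. The sketch below generalizes the 0.5 scale factor to an arbitrary scale and offset; the function name and its parameters are assumptions, and only the mapping of (m, n) to (2m, 2n) comes from the text.

```python
def restore_touch(point, scale=(0.5, 0.5), offset=(0, 0)):
    """Map a touch point on the display area back to coordinates on the original image.

    The image was drawn at `offset` and scaled by `scale`, so the inverse
    transform removes the offset and then divides by the scale.
    """
    m, n = point
    sx, sy = scale
    ox, oy = offset
    return ((m - ox) / sx, (n - oy) / sy)


# The example from the text: scale 0.5 in both directions and no offset,
# so a touch at (m, n) is restored to (2m, 2n) on the image to be displayed.
assert restore_touch((100, 80)) == (200.0, 160.0)
```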
Fig. 4 schematically shows a flow chart of a presentation control method according to another embodiment of the present disclosure.
As shown in fig. 4, the method includes S210, S240, and S410 to S430. Operations S210 and S240 are similar to those of the previous embodiments, and are not described herein again.
In operation S410, a display parameter corresponding to each of some or all of the images to be displayed is acquired.
In operation S420, first image data of an image to be displayed corresponding to the display parameter is processed based on the display parameter, and at least one processing result is obtained.
In operation S430, processed image data is generated as second image data based on the at least one processing result.
The method can process at least one image to be displayed based on the display parameters to generate second image data containing a plurality of images to be displayed. For example, by designing the presentation parameters, multiple images may be presented in the same or different sizes at different locations of the presentation area, one image may be overlaid on another image, and so forth.
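Operations S410 to S430 can be read as a loop over pairs of image data and display parameters whose results are merged into a single output frame. The sketch below is an illustration under assumptions (dictionary-shaped parameters, nearest-neighbour scaling, draw order deciding which image overlays which), not the method itself.

```python
def compose_multiple(images: dict, params_by_id: dict, width: int, height: int) -> list:
    """Merge several images to be displayed into one frame of second image data.

    `images` maps an image id to a frame (a list of rows of pixels) and
    `params_by_id` maps the same id to a dict such as {"x": 10, "y": 20, "scale": 0.5};
    both formats are assumptions.  Images drawn later overlay images drawn earlier.
    """
    output = [[(0, 0, 0)] * width for _ in range(height)]
    for image_id, frame in images.items():
        params = params_by_id.get(image_id)
        if params is None or not frame:
            continue  # no parameters configured for this image, or empty frame: skip it
        scale = params.get("scale", 1.0)
        dst_h = int(len(frame) * scale)
        dst_w = int(len(frame[0]) * scale)
        for row in range(dst_h):
            for col in range(dst_w):
                y, x = params["y"] + row, params["x"] + col
                if 0 <= y < height and 0 <= x < width:
                    # Nearest-neighbour sampling from the source frame.
                    output[y][x] = frame[int(row / scale)][int(col / scale)]
    return output
```

For example, two images given the parameters {"x": 0, "y": 0, "scale": 0.5} and {"x": 960, "y": 0, "scale": 0.5} would appear side by side in a 1920-pixel-wide frame, while identical positions would make the later image cover the earlier one.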
Fig. 5 schematically illustrates a block diagram of a presentation system 500 according to an embodiment of the present disclosure.
As shown in fig. 5, the presentation system 500 includes a first obtaining module 510, a second obtaining module 520, a first processing module 530, and a control module 540.
The first obtaining module 510, for example, performs the operation S210 described above with reference to fig. 2A, for obtaining original image data of an image to be shown as first image data.
The second obtaining module 520, for example, performs the operation S220 described above with reference to fig. 2A, for obtaining the display parameters of the image to be displayed.
The first processing module 530, for example, performs the operation S230 described above with reference to fig. 2A, and is configured to process the first image data based on the presentation parameter, and obtain the processed image data as the second image data.
The control module 540, for example, performs the operation S240 described above with reference to fig. 2A, for controlling at least one presentation device to present based on the second image data.
According to the embodiment of the present disclosure, the display parameters include at least one of a display device for displaying the image to be displayed, a size and/or a position of the image to be displayed on the display area, or a shape of the image to be displayed on the display area.
Fig. 6 schematically illustrates a block diagram of a presentation system 600 according to another embodiment of the present disclosure.
As shown in fig. 6, the presentation system 600 further includes a third obtaining module 610, a second processing module 620, and a generating module 630 on the basis of the presentation system 500.
The third obtaining module 610, for example, performs the operation S310 described above with reference to fig. 3, for obtaining touch operation data.
The second processing module 620, for example, performs the operation S320 described above with reference to fig. 3, for processing the touch operation data based on the presentation parameter.
The generating module 630, for example, performs the operation S330 described above with reference to fig. 3, for generating a touch instruction based on the processed touch operation data.
Fig. 7 schematically illustrates a block diagram of a presentation system 700 according to another embodiment of the present disclosure.
As shown in fig. 7, the presentation system 700 is similar to the presentation system 500, wherein the second obtaining module 520 may further include an obtaining sub-module 710, and the first processing module 530 may further include a processing sub-module 720 and a generating sub-module 730.
The obtaining sub-module 710, for example, performs the operation S410 described above with reference to fig. 4, for obtaining the display parameters corresponding to each of part or all of the images to be displayed.
The processing sub-module 720, for example, performs the operation S420 described above with reference to fig. 4, and is configured to process the first image data of the image to be displayed corresponding to the display parameter based on the display parameter, so as to obtain at least one processing result.
The generating sub-module 730, for example, performs the operation S430 described above with reference to fig. 4, for generating the processed image data as the second image data based on the at least one processing result.
It is understood that the modules described above may be combined into one module, or any one of them may be split into multiple modules. Alternatively, at least part of the functionality of one or more of these modules may be combined with at least part of the functionality of other modules and implemented in one module. According to an embodiment of the present disclosure, at least one of the above modules may be implemented at least partially as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system in a package, or an Application Specific Integrated Circuit (ASIC), or may be implemented in hardware or firmware in any other reasonable manner of integrating or packaging a circuit, or by a suitable combination of software, hardware, and firmware implementations. Alternatively, at least one of the above modules may be implemented at least partly as a computer program module which, when executed by a computer, can perform the functions of the respective module.
Fig. 8 schematically illustrates a block diagram of an electronic device of an application presentation control method or system according to an embodiment of the present disclosure.
As shown in fig. 8, the electronic device 800 includes a processor 810 and a computer-readable storage medium 820. The electronic device 800 may perform the method described above with reference to fig. 2A, 3, or 4 to achieve flexible control of presenting an image to be presented.
In particular, processor 810 may include, for example, a general purpose microprocessor, an instruction set processor and/or related chip set and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), and/or the like. The processor 810 may also include on-board memory for caching purposes. Processor 810 may be a single processing unit or a plurality of processing units for performing different actions of the method flows described with reference to fig. 2A, 3 or 4 in accordance with embodiments of the present disclosure.
Readable storage medium 820 may be, for example, any medium that can contain, store, communicate, propagate, or transport the instructions. For example, a readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. Specific examples of the readable storage medium include: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and/or wired/wireless communication links.
The readable storage medium 820 may include a computer program 821, which computer program 821 may include code/computer-executable instructions that, when executed by the processor 810, cause the processor 810 to perform a method flow, such as described above in connection with fig. 2A, 3, or 4, and any variations thereof.
The computer program 821 may be configured with, for example, computer program code comprising computer program modules. For example, in an example embodiment, the code in the computer program 821 may include one or more program modules, for example module 821A, module 821B, and so on. It should be noted that the division and the number of modules are not fixed, and those skilled in the art may use suitable program modules or combinations of program modules according to the actual situation; when these program modules are executed by the processor 810, they enable the processor 810 to perform, for example, the method flows described above in connection with fig. 2A, 3, or 4, and any variations thereof.
According to an embodiment of the disclosure, processor 810 may perform the method flows described above in conjunction with fig. 2A, 3, or 4, and any variations thereof.
According to an embodiment of the present disclosure, at least one of the modules described above may be implemented as a computer program module as described with reference to fig. 8, which, when executed by the processor 810, may implement the corresponding operations described above.
Those skilled in the art will appreciate that the features recited in the various embodiments and/or claims of the present disclosure may be combined and/or integrated in various ways, even if such combinations or integrations are not expressly recited in the present disclosure. In particular, the features recited in the various embodiments and/or claims may be combined and/or integrated in various ways without departing from the spirit and teaching of the present disclosure. All such combinations and/or integrations fall within the scope of the present disclosure.
While the disclosure has been shown and described with reference to certain exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the disclosure as defined by the appended claims and their equivalents. Accordingly, the scope of the present disclosure should not be limited to the above-described embodiments, but should be defined not only by the appended claims, but also by equivalents thereof.

Claims (10)

1. A display control method includes:
acquiring original image data of an image to be displayed as first image data, wherein the first image data is provided by an operating system and received and stored by a first cache group;
acquiring display parameters of the image to be displayed, wherein the display parameters are provided by management software, and are received and stored by a second cache group;
processing the first image data based on the display parameters to obtain processed image data serving as second image data, wherein the second image data is stored by a third cache group; and
controlling at least one display device to display based on the second image data;
wherein the first, second, and third cache groups are configured to store a plurality of pieces of information independently of each other.
2. The method of claim 1, wherein the display parameters comprise at least one of:
a display device for displaying the image to be displayed;
the size and/or position of the image to be displayed on the display area; or
the shape of the image to be displayed on the display area.
3. The method of claim 1, further comprising:
acquiring touch operation data;
processing the touch operation data based on the display parameters; and
generating a touch instruction based on the processed touch operation data.
4. The method according to claim 1, wherein, in a case where there are a plurality of images to be displayed,
the acquiring of the display parameters of the image to be displayed comprises:
acquiring display parameters corresponding to each of some or all of the images to be displayed, and
the processing of the first image data based on the display parameters to obtain processed image data as second image data comprises:
processing, based on the display parameters, first image data of an image to be displayed corresponding to the display parameters to obtain at least one processing result; and
generating processed image data as second image data based on the at least one processing result.
5. An electronic device, comprising:
a processor; and
a memory having stored thereon machine-readable instructions that, when executed by the processor, cause the processor to perform:
acquiring original image data of an image to be displayed as first image data, wherein the first image data is provided by an operating system and received and stored by a first cache group;
acquiring display parameters of the image to be displayed, wherein the display parameters are provided by management software, and are received and stored by a second cache group;
processing the first image data based on the display parameters to obtain processed image data serving as second image data, wherein the second image data is stored by a third cache group; and
controlling at least one display device to display based on the second image data;
wherein the first, second, and third cache groups are configured to store a plurality of pieces of information independently of each other.
6. The electronic device of claim 5, wherein the display parameters comprise at least one of:
a display device for displaying the image to be displayed;
the size and/or position of the image to be displayed on the display area; or
the shape of the image to be displayed on the display area.
7. The electronic device of claim 5, wherein the processor further performs:
acquiring touch operation data;
processing the touch operation data based on the display parameters; and
generating a touch instruction based on the processed touch operation data.
8. The electronic device of claim 5, wherein, in a case where there are a plurality of images to be displayed,
the acquiring of the display parameters of the image to be displayed comprises:
acquiring display parameters corresponding to each of some or all of the images to be displayed, and
the processing of the first image data based on the display parameters to obtain processed image data as second image data comprises:
processing, based on the display parameters, first image data of an image to be displayed corresponding to the display parameters to obtain at least one processing result; and
generating processed image data as second image data based on the at least one processing result.
9. A display system, comprising:
the system comprises a first acquisition module, a second acquisition module, a first processing module, and a control module, wherein the first acquisition module is used for acquiring original image data of an image to be displayed as first image data, and the first image data is provided by an operating system and received and stored by a first cache group;
the second acquisition module is used for acquiring the display parameters of the image to be displayed, wherein the display parameters are provided by management software and are received and stored by a second cache group;
the first processing module is used for processing the first image data based on the display parameters to obtain processed image data serving as second image data, and the second image data is stored by a third cache group; and
the control module is used for controlling at least one display device to display based on the second image data;
wherein the first, second, and third cache groups are configured to store a plurality of pieces of information independently of each other.
10. A computer readable medium having stored thereon executable instructions which, when executed by a processor, cause the processor to perform the method of any one of claims 1 to 4.
CN201810109012.6A 2018-02-02 2018-02-02 Display control method, display system, electronic device, and computer-readable medium Active CN108228074B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810109012.6A CN108228074B (en) 2018-02-02 2018-02-02 Display control method, display system, electronic device, and computer-readable medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810109012.6A CN108228074B (en) 2018-02-02 2018-02-02 Display control method, display system, electronic device, and computer-readable medium

Publications (2)

Publication Number Publication Date
CN108228074A CN108228074A (en) 2018-06-29
CN108228074B true CN108228074B (en) 2022-03-25

Family

ID=62669427

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810109012.6A Active CN108228074B (en) 2018-02-02 2018-02-02 Display control method, display system, electronic device, and computer-readable medium

Country Status (1)

Country Link
CN (1) CN108228074B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110851749A (en) * 2018-07-25 2020-02-28 北京京东尚科信息技术有限公司 Picture display method and device, electronic equipment and readable medium
CN112199187B (en) * 2019-07-08 2024-06-18 北京字节跳动网络技术有限公司 Content display method, device, electronic equipment and computer readable storage medium
CN110490001B (en) * 2019-08-26 2023-05-16 西安闻泰电子科技有限公司 Method, device, equipment and storage medium for viewing image
CN116486759B (en) * 2023-04-11 2024-01-30 艺壹佳文化科技(广东)有限公司 Intelligent adjustment method, device, equipment and storage medium for identification display

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104750389A (en) * 2015-03-31 2015-07-01 努比亚技术有限公司 Picture displaying method and device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5343560A (en) * 1986-06-27 1994-08-30 Hitachi, Ltd. Image data display system
TW201110013A (en) * 2009-09-03 2011-03-16 Inventec Corp System and method for adjusting display area and display content based on zoom magnification
CN102346538B (en) * 2010-08-04 2014-11-26 华硕电脑股份有限公司 Computer system
US8760489B1 (en) * 2013-06-25 2014-06-24 Vonage Network Llc Method and apparatus for dynamically adjusting aspect ratio of images during a video call
JP6418185B2 (en) * 2016-03-10 2018-11-07 トヨタ自動車株式会社 Image display system for vehicles

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104750389A (en) * 2015-03-31 2015-07-01 努比亚技术有限公司 Picture displaying method and device

Also Published As

Publication number Publication date
CN108228074A (en) 2018-06-29


Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant