CN109275016B - Display control method and display control apparatus - Google Patents

Display control method and display control apparatus

Info

Publication number
CN109275016B
CN109275016B (application CN201811156094.6A)
Authority
CN
China
Prior art keywords
display
instruction
layer
event
control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811156094.6A
Other languages
Chinese (zh)
Other versions
CN109275016A (en)
Inventor
胡胜杰 (Hu Shengjie)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201811156094.6A
Publication of CN109275016A
Application granted
Publication of CN109275016B
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312 Generation of visual interfaces involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4316 Generation of visual interfaces for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • H04N21/47 End-user applications
    • H04N21/485 End-user interface for client configuration

Abstract

The present disclosure provides a display control method, including: detecting a display event of a second object if a first object is in a display state, wherein the display event of the second object comprises the second object being displayed in at least a partial area within the display area of the first object; and, if the display event of the second object exists, executing a first instruction based at least on the display event of the second object, wherein the first instruction is an instruction associated with controlling display of the second object. The present disclosure also provides a display control apparatus.

Description

Display control method and display control apparatus
Technical Field
The present disclosure relates to a display control method and a display control apparatus.
Background
With the rapid development of electronic technology, electronic devices are applied to ever more scenarios in daily life and work, and their functions are increasingly diversified. While an electronic device plays a video, other pictures often appear and cover the video, so that the user cannot watch it in full.
Disclosure of Invention
One aspect of the present disclosure provides a display control method including: detecting a display event of a second object if the first object is in a display state, wherein the display event of the second object comprises: displaying the second object in at least a partial area within the display area of the first object, and executing a first instruction based on at least a display event of the second object if the display event of the second object exists, wherein the first instruction is an instruction associated with controlling display of the second object.
Optionally, the executing, if there is a display event of the second object, the first instruction based on at least the display event of the second object includes: executing the first instruction if the display event of the second object meets a predetermined condition, and executing a second instruction different from the first instruction if the display event of the second object does not meet the predetermined condition, wherein the second instruction is an instruction for displaying the second object.
Optionally, the display event of the second object failing to satisfy the predetermined condition includes at least one of the following: the second object is generated after an input operation is detected, or the second object and the first object are from different applications.
Optionally, the first instruction includes an adjustment instruction, and the executing the first instruction based on at least the display event of the second object includes: executing the adjusting instruction at least based on the display event of the second object, wherein the adjusting instruction is used for reducing the perception rate of the second object.
Optionally, the adjusting instruction is configured to reduce a perception rate of the second object, and includes at least one of: the adjusting instruction is used for changing the display position of the second object, or the adjusting instruction is used for reducing the display size of the second object, or the adjusting instruction is used for improving the display transparency of the second object, or the adjusting instruction is used for preventing the display of the second object.
Optionally, the first instruction includes a control display instruction, and the executing the first instruction based at least on the display event of the second object includes: executing the control display instruction based at least on the display event of the second object, wherein the control display instruction is used for displaying a control, and the control is used for controlling the perception rate of the second object.
Optionally, the control includes at least one of a closing control, a zooming control, and an adjusting control, wherein the closing control is used for closing the second object, the zooming control is used for adjusting the display size of the second object, and the adjusting control is used for adjusting the display transparency of the second object.
Optionally, the first object includes a first display layer, the second object includes a second display layer, and the detecting a display event of the second object includes: and acquiring layer identifiers of the first display layer and the second display layer, wherein the layer identifiers are used for representing layer levels of the first display layer and the second display layer, and determining whether a display event of the second display layer exists based on the layer identifiers.
Optionally, the determining whether the display event of the second display layer exists based on the layer identifier includes: and determining the layer levels of the first display layer and the second display layer based on the layer identifications of the first display layer and the second display layer, wherein if the layer level of the second display layer is greater than the layer level of the first display layer, the second display layer covers at least part of the first display layer.
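The layer-level comparison in the two paragraphs above can be sketched as follows. This is a minimal illustration, not part of the disclosure; the `Layer` type and its fields are hypothetical stand-ins for whatever layer identifiers an implementation exposes.

```python
from dataclasses import dataclass


@dataclass
class Layer:
    """A display layer; a larger level means the layer is drawn on top."""
    identifier: str
    level: int


def has_display_event(first: Layer, second: Layer) -> bool:
    """Return True if the second display layer covers at least part of the
    first display layer, i.e. its layer level is greater than the first's."""
    return second.level > first.level


# A video occupies a lower layer than an advertisement overlaid on it.
video = Layer("video_layer", level=1)
ad = Layer("ad_layer", level=2)
assert has_display_event(video, ad)
assert not has_display_event(ad, video)
```

In this sketch the layer identifier carries only the level; a real compositor would also need the layers' geometry to decide how much of the first layer is actually covered.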
Another aspect of the present disclosure provides a display control apparatus including: the display device comprises a display interface and a processor, wherein the display interface is used for outputting a control signal which can control a display device to display a first object and a second object, and the processor is used for detecting a display event of the second object when the first object is in a display state, wherein the display event of the second object comprises: displaying the second object in at least a partial area within the display area of the first object, and executing a first instruction based on at least a display event of the second object if the display event of the second object exists, wherein the first instruction is an instruction associated with controlling display of the second object.
Another aspect of the present disclosure provides a display control apparatus including: the device comprises a detection module and an execution module. The detection module is used for detecting a display event of a second object if the first object is in a display state, wherein the display event of the second object comprises: the second object is displayed in at least partial area in the display area of the first object, and the execution module is used for executing a first instruction at least based on the display event of the second object if the display event of the second object exists, wherein the first instruction is an instruction associated with controlling the display of the second object.
Another aspect of the present disclosure provides a computer-readable storage medium storing computer-executable instructions for implementing the method as described above when executed.
Another aspect of the disclosure provides a computer program comprising computer executable instructions for implementing the method as described above when executed.
Drawings
For a more complete understanding of the present disclosure and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
fig. 1 schematically illustrates an application scenario of a display control method and a display control apparatus according to an embodiment of the present disclosure;
figs. 2A-2B schematically illustrate a flowchart of a display control method according to an embodiment of the present disclosure;
figs. 2C-2D schematically illustrate types of second objects according to embodiments of the present disclosure;
figs. 3A-3D schematically illustrate adjusting the display of a second object according to an embodiment of the disclosure;
figs. 4A-4C schematically illustrate adjusting the display of the second object via a control according to an embodiment of the present disclosure;
fig. 5 schematically shows a block diagram of a display control apparatus according to an embodiment of the present disclosure;
fig. 6 schematically illustrates a block diagram of a display control apparatus according to an embodiment of the present disclosure; and
fig. 7 schematically illustrates a block diagram of a computer system for a display control method according to an embodiment of the present disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is illustrative only and is not intended to limit the scope of the present disclosure. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the disclosure. It may be evident, however, that one or more embodiments may be practiced without these specific details. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Where a convention analogous to "at least one of A, B and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B and C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together, etc.). Where a convention analogous to "at least one of A, B or C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B or C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together, etc.).
Some block diagrams and/or flow diagrams are shown in the figures. It will be understood that some blocks of the block diagrams and/or flowchart illustrations, or combinations thereof, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the instructions, which execute via the processor, create means for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks. The techniques of this disclosure may be implemented in hardware and/or software (including firmware, microcode, etc.). In addition, the techniques of this disclosure may take the form of a computer program product on a computer-readable storage medium having instructions stored thereon for use by or in connection with an instruction execution system.
An embodiment of the present disclosure provides a display control method, including: detecting a display event of a second object if the first object is in a display state, wherein the display event of the second object comprises: and displaying the second object in at least a partial area within the display area of the first object, and executing a first instruction based on at least a display event of the second object if the display event of the second object exists, wherein the first instruction is an instruction associated with controlling display of the second object.
Fig. 1 schematically illustrates an application scenario of a display control method and a display control apparatus according to an embodiment of the present disclosure. It should be noted that fig. 1 is only an example of a scenario in which the embodiments of the present disclosure may be applied to help those skilled in the art understand the technical content of the present disclosure, but does not mean that the embodiments of the present disclosure may not be applied to other devices, systems, environments or scenarios.
As shown in fig. 1, the application scenario 100 includes an electronic device 110.
According to the embodiment of the present disclosure, the electronic device 110 is, for example, a device with a display function, and the electronic device 110 may be a computer, a mobile phone, a tablet, or the like.
In the disclosed embodiment, the display of the electronic device 110 may display a first image 111, and the first image 111 may be a picture or a video. For example, when a user views a video using the electronic device 110, the video may be displayed via a display of the electronic device.
As shown in the left diagram of fig. 1, the display of the electronic device 110 can also display a second image 112. The second image 112 may be a picture or a video and is different from the first image 111; for example, the second image 112 may be displayed on top of the first image 111. When the user is viewing a video (the first image 111), the second image 112 may be an advertisement overlaid on the video; it covers at least a portion of the video and prevents the user from viewing the video in full.
As shown in the right diagram of fig. 1, the electronic device 110 may detect whether the second image 112 covers the first image 111 and, if so, automatically control the display state of the second image 112, for example by automatically closing its display, so that the second image 112 does not affect the user's viewing of the first image 111.
Fig. 2A to 2B schematically show a flowchart of a display control method according to an embodiment of the present disclosure.
As shown in fig. 2A, the method includes operations S210 to S220.
In operation S210, if the first object is in a display state, a display event of a second object is detected, wherein the display event of the second object includes: the second object is displayed in at least a partial area within the display area of the first object.
According to an embodiment of the present disclosure, the first object may be, for example, an image, a video, or the like, displayed through a display unit of the electronic device. For example, when a user watches a video using the electronic device, the first object is that video. When the first object is in a display state, the first object displayed on the display unit occupies a display area.
In the disclosed embodiment, the second object may be, for example, an image, a video, or the like displayed by a display of the electronic device, the second object being a different object from the first object.
When the first object is in a display state, a display event of the second object is detected; the display event includes, for example, the second object being displayed within at least part of the display area of the first object. For example, when the second object is in the display state, part of the display content of the first object is occluded. Specifically, when the user is viewing a video (the first object), the second object (an advertisement) may be in a display state simultaneously with the first object, and the display event of the second object may be, for example, the advertisement blocking the video the user is watching.
In operation S220, if there is a display event of the second object, a first instruction is executed based on at least the display event of the second object, wherein the first instruction is an instruction associated with controlling display of the second object.
According to the embodiment of the disclosure, if there is a display event of the second object, for example the second object is displayed within the display area of the first object, a first instruction is executed. The first instruction controls the display mode of the second object so that the second object does not obstruct the display content of the first object; for example, the first instruction may close the display of the second object, move its display position, or adjust its display transparency.
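Operations S210-S220 can be sketched as a small dispatch routine. This is a hedged illustration only; the callback names and return strings are hypothetical, not part of the disclosure.

```python
def handle_display(first_object_displayed, detect_display_event, first_instruction):
    """Sketch of operations S210-S220: while the first object is in a display
    state, detect a display event of the second object (S210); if one exists,
    execute the first instruction (S220)."""
    if not first_object_displayed:
        return "idle"                       # nothing to detect (S210 precondition)
    if detect_display_event():
        first_instruction()                 # S220: control display of the second object
        return "first_instruction_executed"
    return "no_display_event"


executed = []
state = handle_display(True, lambda: True, lambda: executed.append("adjust second object"))
assert state == "first_instruction_executed" and executed == ["adjust second object"]
assert handle_display(False, lambda: True, lambda: None) == "idle"
```

The callbacks stand in for the concrete detection (e.g. the layer-level comparison) and the concrete first instruction (e.g. closing or moving the second object).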
As shown in fig. 2B, operation S220 includes operations S221 to S222.
In operation S221, if the display event of the second object satisfies a predetermined condition, a first instruction is executed.
According to an embodiment of the present disclosure, the predetermined condition may be, for example, that the second object comes from the same application as the first object; for instance, when the user watches a video in an application, the second object is an advertisement pushed by that same application. Alternatively, the predetermined condition may be that the second object was not generated by a user operation, that is, it appeared automatically and contrary to the user's expectation; for example, an advertisement that appears while the user watches a video blocks the video and runs counter to the user's desire to watch it.
According to the embodiment of the disclosure, when the display event of the second object meets the predetermined condition, the first instruction is executed. The first instruction controls the display of the second object, for example by closing the display of the second object, moving its display position, or adjusting its display transparency so that it does not obstruct the display content of the first object.
Figs. 2C-2D schematically illustrate types of second objects according to an embodiment of the present disclosure.
In operation S222, if the display event of the second object does not satisfy the predetermined condition, a second instruction different from the first instruction is executed, the second instruction being an instruction for displaying the second object.
According to an embodiment of the present disclosure, the display event of the second object does not satisfy a predetermined condition, including at least one of: the second object is generated after the input operation is detected, or the second object and the first object are from different applications.
Wherein the display event of the second object not meeting the predetermined condition includes that the second object is an object generated according to an input operation of a user.
As shown in fig. 2C, when the user is viewing a video (the first object), the user's input operation is an operation that requires adjusting the display of the first object. Specifically, when the user needs to pause or play the video, or to adjust its sound or brightness, the user usually taps the video's playback interface. On receiving the tap, the electronic device invokes, within the display area of the first object, a progress bar, a play or pause icon, sound or brightness adjustment icons, and the like. In this case, the second object generated upon detecting the user's input operation consists of these elements invoked by the tap.
As shown in fig. 2D, the display event of the second object failing to satisfy the predetermined condition includes the case where the second object comes from a different application than the first object. For example, the first object is a video played by a video playing application, and the second object is a short message received by a messaging application. A second object originating from the operating system may likewise be considered not to meet the predetermined condition.
According to the embodiment of the present disclosure, when the display event of the second object does not satisfy the predetermined condition, a second instruction different from the first instruction is executed; the second instruction may, for example, be used for displaying the second object. For example, when the second object is a progress bar, a play or pause icon, or a sound or brightness adjustment icon generated in response to the user's tap, or when it is a received short message, the second object is displayed.
In the embodiment of the present disclosure, if the display event of the second object does not satisfy the predetermined condition, the second object contains information the user needs, and there is no need to reduce the user's perception rate of it. For example, if the second object was generated in response to a user input operation, or its source cannot be determined (for example, it comes from a different application than the first object), the second object need not be treated as interference with the first object and can be displayed normally.
The embodiment of the disclosure can thus automatically and intelligently analyze the type of the second object and execute different instructions for different types. If the second object was generated in response to a user input operation, or its source cannot be determined (for example, it comes from a different application than the first object), the second object is displayed normally. Conversely, the type of the second object can be determined by exclusion: if the second object was neither generated in response to a user input operation nor comes from an application different from the first object, it is identified as interfering content, and the user's perception rate of it is reduced.
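The exclusion-style classification described above can be sketched as follows. The dictionary fields `app` and `user_generated` are hypothetical stand-ins for however an implementation records an object's origin; they are not named in the disclosure.

```python
def meets_predetermined_condition(second_obj: dict, first_obj: dict) -> bool:
    """Treat the second object as interference only if it was NOT generated
    by a user input operation AND comes from the SAME application as the
    first object (the exclusion rule described above)."""
    if second_obj.get("user_generated"):
        return False  # e.g. a progress bar invoked by the user's tap
    if second_obj.get("app") != first_obj.get("app"):
        return False  # e.g. an SMS notification from another application
    return True       # e.g. an ad pushed by the same app: reduce its perception rate


video = {"app": "player"}
ad = {"app": "player", "user_generated": False}
progress_bar = {"app": "player", "user_generated": True}
sms = {"app": "messages", "user_generated": False}

assert meets_predetermined_condition(ad, video)           # first instruction
assert not meets_predetermined_condition(progress_bar, video)  # second instruction
assert not meets_predetermined_condition(sms, video)           # second instruction
```

A return of `True` corresponds to executing the first instruction (operation S221); `False` corresponds to executing the second instruction and displaying the second object normally (operation S222).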
It is to be understood that the above examples of the display event of the second object not satisfying the predetermined condition (the second object is generated after an input operation is detected, or the second object and the first object come from different applications) are given to aid understanding of the technical solution of the present disclosure; the types of the second object should not be limited to these examples. A person skilled in the art may set the predetermined condition according to actual application requirements; for example, the display event of the second object not satisfying the predetermined condition may further include the second object coming from a specific application, which can be specified according to the actual situation.
Figs. 3A-3D schematically illustrate adjusting the display of a second object according to an embodiment of the disclosure.
In an embodiment of the disclosure, the first instruction includes an adjustment instruction, and the executing the first instruction based on at least a display event of the second object includes: and executing an adjusting instruction at least based on the display event of the second object, wherein the adjusting instruction is used for reducing the perception rate of the second object.
According to an embodiment of the present disclosure, the adjustment instruction adjusts the display of the second object so as to reduce its perception rate, that is, the degree to which the user perceives the second object. As the perception rate is reduced, the user notices the second object less, so the second object has less influence on the first object being displayed.
According to the embodiment of the present disclosure, reducing the perception rate of the second object includes doing so while keeping the second object in the display state; for example, when the second object is displayed and blocks the display content of the first object, its display position, display size, or display transparency can be adjusted.
Specifically, the adjusting instruction is used for reducing the perception rate of the second object, and comprises at least one of the following: the adjusting instruction is used for changing the display position of the second object, or the adjusting instruction is used for reducing the display size of the second object, or the adjusting instruction is used for improving the display transparency of the second object, or the adjusting instruction is used for preventing the display of the second object.
The specific adjustment process is shown in fig. 3A to 3D.
As shown in fig. 3A, the adjustment instruction is used for changing the display position of the second object. For example, the adjustment instruction can move the second object to an edge region of the display area of the first object, more specifically to the nearest edge region, so that the second object does not block the main display content of the first object. This lowers the user's perception of the second object at the edge and reduces its interference while the user views the first object.
As shown in fig. 3B, the adjustment instruction is for reducing the display size of the second object. For example, the reduced-size second object occludes less of the first object, which reduces the user's perception of the small second object and its impact on viewing the first object.
As shown in fig. 3C, the adjustment instruction is used to increase the display transparency of the second object. For example, with increased display transparency, the second object occludes the first object less; that is, the user can view, through the second object, the display content of the first object that it covers. When the user views the first object, the degree of perception of the highly transparent second object is reduced.
As shown in fig. 3D, the adjustment instruction is for preventing display of the second object. For example, the adjustment instruction can prevent the second object from being displayed; since the second object is not displayed, it does not occlude the first object and thus does not affect the user's viewing of the first object.
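The four adjustment modes of figs. 3A to 3D can be sketched in code as follows. This is a minimal illustrative sketch and not the patent's implementation; the `Overlay` model, the `apply_adjustment` function, the mode names, and all field names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Overlay:
    # Hypothetical model of the second object's display state.
    x: int          # top-left position within the first object's display area
    y: int
    width: int
    height: int
    alpha: float    # 0.0 = fully transparent, 1.0 = opaque
    visible: bool = True

def apply_adjustment(overlay: Overlay, area_w: int, area_h: int, mode: str) -> Overlay:
    """Reduce the overlay's perception rate in one of the four ways of figs. 3A-3D."""
    if mode == "move_to_edge":       # fig. 3A: snap to the nearest horizontal edge
        overlay.x = 0 if overlay.x < (area_w - overlay.width) / 2 else area_w - overlay.width
    elif mode == "shrink":           # fig. 3B: halve the display size
        overlay.width //= 2
        overlay.height //= 2
    elif mode == "fade":             # fig. 3C: raise the display transparency
        overlay.alpha = min(overlay.alpha, 0.3)
    elif mode == "block":            # fig. 3D: prevent display entirely
        overlay.visible = False
    return overlay
```

Each branch only touches the one display attribute named by the corresponding figure, so the modes can also be combined if an embodiment requires it.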
Figs. 4A-4C schematically illustrate diagrams of adjusting the display of the second object via a control, according to embodiments of the present disclosure.
According to an embodiment of the present disclosure, the first instruction includes a control display instruction, and the executing the first instruction based on at least a display event of the second object includes: and executing a control display instruction at least based on the display event of the second object, wherein the control is used for controlling the perception rate of the second object.
According to an embodiment of the present disclosure, the first instruction may also be a control display instruction, which is used to display different types of controls through which the user can control the perception rate of the second object. For example, the controls can receive operations made by the user, and the perception rate of the second object is controlled according to the received user operations.
Wherein the different types of controls are for example as described below.
For example, the controls include at least one of: a close control, a zoom control, and an adjustment control. The close control is used to close the second object, the zoom control is used to adjust the display size of the second object, and the adjustment control is used to adjust the display transparency of the second object.
The specific adjustment process is shown in fig. 4A to 4C.
As shown in fig. 4A, the close control is used to close the second object. The closing control may be displayed in a display area of the first object, for example, and the user may close the second object by clicking on the closing control.
As shown in fig. 4B, the zoom control is used to adjust the display size of the second object. For example, the zoom control may be displayed within the display area of the first object, the zoom control may be a slider, and the user may resize the display size of the second object by dragging the slider, e.g., may zoom in or out on the display size of the second object. Alternatively, the zoom control may further include a zoom-out icon and a zoom-in icon, and the user may adjust the display size of the second object by clicking the zoom-out icon or the zoom-in icon.
As shown in fig. 4C, the adjustment control is used to adjust the display transparency of the second object. For example, the adjustment control may be displayed within the display area of the first object, the adjustment control may be a slider, and the user may adjust the display transparency of the second object by dragging the slider, e.g., may increase or decrease the display transparency of the second object.
In an embodiment of the disclosure, the first object comprises a first display layer and the second object comprises a second display layer.
The display layers are layers for displaying images or videos, the display layers are hierarchical, different display layers can be used for displaying different images or videos, the display layers can be displayed in a stacking mode, and the upper layer can block the display of the lower layer. For example, a first object (video) viewed by a user is a first display layer, and a second display layer (e.g., advertisement) located above the first display layer can block display content of the first display layer.
In an embodiment of the present disclosure, detecting a display event of the second object includes: and acquiring layer identifiers of the first display layer and the second display layer, wherein the layer identifiers are used for representing layer levels of the first display layer and the second display layer.
Each display layer has a corresponding layer identifier. The layer identifiers can represent the hierarchical relationship of the layers, and the stacking order of the layers can be obtained through the layer identifiers. For example, if the layer identifier of the first display layer is "0" and the layer identifier of the second display layer is "1", then since a larger layer identifier denotes a higher layer level, the second display layer is located above the first display layer. When the first display layer is a video watched by the user, the second display layer may be an advertisement that pops up while the user watches the video; the advertisement is displayed on top of the video and blocks the video content.
Then, whether a display event of the second display layer exists is determined based on the layer identifiers.
According to the embodiment of the present disclosure, the display event of the second display layer may be, for example, that the second display layer is displayed in a partial display area of the first display layer, that is, the second display layer is located on the first display layer and blocks partial display content of the first display layer. The stacking relationship between the first display layer and the second display layer can be confirmed through the layer identifiers of the first display layer and the second display layer, that is, whether the second display layer is located on the first display layer or not can be confirmed.
For example, based on the layer identifiers of the first display layer and the second display layer, the layer levels of the first display layer and the second display layer are determined, and if the layer level of the second display layer is greater than the layer level of the first display layer, the second display layer covers at least part of the first display layer.
For example, by obtaining layer identifiers of a plurality of current layers, whether the second display layer is located on the first display layer is determined according to the layer identifiers. Specifically, for example, if the obtained layer identifier of the first display layer is "0", the obtained layer identifier of the second display layer is "1", and the layer level of the second display layer is higher than the layer level of the first display layer, it may be indicated that the second display layer is located on the first display layer, and it further indicates that the second display layer covers at least part of the first display layer.
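The level comparison just described can be sketched as a small predicate. The function name and the string form of the identifiers are assumptions for illustration, following the "0"/"1" example above.

```python
def has_display_event(first_layer_id: str, second_layer_id: str) -> bool:
    """A display event of the second display layer exists when the second
    layer's level (given by its layer identifier) is greater than the first
    layer's, meaning the second layer is stacked above the first layer and
    covers at least part of it."""
    return int(second_layer_id) > int(first_layer_id)
```

With the identifiers of the example, `has_display_event("0", "1")` holds, indicating that the second display layer covers at least part of the first display layer.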
Fig. 5 schematically shows a block diagram of a display control apparatus according to an embodiment of the present disclosure.
As shown in fig. 5, the display control apparatus 500 includes a display device 510, a display interface 520, and a processor 530.
Wherein the display device 510 may be used to display the first object and the second object.
The display interface 520 may be used to output a control signal capable of controlling the display device to display the first object and the second object.
The processor 530 may be configured to detect a display event of a second object when the first object is in a display state, wherein the display event of the second object includes: and displaying the second object in at least a partial area within the display area of the first object, and executing a first instruction based on at least a display event of the second object if the display event of the second object exists, wherein the first instruction is an instruction associated with controlling display of the second object. According to the embodiment of the disclosure, the processor 530 may perform, for example, the operations S210 to S220 described above with reference to fig. 2A, which are not described herein again.
Wherein, if there is a display event of the second object, executing the first instruction based on at least the display event of the second object comprises: the first instruction is executed if the display event of the second object satisfies a predetermined condition, and a second instruction different from the first instruction is executed if the display event of the second object does not satisfy the predetermined condition, the second instruction being an instruction for displaying the second object.
Wherein the display event of the second object does not satisfy a predetermined condition, including at least one of: the second object is generated after the input operation is detected, or the second object and the first object are from different applications.
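The dispatch between the first and second instructions can be sketched as below. The function and parameter names are hypothetical; only the decision logic follows the disclosure.

```python
def choose_instruction(after_input_operation: bool, different_applications: bool) -> str:
    """Per the disclosure, the display event fails the predetermined condition
    when the second object was generated after a detected input operation, or
    when the second object and the first object come from different
    applications; in that case the second instruction displays the second
    object normally. Otherwise, the first instruction controls its display."""
    if after_input_operation or different_applications:
        return "second_instruction"
    return "first_instruction"
```

For example, a pop-up that the same application spawns without any user input would fall through to the first instruction and have its display adjusted.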
Wherein the first instruction comprises an adjustment instruction, and the executing of the first instruction based on at least the display event of the second object comprises: and executing an adjusting instruction at least based on the display event of the second object, wherein the adjusting instruction is used for reducing the perception rate of the second object.
Wherein the adjusting instruction is used for reducing the perception rate of the second object and comprises at least one of the following: the adjusting instruction is used for changing the display position of the second object, or the adjusting instruction is used for reducing the display size of the second object, or the adjusting instruction is used for improving the display transparency of the second object, or the adjusting instruction is used for preventing the display of the second object.
Wherein the first instruction comprises a control display instruction, and the executing the first instruction at least based on the display event of the second object comprises: and executing a control display instruction at least based on the display event of the second object, wherein the control is used for controlling the perception rate of the second object.
Wherein the control comprises at least one of: a closing control, a zooming control, and an adjusting control, wherein the closing control is used for closing the second object, the zooming control is used for adjusting the display size of the second object, and the adjusting control is used for adjusting the display transparency of the second object.
The first object comprises a first display layer, the second object comprises a second display layer, and the detecting the display event of the second object comprises: and obtaining layer identifiers of the first display layer and the second display layer, wherein the layer identifiers are used for representing layer levels of the first display layer and the second display layer, and determining whether a display event of the second display layer exists based on the layer identifiers.
Determining whether a display event of a second display layer exists based on the layer identifier comprises the following steps: and determining the layer levels of the first display layer and the second display layer based on the layer identifiers of the first display layer and the second display layer, wherein if the layer level of the second display layer is greater than the layer level of the first display layer, the second display layer covers at least part of the first display layer.
FIG. 6 schematically shows a block diagram of a display control apparatus according to an embodiment of the present disclosure.
As shown in fig. 6, the display control apparatus 600 includes a detection module 610 and an execution module 620. The display control apparatus 600 may perform the method described above with reference to fig. 2A.
Specifically, the detecting module 610 may be configured to detect a display event of a second object if the first object is in a display state, where the display event of the second object includes: the second object is displayed in at least a partial area within the display area of the first object. According to the embodiment of the present disclosure, the detecting module 610 may perform, for example, the operation S210 described above with reference to fig. 2A, which is not described herein again.
The execution module 620 may be configured to execute a first instruction based on at least a display event of a second object if the display event of the second object exists, wherein the first instruction is an instruction associated with controlling display of the second object. According to the embodiment of the present disclosure, the executing module 620 may, for example, execute the operation S220 described above with reference to fig. 2A, which is not described herein again.
Wherein, if there is a display event of the second object, executing the first instruction based on at least the display event of the second object comprises: the first instruction is executed if the display event of the second object satisfies a predetermined condition, and a second instruction different from the first instruction is executed if the display event of the second object does not satisfy the predetermined condition, the second instruction being an instruction for displaying the second object.
Wherein the display event of the second object does not satisfy a predetermined condition, including at least one of: the second object is generated after the input operation is detected, or the second object and the first object are from different applications.
Wherein the first instruction comprises an adjustment instruction, and the executing of the first instruction based on at least the display event of the second object comprises: and executing an adjusting instruction at least based on the display event of the second object, wherein the adjusting instruction is used for reducing the perception rate of the second object.
Wherein the adjusting instruction is used for reducing the perception rate of the second object and comprises at least one of the following: the adjusting instruction is used for changing the display position of the second object, or the adjusting instruction is used for reducing the display size of the second object, or the adjusting instruction is used for improving the display transparency of the second object, or the adjusting instruction is used for preventing the display of the second object.
Wherein the first instruction comprises a control display instruction, and the executing the first instruction at least based on the display event of the second object comprises: and executing a control display instruction at least based on the display event of the second object, wherein the control is used for controlling the perception rate of the second object.
Wherein the control comprises at least one of: a closing control, a zooming control, and an adjusting control, wherein the closing control is used for closing the second object, the zooming control is used for adjusting the display size of the second object, and the adjusting control is used for adjusting the display transparency of the second object.
The first object comprises a first display layer, the second object comprises a second display layer, and the detecting the display event of the second object comprises: and obtaining layer identifiers of the first display layer and the second display layer, wherein the layer identifiers are used for representing layer levels of the first display layer and the second display layer, and determining whether a display event of the second display layer exists based on the layer identifiers.
Determining whether a display event of a second display layer exists based on the layer identifier comprises the following steps: and determining the layer levels of the first display layer and the second display layer based on the layer identifiers of the first display layer and the second display layer, wherein if the layer level of the second display layer is greater than the layer level of the first display layer, the second display layer covers at least part of the first display layer.
Any number of modules, sub-modules, units, sub-units, or at least part of the functionality of any number thereof according to embodiments of the present disclosure may be implemented in one module. Any one or more of the modules, sub-modules, units, and sub-units according to the embodiments of the present disclosure may be implemented by being split into a plurality of modules. Any one or more of the modules, sub-modules, units, sub-units according to embodiments of the present disclosure may be implemented at least in part as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in any other reasonable manner of hardware or firmware by integrating or packaging a circuit, or in any one of or a suitable combination of software, hardware, and firmware implementations. Alternatively, one or more of the modules, sub-modules, units, sub-units according to embodiments of the disclosure may be at least partially implemented as a computer program module, which when executed may perform the corresponding functions.
For example, any of the detecting module 610 and the executing module 620 may be combined and implemented in one module, or any one of the modules may be split into a plurality of modules. Alternatively, at least part of the functionality of one or more of these modules may be combined with at least part of the functionality of the other modules and implemented in one module. According to an embodiment of the present disclosure, at least one of the detecting module 610 and the executing module 620 may be implemented at least partially as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in hardware or firmware by any other reasonable manner of integrating or packaging a circuit, or may be implemented in any one of three implementations of software, hardware, and firmware, or in a suitable combination of any of them. Alternatively, at least one of the detecting module 610 and the executing module 620 may be at least partially implemented as a computer program module, which when executed may perform the corresponding functions.
Fig. 7 schematically illustrates a block diagram of a computer system for a display control method according to an embodiment of the present disclosure. The computer system illustrated in FIG. 7 is only one example and should not impose any limitations on the scope of use or functionality of embodiments of the disclosure.
As shown in fig. 7, a computer system 700 for a display control method includes a processor 710 and a computer-readable storage medium 720. The system 700 may perform a method according to an embodiment of the present disclosure.
In particular, processor 710 may comprise, for example, a general purpose microprocessor, an instruction set processor and/or associated chipset, and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), and/or the like. The processor 710 may also include on-board memory for caching purposes. Processor 710 may be a single processing unit or a plurality of processing units for performing the different actions of the method flows according to embodiments of the present disclosure.
Computer-readable storage medium 720, for example, may be a non-volatile computer-readable storage medium, specific examples including, but not limited to: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and so on.
The computer-readable storage medium 720 may include a computer program 721, which computer program 721 may include code/computer-executable instructions that, when executed by the processor 710, cause the processor 710 to perform a method according to an embodiment of the disclosure, or any variation thereof.
The computer program 721 may be configured with computer program code comprising, for example, computer program modules. For example, in an example embodiment, code in the computer program 721 may include one or more program modules, for example including module 721A, module 721B, and so on. It should be noted that the division and number of modules are not fixed; those skilled in the art may use suitable program modules or combinations of program modules according to the actual situation, so that when these program modules are executed by the processor 710, the processor 710 may perform the method according to the embodiment of the present disclosure or any variation thereof.
According to an embodiment of the present disclosure, at least one of the detecting module 610 and the executing module 620 may be implemented as a computer program module described with reference to fig. 7, which, when executed by the processor 710, may implement the respective operations described above.
The present disclosure also provides a computer-readable storage medium, which may be contained in the apparatus/device/system described in the above embodiments; or may exist separately and not be assembled into the device/apparatus/system. The computer-readable storage medium carries one or more programs which, when executed, implement the method according to an embodiment of the disclosure.
According to embodiments of the present disclosure, the computer-readable storage medium may be a non-volatile computer-readable storage medium, which may include, for example but is not limited to: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Those skilled in the art will appreciate that various combinations and/or sub-combinations of the features recited in the various embodiments and/or claims of the present disclosure can be made, even if such combinations or sub-combinations are not expressly recited in the present disclosure. In particular, various combinations and/or sub-combinations of the features recited in the various embodiments and/or claims of the present disclosure may be made without departing from the spirit or teaching of the present disclosure. All such combinations and/or sub-combinations fall within the scope of the present disclosure.
While the disclosure has been shown and described with reference to certain exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the disclosure as defined by the appended claims and their equivalents. Accordingly, the scope of the present disclosure should not be limited to the above-described embodiments, but should be defined not only by the appended claims, but also by equivalents thereof.

Claims (8)

1. A display control method comprising:
detecting a display event of a second object if the first object is in a display state, wherein the display event of the second object comprises: displaying the second object in at least partial area in the display area of the first object, wherein the first object comprises a first display layer and the second object comprises a second display layer;
if the display event of the second object exists and the display event of the second object meets the preset condition, executing a first instruction at least based on the display event of the second object, wherein the first instruction comprises a control display instruction for controlling display of a control, the control is used for controlling the perception rate of the second object, and the display event of the second object meets the preset condition and comprises the following steps: the occurrence of the second object does not meet user expectations;
executing a second instruction different from the first instruction if the display event of the second object does not satisfy a predetermined condition, the second instruction being an instruction for displaying the second object.
2. The method of claim 1, wherein the display event of the second object fails to satisfy a predetermined condition, comprising at least one of:
the second object is generated after the input operation is detected; or
The second object and the first object are objects from different applications.
3. The method of claim 1, wherein:
the first instruction comprises an adjustment instruction;
the executing of the first instruction based on at least the display event of the second object comprises: executing the adjusting instruction at least based on the display event of the second object, wherein the adjusting instruction is used for reducing the perception rate of the second object.
4. The method of claim 3, wherein the adjustment instructions are for causing a perception rate of the second object to be reduced, comprising at least one of:
the adjustment instruction is used for changing the display position of the second object; or
The adjustment instruction is used for reducing the display size of the second object; or
The adjusting instruction is used for improving the display transparency of the second object; or
The adjustment instruction is to prevent display of the second object.
5. The method of claim 1, wherein:
the control comprises at least one of: a closing control, a zooming control, and an adjusting control;
wherein the close control is to close the second object; the zoom control is used for adjusting the display size of the second object; the adjusting control is used for adjusting the display transparency of the second object.
6. A display control method comprising:
detecting a display event of a second object if the first object is in a display state, wherein the display event of the second object comprises: displaying the second object in at least partial area in the display area of the first object, wherein the first object comprises a first display layer and the second object comprises a second display layer;
wherein the detecting a display event of the second object comprises:
acquiring layer identifiers of a first display layer and a second display layer, wherein the layer identifiers are used for representing layer levels of the first display layer and the second display layer;
determining whether a display event of the second display layer exists based on the layer identification;
and if the display event of the second object exists, executing a first instruction at least based on the display event of the second object, wherein the first instruction comprises a control display instruction for controlling the display of a control, and the control is used for controlling the perception rate of the second object.
7. The method of claim 6, wherein the determining whether the display event of the second display layer exists based on the layer identification comprises:
determining layer levels of the first display layer and the second display layer based on the layer identifiers of the first display layer and the second display layer;
and if the layer level of the second display layer is larger than that of the first display layer, the second display layer covers at least part of the first display layer.
8. A display control apparatus comprising:
a display interface for outputting a control signal capable of controlling a display device to display a first object and a second object;
a processor configured to detect a display event of the second object when the first object is in a display state, wherein the display event of the second object includes: displaying the second object in at least partial area in the display area of the first object, wherein the first object comprises a first display layer and the second object comprises a second display layer;
if the display event of the second object exists and the display event of the second object meets the preset condition, executing a first instruction at least based on the display event of the second object, wherein the first instruction comprises a control display instruction for controlling display of a control, the control is used for controlling the perception rate of the second object, and the display event of the second object meets the preset condition and comprises the following steps: the occurrence of the second object does not meet user expectations;
executing a second instruction different from the first instruction if the display event of the second object does not satisfy a predetermined condition, the second instruction being an instruction for displaying the second object.
CN201811156094.6A 2018-09-30 2018-09-30 Display control method and display control apparatus Active CN109275016B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811156094.6A CN109275016B (en) 2018-09-30 2018-09-30 Display control method and display control apparatus


Publications (2)

Publication Number Publication Date
CN109275016A CN109275016A (en) 2019-01-25
CN109275016B true CN109275016B (en) 2021-11-16

Family

ID=65196372

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811156094.6A Active CN109275016B (en) 2018-09-30 2018-09-30 Display control method and display control apparatus

Country Status (1)

Country Link
CN (1) CN109275016B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109885231B (en) * 2019-02-27 2021-07-02 Beijing SenseTime Technology Development Co., Ltd. Display method and device, electronic equipment and storage medium
CN110674818B (en) * 2019-12-03 2020-04-24 Giesecke+Devrient (China) Information Technology Co., Ltd. Card surface detection method, device, equipment and medium
CN111212313A (en) * 2019-12-13 2020-05-29 Gree Electric Appliances, Inc. of Zhuhai Advertisement display method, device, storage medium and computer equipment
CN111966273A (en) * 2020-08-18 2020-11-20 Gree Electric Appliances, Inc. of Zhuhai Information display processing method and device

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103067758A (en) * 2012-12-17 2013-04-24 Hon Hai Precision Industry (Shenzhen) Co., Ltd. Advertisement server, player terminal, and system and method for advertisement pushing
CN105578255A (en) * 2015-12-31 2016-05-11 Heyi Network Technology (Beijing) Co., Ltd. Advertisement playing method and device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7661117B2 (en) * 2000-11-27 2010-02-09 Intellocity Usa Inc. Displaying full screen streaming media advertising
WO2012047659A1 (en) * 2010-09-27 2012-04-12 Hulu Llc Method and apparatus for providing directed advertising based on user preferences
CN105898511A (en) * 2015-12-08 2016-08-24 Leshi Internet Information & Technology (Beijing) Co., Ltd. Advertisement display method and device
CN106844731A (en) * 2017-02-10 2017-06-13 Yulong Computer Telecommunication Technology (Shenzhen) Co., Ltd. Advertisement shielding method and system
CN107391012B (en) * 2017-07-18 2019-07-26 Vivo Mobile Communication Co., Ltd. Information prompting method and mobile terminal



Similar Documents

Publication Publication Date Title
CN109275016B (en) Display control method and display control apparatus
CN111066315B (en) Apparatus, method and readable medium configured to process and display image data
EP3526964B1 (en) Masking in video stream
EP3369038B1 (en) Tracking object of interest in an omnidirectional video
CN107430629B (en) Prioritized display of visual content in a computer presentation
CN103765346B (en) The position selection for being used for audio-visual playback based on eye gaze
US20130021488A1 (en) Adjusting Image Capture Device Settings
US20090185745A1 (en) Electronic Apparatus and Image Display Method
US20150149960A1 (en) Method of generating panorama image, computer-readable storage medium having recorded thereon the method, and panorama image generating device
US9870800B2 (en) Multi-source video input
CN110825289A (en) Method and device for operating user interface, electronic equipment and storage medium
US10158805B2 (en) Method of simultaneously displaying images from a plurality of cameras and electronic device adapted thereto
KR102089624B1 (en) Method for object composing a image and an electronic device thereof
CN111143906A (en) Control method and control device
US10789987B2 (en) Accessing a video segment
US9633253B2 (en) Moving body appearance prediction information processing system, and method
US20220217322A1 (en) Apparatus, articles of manufacture, and methods to facilitate generation of variable viewpoint media
US20140003653A1 (en) System and Method for Determining the Position of an Object Displaying Media Content
CN113965665A (en) Method and equipment for determining virtual live broadcast image
US9525816B2 (en) Display control apparatus and camera system
US10685621B1 (en) Contextual display dimension control in rollable display device to reduce the addiction of mobile device
US10645306B2 (en) Method for producing media file and electronic device thereof
CN108960130B (en) Intelligent video file processing method and device
CN105635832A (en) Video processing method and device
US20230305303A1 (en) User Control in Augmented Reality

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant