CN107908324B - Interface display method and device - Google Patents

Interface display method and device

Info

Publication number
CN107908324B
CN107908324B (application CN201711121352.2A)
Authority
CN
China
Prior art keywords
head portrait
interface
picture
area
avatar
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711121352.2A
Other languages
Chinese (zh)
Other versions
CN107908324A (en)
Inventor
吴艺
王海新
李文博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba China Co Ltd
Original Assignee
Alibaba China Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba China Co Ltd filed Critical Alibaba China Co Ltd
Priority to CN201711121352.2A priority Critical patent/CN107908324B/en
Publication of CN107908324A publication Critical patent/CN107908324A/en
Priority to PCT/CN2018/105375 priority patent/WO2019095815A1/en
Priority to TW107138523A priority patent/TWI673644B/en
Application granted granted Critical
Publication of CN107908324B publication Critical patent/CN107908324B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/70Information retrieval; Database structures therefor; File system structures therefor of video data
    • G06F16/74Browsing; Visualisation therefor
    • G06F16/743Browsing; Visualisation therefor a collection of video files or sequences
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/957Browsing optimisation, e.g. caching or content distillation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72469User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04804Transparency, e.g. transparent or translucent windows

Abstract

The disclosure relates to an interface display method and device. The method is applied to a terminal device and includes: when movement of an interface is detected, judging whether the movement satisfies a trigger condition for highlighting an avatar; and when the trigger condition is satisfied, highlighting, in an image contained in the interface, an avatar related to that image. According to the interface display method and device provided by the embodiments of the disclosure, when the detected movement of the interface satisfies the trigger condition for highlighting an avatar, the avatar related to the displayed image is highlighted within that image. While browsing the images displayed by a fast-moving interface, the user can thus identify the content each image represents by glancing at the highlighted avatars, avoiding missing content of interest and saving viewing time.

Description

Interface display method and device
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to an interface display method and apparatus.
Background
In the related art, a user often controls the movement of an interface on a mobile phone or other terminal device through operations such as sliding or dragging, in order to browse (or skim) the content displayed in the interface and find content of interest. However, while the user rapidly scrolls the interface, its content does not change; it merely moves quickly along the direction of the slide or drag. When the scrolling is too fast or the content is too plentiful, the user cannot clearly take in the displayed content in time and may even miss content of interest, so the user's needs are not met.
Disclosure of Invention
In view of this, the present disclosure provides an interface display method and apparatus to solve the problem that, while an interface is being displayed, a user cannot view its content clearly and in time, and may even miss content of interest.
According to a first aspect of the present disclosure, an interface display method is provided. The method is used in a terminal device and includes:
when movement of the interface is detected, judging whether the movement satisfies a trigger condition for highlighting an avatar; and
when the trigger condition is satisfied, highlighting, in an image contained in the interface, an avatar related to that image.
For the above method, in a possible implementation, judging whether the movement satisfies the trigger condition for highlighting the avatar includes:
judging whether the movement speed of the interface exceeds a speed threshold; and
determining that the movement satisfies the trigger condition when the movement speed of the interface exceeds the speed threshold.
For the above method, in a possible implementation, highlighting the avatar related to the image includes:
covering the image with a floating layer, wherein the region of the floating layer corresponding to the avatar is transparent and the remaining regions are opaque or semi-transparent.
For the above method, in a possible implementation, highlighting the avatar related to the image includes:
performing at least one of the following on the regions of the image other than the region where the avatar is located: adding a mosaic, drawing a frame, cropping, deleting, blurring, graying, darkening, and blackening.
For the above method, in a possible implementation, highlighting the avatar related to the image includes:
performing at least one of the following on the region of the image where the avatar is located: increasing the brightness of the region, increasing its chroma, displaying it in the middle of the image, displaying it dynamically, and displaying it enlarged within the image.
For the above method, in a possible implementation, highlighting the avatar related to the image includes:
determining the related information corresponding to the avatar related to the image contained in the interface; and
highlighting the avatar corresponding to the related information when the related information matches the user's historical behavior.
For the above method, in a possible implementation, the avatar related to the image includes:
an avatar derived from an analysis of the user's historical behavior.
According to a second aspect of the present disclosure, an interface display apparatus is provided, including:
an interface movement judging module, configured to judge, when movement of the interface is detected, whether the movement satisfies a trigger condition for highlighting an avatar; and
an avatar highlighting module, configured to highlight, in an image contained in the interface, an avatar related to that image when the trigger condition is satisfied.
For the above apparatus, in a possible implementation, the interface movement judging module includes:
a judging submodule, configured to judge whether the movement speed of the interface exceeds a speed threshold; and
a determining submodule, configured to determine that the movement satisfies the trigger condition when the movement speed of the interface exceeds the speed threshold.
For the above apparatus, in a possible implementation, the avatar highlighting module includes:
a first display submodule, configured to cover the image with a floating layer, wherein the region of the floating layer corresponding to the avatar is transparent and the remaining regions are opaque or semi-transparent.
For the above apparatus, in a possible implementation, the avatar highlighting module includes:
a second display submodule, configured to perform at least one of the following on the regions of the image other than the region where the avatar is located: adding a mosaic, drawing a frame, cropping, deleting, blurring, graying, darkening, and blackening.
For the above apparatus, in a possible implementation, the avatar highlighting module includes:
a third display submodule, configured to perform at least one of the following on the region of the image where the avatar is located: increasing the brightness of the region, increasing its chroma, displaying it in the middle of the image, displaying it dynamically, and displaying it enlarged within the image.
For the above apparatus, in a possible implementation, the avatar highlighting module further includes:
a related information determining submodule, configured to determine the related information corresponding to the avatar related to the image contained in the interface; and
an avatar determining submodule, configured to highlight the avatar corresponding to the related information when the related information matches the user's historical behavior.
For the above apparatus, in a possible implementation, the avatar related to the image includes:
an avatar derived from an analysis of the user's historical behavior.
According to a third aspect of the present disclosure, an interface display apparatus is provided, including: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to perform the interface display method described above.
According to a fourth aspect of the present disclosure, a non-transitory computer-readable storage medium is provided, having computer program instructions stored thereon, wherein the computer program instructions, when executed by a processor, implement the interface display method described above.
According to the interface display method and apparatus provided by the embodiments of the present disclosure, when the detected movement of the interface satisfies the trigger condition for highlighting an avatar, the avatar related to the displayed image is highlighted within that image. While browsing the images displayed by the interface, the user can thus identify the content each image represents from the highlighted avatars, avoiding missing content of interest and saving viewing time.
Other features and aspects of the present disclosure will become apparent from the following detailed description of exemplary embodiments, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate exemplary embodiments, features, and aspects of the disclosure and, together with the description, serve to explain the principles of the disclosure.
FIG. 1 shows a flowchart of an interface display method according to an embodiment of the present disclosure;
FIG. 2 shows a flowchart of step S12 in an interface display method according to an embodiment of the present disclosure;
FIG. 3 shows a flowchart of step S11 in an interface display method according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a floating layer in an interface display method according to an embodiment of the present disclosure;
FIG. 5 shows a schematic diagram of an interface display method according to an embodiment of the present disclosure;
FIG. 6 shows a schematic diagram of an interface display method according to an embodiment of the present disclosure;
FIG. 7a is a schematic diagram of an application scenario of an interface display method according to an embodiment of the present disclosure;
FIG. 7b is a schematic diagram of an application scenario of an interface display method according to an embodiment of the present disclosure;
FIG. 8 shows a block diagram of an interface display apparatus according to an embodiment of the present disclosure;
FIG. 9 shows a block diagram of an interface display apparatus according to an embodiment of the present disclosure;
FIG. 10 shows a block diagram of an interface display apparatus according to an embodiment of the present disclosure.
Detailed Description
Various exemplary embodiments, features and aspects of the present disclosure will be described in detail below with reference to the accompanying drawings. In the drawings, like reference numbers can indicate functionally identical or similar elements. While the various aspects of the embodiments are presented in drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
The word "exemplary" is used exclusively herein to mean "serving as an example, embodiment, or illustration. Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
Furthermore, in the following detailed description, numerous specific details are set forth in order to provide a better understanding of the present disclosure. It will be understood by those skilled in the art that the present disclosure may be practiced without some of these specific details. In some instances, methods, means, elements and circuits that are well known to those skilled in the art have not been described in detail so as not to obscure the present disclosure.
Fig. 1 shows a flowchart of an interface display method according to an embodiment of the present disclosure. As shown in Fig. 1, the method may be applied to a terminal device and may include steps S11 and S12.
In step S11, when movement of the interface is detected, it is judged whether the movement satisfies a trigger condition for highlighting an avatar.
In this embodiment, the movement speed of the interface may be obtained, and whether the movement satisfies the trigger condition for highlighting an avatar is then judged according to that speed.
In this embodiment, the terminal device may be any device capable of presenting video, audio, or other content perceivable by sight, hearing, smell, touch, or taste, such as a mobile phone, a tablet computer, a smart watch, a vehicle-mounted terminal, an MP3 player, a VR (Virtual Reality) head-mounted display or VR glasses, an AR (Augmented Reality) head-mounted display or AR glasses, an MR (Mixed Reality) head-mounted display or MR glasses, a HUD (Head-Up Display), or a smart television; the present disclosure is not limited in this respect. The interface may be an interface associated with multimedia resources such as videos, audio, or images. The user may move the interface directly by sliding a finger, through an auxiliary control device such as a handle or a mouse, or, with the assistance of related equipment, through gaze, thought (e.g., brain waves), gestures, and the like; the present disclosure is not limited in this respect.
In step S12, when the trigger condition is satisfied, the avatar related to the image is highlighted within the image contained in the interface.
In this embodiment, the trigger condition may be set according to the content displayed in the interface, the viewing speed at which the user can still clearly see that content, and the like. For example, when the movement speed of the interface exceeds the user's viewing speed, it may be determined that the movement satisfies the trigger condition for highlighting an avatar. The present disclosure is not limited in this respect.
In this embodiment, the avatar related to the image may be the avatar of a person, an animal, or a cartoon character appearing in the image: for example, the avatar of an instant-messaging contact shown in the image; the avatar of the holder of a social-network account shown in the image; the avatars of characters in the thumbnails, posters, and synopses of video resources such as movies and TV series; or the avatars of performers appearing in a variety show. It may also be an avatar that is associated with the content of the corresponding region of the image but is not itself shown there; for example, if the content of the corresponding region is a video resource such as a movie or TV series, the avatar may be that of an actor who plays a role shown in the thumbnail, poster, or synopsis.
In a possible implementation, the avatar related to the image may include: an avatar derived from an analysis of the user's historical behavior.
In this implementation, the user's historical behavior can be analyzed to determine the characteristics of avatars the user is likely to be interested in, and an avatar related to the image that can be highlighted is then generated by specific processing based on those characteristics and the content of the image. The specific processing may be adding text, graphics, or other content related to the image (or related to the characteristics of the avatars the user is interested in) onto an avatar the user is interested in; the present disclosure is not limited in this respect. For example, if it is determined from the historical behavior of user F that F likes actor K, then when the image content is determined to be a TV series starring actor K, text based on the theme of the series or the user's preference may be added to the avatar of the protagonist played by K, and the processed avatar may be taken as the avatar related to the image. Alternatively, an avatar K' of actor K may be obtained based on the user's preference, text related to the series may be added to K' according to the content of the series, and the resulting avatar may be taken as the avatar related to the image. A minimal sketch of this kind of processing is given below.
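As a minimal sketch of the text-overlay processing described above, assuming an Android terminal device, the following Kotlin snippet draws a caption onto a copy of an avatar bitmap; the helper name, the caption source, and the layout proportions are illustrative assumptions rather than part of this disclosure:

    import android.graphics.Bitmap
    import android.graphics.Canvas
    import android.graphics.Color
    import android.graphics.Paint

    // Hypothetical helper: returns a copy of `avatar` with `caption`
    // (e.g., text derived from the series theme or the user's preference)
    // drawn along its bottom edge.
    fun annotateAvatar(avatar: Bitmap, caption: String): Bitmap {
        val annotated = avatar.copy(Bitmap.Config.ARGB_8888, true)
        val canvas = Canvas(annotated)
        val paint = Paint(Paint.ANTI_ALIAS_FLAG).apply {
            color = Color.WHITE
            textSize = annotated.height * 0.12f      // assumed proportion
            textAlign = Paint.Align.CENTER
            setShadowLayer(4f, 0f, 2f, Color.BLACK)  // keep text legible on any avatar
        }
        canvas.drawText(caption, annotated.width / 2f, annotated.height - paint.textSize / 2f, paint)
        return annotated
    }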
In this embodiment, the ways of highlighting the avatar in the image may include: enhancing the display effect of the region corresponding to the avatar while leaving the rest of the image unchanged; reducing the display effect of the regions other than the avatar while leaving the avatar region unchanged; or doing both at once, enhancing the avatar region and reducing the other regions. The specific way of highlighting may be set by those skilled in the art according to actual needs; the present disclosure is not limited in this respect.
In a possible implementation, when the avatar is already highlighted in the image contained in the interface and the movement of the interface shifts from satisfying the trigger condition to no longer satisfying it, the highlighted avatar may be kept for a holding display time, or the highlighting may be stopped at once. The holding display time may be set according to the complexity of the content corresponding to the avatar: the more complex the content, the longer the holding time. The present disclosure is not limited in this respect. A sketch of this hold-and-release behavior follows.
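As a hedged sketch of the holding behavior, assuming the Android main-thread Handler API; the hold duration and the hide callback are illustrative assumptions:

    import android.os.Handler
    import android.os.Looper

    // Keeps the highlight visible for `holdMillis` after the trigger
    // condition stops being met, then removes it.
    class HighlightHolder(
        private val holdMillis: Long,          // e.g., longer for more complex content
        private val hideHighlight: () -> Unit  // hypothetical callback into the view layer
    ) {
        private val handler = Handler(Looper.getMainLooper())
        private val hideTask = Runnable { hideHighlight() }

        fun onTriggerConditionMet() {
            handler.removeCallbacks(hideTask)          // cancel any pending hide
        }

        fun onTriggerConditionLost() {
            handler.postDelayed(hideTask, holdMillis)  // hold, then hide
        }
    }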
Fig. 2 shows a flowchart of step S12 in the interface display method according to an embodiment of the present disclosure.
In a possible implementation, as shown in Fig. 2, step S12 may include step S01 and step S02.
In step S01, the related information corresponding to the avatar related to the image contained in the interface is determined.
In this implementation, the related information corresponding to the avatar describes the content the avatar represents. For example, if the content corresponding to the avatar is the protagonist S of a certain movie, the related information may include the movie's genre, release date, rating, and title, the name of the role, and the name of the actor playing S. Those skilled in the art may set the content of the related information according to actual needs; the present disclosure is not limited in this respect.
In step S02, when the related information matches the user's historical behavior, the avatar corresponding to the related information is highlighted.
In this implementation, the user's historical behavior may be obtained from the user's browsing records, search records, and the like. The content the user is likely to be interested in can be determined from the historical behavior, and an avatar whose related information matches that content is chosen as the avatar to highlight. For example, if the historical behavior shows that user A likes comedy movies, then when highlighting avatars for user A, avatars whose related information includes comedy movies are highlighted; if the historical behavior shows that user B likes actor M, then actor M is highlighted in the image when highlighting avatars for user B. The way of obtaining the user's historical behavior may be set by those skilled in the art according to actual needs; the present disclosure is not limited in this respect. In this way, avatars that are related to the image and likely to interest the user can be highlighted according to the user's historical behavior, preventing the user from missing content of interest and saving selection time. A sketch of the matching step is given below.
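As an illustration of the matching step, the following sketch filters avatars whose related information overlaps interests mined from the user's history; the data classes and the keyword-overlap criterion are assumptions made for exposition:

    // Hypothetical related information for an avatar (genre, actor name,
    // title, etc.), reduced to keywords for matching.
    data class AvatarInfo(val avatarId: String, val keywords: Set<String>)

    // Hypothetical user profile derived from browsing and search records.
    data class UserProfile(val interestKeywords: Set<String>)

    // Returns the avatars whose related information matches the user's
    // historical interests; these are the ones to highlight.
    fun selectAvatarsToHighlight(avatars: List<AvatarInfo>, profile: UserProfile): List<AvatarInfo> =
        avatars.filter { info -> info.keywords.any { it in profile.interestKeywords } }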
Fig. 3 shows a flowchart of step S11 in the interface display method according to an embodiment of the present disclosure.
In a possible implementation, as shown in Fig. 3, step S11 may include step S111 and step S112.
In step S111, it is judged whether the movement speed of the interface exceeds a speed threshold.
In this implementation, the speed threshold may be determined according to the complexity of the content displayed in the interface and the user's reading speed for content of different complexity. The more plentiful and complex the displayed content, the slower the user reads and understands it, and the smaller the speed threshold should be.
In step S112, when the movement speed of the interface exceeds the speed threshold, it is determined that the movement satisfies the trigger condition.
In this way, when the movement speed of the interface exceeds the speed threshold, the avatars related to the images contained in the interface can be highlighted for the user, so that the user can grasp the specific content of those images even while the interface is moving quickly. A minimal sketch of this speed check follows.
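As a minimal sketch of steps S111 and S112, assuming an Android RecyclerView hosts the interface; the threshold value and the trigger callbacks are illustrative assumptions:

    import androidx.recyclerview.widget.RecyclerView
    import kotlin.math.abs

    // Estimates scroll speed from successive scroll callbacks and toggles
    // avatar highlighting when it crosses a threshold (pixels per second).
    class SpeedTriggerListener(
        private val speedThresholdPxPerSec: Float,  // assumed tuning value
        private val onTrigger: () -> Unit,          // hypothetical: highlight avatars
        private val onRelease: () -> Unit           // hypothetical: restore normal display
    ) : RecyclerView.OnScrollListener() {
        private var lastTimeNs = 0L

        override fun onScrolled(recyclerView: RecyclerView, dx: Int, dy: Int) {
            val now = System.nanoTime()
            if (lastTimeNs != 0L) {
                val dtSec = (now - lastTimeNs) / 1_000_000_000f
                if (dtSec > 0f) {
                    val speed = abs(dy) / dtSec
                    if (speed > speedThresholdPxPerSec) onTrigger() else onRelease()
                }
            }
            lastTimeNs = now
        }
    }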
Fig. 4 is a schematic diagram of a floating layer in an interface display method according to an embodiment of the present disclosure.
In a possible implementation, as shown in Fig. 4, highlighting the avatar related to the image in step S12 may include: covering the image with a floating layer, where region 1 of the floating layer, corresponding to the avatar, is transparent, and the other regions 2 are opaque or semi-transparent.
In this implementation, the other regions 2 of the floating layer may also be filled with colors, patterns, and the like to make the avatar more conspicuous and to remind the user of the content the avatar represents. The transparency and filling of the other regions 2 may be set by those skilled in the art according to actual needs; the present disclosure is not limited in this respect. A sketch of such a floating layer follows.
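The floating layer of Fig. 4 can be sketched as a custom Android view that dims the whole image and clears the avatar region to full transparency; the dim alpha and how the avatar region is obtained are assumptions:

    import android.content.Context
    import android.graphics.Canvas
    import android.graphics.Color
    import android.graphics.Paint
    import android.graphics.PorterDuff
    import android.graphics.PorterDuffXfermode
    import android.graphics.RectF
    import android.view.View

    // Overlay view: semi-transparent everywhere (regions 2 in Fig. 4)
    // except the avatar region, which is punched fully transparent (region 1).
    class AvatarFloatingLayer(context: Context) : View(context) {
        var avatarRegion: RectF? = null  // assumed to be supplied by the caller

        private val dimPaint = Paint().apply { color = Color.argb(160, 0, 0, 0) }
        private val clearPaint = Paint().apply {
            xfermode = PorterDuffXfermode(PorterDuff.Mode.CLEAR)
        }

        override fun onDraw(canvas: Canvas) {
            val saved = canvas.saveLayer(null, null)  // offscreen layer so CLEAR works
            canvas.drawRect(0f, 0f, width.toFloat(), height.toFloat(), dimPaint)
            avatarRegion?.let { canvas.drawOval(it, clearPaint) }
            canvas.restoreToCount(saved)
        }
    }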
In a possible implementation, as shown in Fig. 5, highlighting the avatar related to the image in step S12 may further include performing at least one of the following on region 3, the part of the image outside the region where the avatar is located: adding a mosaic, drawing a frame, cropping, deleting, blurring, graying, darkening, and blackening. Fig. 5 shows a schematic diagram of an interface display method according to an embodiment of the present disclosure; in Fig. 5, region 3 is darkened. This reduces the user's attention to region 3 and draws it to the avatar instead, so that the user can decide from the avatar whether the corresponding multimedia resource, such as a movie, TV series, photo album, or audio item, is of interest.
It should be understood that those skilled in the art may apply other processing to region 3 according to actual needs; the present disclosure is not limited in this respect. One such processing is sketched below.
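As one hedged example of such processing, the sketch below grays and darkens a copy of the image outside the avatar rectangle and then redraws the untouched avatar on top; the region source and the darkening factor are assumptions:

    import android.graphics.Bitmap
    import android.graphics.Canvas
    import android.graphics.ColorMatrix
    import android.graphics.ColorMatrixColorFilter
    import android.graphics.Paint
    import android.graphics.Rect

    // Returns a copy of `image` that is grayed and darkened everywhere
    // except `avatarRect` (region 3 vs. region 1 in Fig. 5).
    fun dimOutsideAvatar(image: Bitmap, avatarRect: Rect): Bitmap {
        val result = Bitmap.createBitmap(image.width, image.height, Bitmap.Config.ARGB_8888)
        val canvas = Canvas(result)
        val matrix = ColorMatrix().apply {
            setSaturation(0f)                     // grayscale
            postConcat(ColorMatrix(floatArrayOf(  // darken to roughly 50% brightness
                0.5f, 0f, 0f, 0f, 0f,
                0f, 0.5f, 0f, 0f, 0f,
                0f, 0f, 0.5f, 0f, 0f,
                0f, 0f, 0f, 1f, 0f
            )))
        }
        val dimPaint = Paint().apply { colorFilter = ColorMatrixColorFilter(matrix) }
        canvas.drawBitmap(image, 0f, 0f, dimPaint)
        canvas.drawBitmap(image, avatarRect, avatarRect, null)  // restore avatar region
        return result
    }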
Fig. 6 shows a schematic diagram of an interface display method according to an embodiment of the present disclosure.
In a possible implementation, as shown in Fig. 6, highlighting the avatar related to the image in step S12 may further include performing at least one of the following on the region where the avatar is located: increasing the brightness of region 1 where the avatar is located, increasing its chroma, displaying it in the middle of the image, displaying it dynamically, and displaying it enlarged within the image.
In this embodiment, when the image contains only one avatar, the region where the avatar is located may be displayed directly in the middle of the whole image. When the image contains several avatars, each avatar's region may be centered and/or enlarged within the part of the image occupied by its corresponding multimedia resource, such as a movie or TV series. It should be understood that the display mode of the region where the avatar is located may be set by those skilled in the art according to actual needs; the present disclosure is not limited in this respect.
The region where the avatar is located may be a region of any shape containing the avatar, such as a circular or polygonal region; the present disclosure is not limited in this respect. The enlargement option is sketched below.
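A hedged sketch of the enlarge-in-place option, animating the scale of a view assumed to host the avatar region; the scale factor and duration are illustrative:

    import android.view.View

    // Enlarges the view hosting the avatar region in place; scaling about
    // its own center keeps it visually centered within its card.
    fun emphasizeAvatar(avatarView: View, scale: Float = 1.3f) {
        avatarView.animate()
            .scaleX(scale)
            .scaleY(scale)
            .setDuration(200L)  // assumed duration
            .start()
    }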
Application example
An application example according to an embodiment of the present disclosure is given below, with "filtering movies" as an exemplary application scenario, to facilitate understanding of the flow of the interface display method. It should be understood by those skilled in the art that the following application example is provided only to facilitate understanding of the embodiments of the present disclosure and should not be construed as limiting them.
Figs. 7a and 7b are schematic diagrams of an application scenario of an interface display method according to an embodiment of the present disclosure. As shown in Fig. 7a, without the interface display method provided by the present disclosure, while a user filters movies in a video client the candidate movies are simply displayed in the interface, with no highlighting. In the prior art, while the user rapidly scrolls the interface, the elements in it do not change; they merely move quickly in the direction the user indicates, and because of the fast movement the user is likely to miss a movie of interest.
As shown in Fig. 7b, with the interface display method provided by the present disclosure, while filtering movies in a video client the user controls the movement of the interface by sliding a finger across it. When the movement speed of the interface exceeds the speed threshold, the avatars related to the images contained in the current interface are obtained and highlighted, and the regions outside the avatars are dimmed. Even when the interface is moving quickly, the user can grasp the specific content of each movie from the avatar displayed in its corresponding region, does not miss content of interest, and the user's viewing needs are met. A rough wiring of this scenario is sketched below.
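Under the same assumptions as the earlier sketches, the scenario of Fig. 7b could be wired together roughly as follows; all names reuse the hypothetical classes introduced above:

    import android.view.View
    import androidx.recyclerview.widget.RecyclerView

    // Hypothetical wiring for the movie-filtering scenario: fast scrolling
    // shows the dimming overlay; once scrolling slows, HighlightHolder
    // hides it after the holding display time (its hideHighlight callback
    // would, e.g., set overlay.visibility = View.GONE).
    fun attachHighlighting(
        recyclerView: RecyclerView,
        overlay: AvatarFloatingLayer,
        holder: HighlightHolder
    ) {
        recyclerView.addOnScrollListener(
            SpeedTriggerListener(
                speedThresholdPxPerSec = 3000f,  // assumed tuning value
                onTrigger = {
                    overlay.visibility = View.VISIBLE
                    holder.onTriggerConditionMet()
                },
                onRelease = { holder.onTriggerConditionLost() }
            )
        )
    }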
It should be noted that, although the interface display method has been described by taking the above embodiments as examples, those skilled in the art will understand that the present disclosure is not limited thereto. In fact, each step may be set flexibly according to personal preference and/or the actual application scenario, as long as it conforms to the technical solution of the present disclosure.
According to the interface display method provided by the embodiments of the present disclosure, when the detected movement of the interface satisfies the trigger condition for highlighting an avatar, the avatar related to the displayed image is highlighted within that image. While browsing the images displayed by the interface, the user can thus identify the content each image represents from the highlighted avatars, avoiding missing content of interest and saving viewing time.
Fig. 8 shows a block diagram of an interface display apparatus according to an embodiment of the present disclosure. As shown in Fig. 8, the apparatus may include an interface movement judging module 401 and an avatar highlighting module 402. The interface movement judging module 401 is configured to judge, when movement of the interface is detected, whether the movement satisfies a trigger condition for highlighting an avatar; the avatar highlighting module 402 is configured to highlight, in an image contained in the interface, an avatar related to that image when the trigger condition is satisfied.
Fig. 9 shows a block diagram of an interface display apparatus according to an embodiment of the present disclosure.
In a possible implementation, as shown in Fig. 9, the interface movement judging module 401 may include a judging submodule 4011 and a determining submodule 4012. The judging submodule 4011 is configured to judge whether the movement speed of the interface exceeds a speed threshold. The determining submodule 4012 is configured to determine that the movement satisfies the trigger condition when the movement speed of the interface exceeds the speed threshold.
In a possible implementation, as shown in Fig. 9, the avatar highlighting module 402 may include a first display submodule 4021, configured to cover the image with a floating layer, where the region of the floating layer corresponding to the avatar is transparent and the remaining regions are opaque or semi-transparent.
In a possible implementation, as shown in Fig. 9, the avatar highlighting module 402 may include a second display submodule 4022, configured to perform at least one of the following on the regions of the image other than the region where the avatar is located: adding a mosaic, drawing a frame, cropping, deleting, blurring, graying, darkening, and blackening.
In a possible implementation, as shown in Fig. 9, the avatar highlighting module 402 may include a third display submodule 4023, configured to perform at least one of the following on the region of the image where the avatar is located: increasing its brightness, increasing its chroma, displaying it in the middle of the image, displaying it dynamically, and displaying it enlarged within the image.
In a possible implementation, as shown in Fig. 9, the avatar highlighting module 402 may include a related information determining submodule 4024 and an avatar determining submodule 4025. The related information determining submodule 4024 is configured to determine the related information corresponding to the avatar related to the image contained in the interface. The avatar determining submodule 4025 is configured to highlight the avatar corresponding to the related information when the related information matches the user's historical behavior.
In a possible implementation, the avatar related to the image may include: an avatar derived from an analysis of the user's historical behavior.
With regard to the apparatus in the above embodiments, the specific manner in which each module performs its operations has been described in detail in the method embodiments and will not be elaborated here.
It should be noted that, although the interface display apparatus has been described by taking the above embodiments as examples, those skilled in the art will understand that the present disclosure is not limited thereto. In fact, each component may be set flexibly according to personal preference and/or the actual application scenario, as long as it conforms to the technical solution of the present disclosure.
According to the interface display apparatus provided by the embodiments of the present disclosure, when the detected movement of the interface satisfies the trigger condition for highlighting an avatar, the avatar related to the displayed image is highlighted within that image. While browsing the images displayed by the interface, the user can thus identify the content each image represents from the highlighted avatars, avoiding missing content of interest and saving viewing time.
Fig. 10 shows a block diagram of an interface display apparatus 800 according to an embodiment of the present disclosure. For example, the apparatus 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, or the like.
Referring to fig. 10, the apparatus 800 may include one or more of the following components: processing component 802, memory 804, power component 806, multimedia component 808, audio component 810, input/output (I/O) interface 812, sensor component 814, and communication component 816.
The processing component 802 generally controls overall operation of the device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the apparatus 800. Examples of such data include instructions for any application or method operating on device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power components 806 provide power to the various components of device 800. The power components 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 800.
The multimedia component 808 includes a screen that provides an output interface between the apparatus 800 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, it may be implemented as a touch screen to receive input signals from the user.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the apparatus 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing status assessments of various aspects of the apparatus 800. For example, the sensor assembly 814 may detect the open/closed state of the apparatus 800 and the relative positioning of components such as its display and keypad, and may also detect a change in the position of the apparatus 800 or one of its components, the presence or absence of user contact with the apparatus 800, the orientation or acceleration/deceleration of the apparatus 800, and a change in its temperature. The sensor assembly 814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact, and may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate communications between the apparatus 800 and other devices in a wired or wireless manner. The device 800 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 800 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components for performing the methods described above.
In an exemplary embodiment, a non-transitory computer-readable storage medium, such as the memory 804, is also provided that includes computer program instructions executable by the processor 820 of the device 800 to perform the above-described methods.
Having described embodiments of the present disclosure, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terms used herein were chosen to best explain the principles of the embodiments, their practical application, or technical improvements over technologies in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (14)

1. An interface display method, used in a terminal device, the method comprising:
when movement of an interface is detected, judging whether the movement satisfies a trigger condition for highlighting an avatar; and
when the trigger condition is satisfied, highlighting, in an image contained in the interface, an avatar related to that image, wherein a plurality of images are displayed in the interface,
wherein highlighting the avatar related to the image comprises:
determining related information corresponding to the avatar related to the image contained in the interface; and
highlighting the avatar corresponding to the related information when the related information matches the user's historical behavior.
2. The method of claim 1, wherein judging whether the movement satisfies the trigger condition for highlighting the avatar comprises:
judging whether the movement speed of the interface exceeds a speed threshold; and
determining that the movement satisfies the trigger condition when the movement speed of the interface exceeds the speed threshold.
3. The method of claim 1, wherein highlighting the avatar related to the image comprises:
covering the image with a floating layer, wherein the region of the floating layer corresponding to the avatar is transparent and the remaining regions are opaque or semi-transparent.
4. The method of claim 1, wherein highlighting the avatar related to the image comprises:
performing at least one of the following on the regions of the image other than the region where the avatar is located: adding a mosaic, drawing a frame, cropping, deleting, blurring, graying, darkening, and blackening.
5. The method of claim 1 or 4, wherein highlighting the avatar related to the image comprises:
performing at least one of the following on the region of the image where the avatar is located: increasing the brightness of the region, increasing its chroma, displaying it in the middle of the image, displaying it dynamically, and displaying it enlarged within the image.
6. The method of claim 1, wherein the avatar related to the image comprises:
an avatar derived from an analysis of the user's historical behavior.
7. An interface display apparatus, comprising:
an interface movement judging module, configured to judge, when movement of an interface is detected, whether the movement satisfies a trigger condition for highlighting an avatar; and
an avatar highlighting module, configured to highlight, in an image contained in the interface, an avatar related to that image when the trigger condition is satisfied, wherein a plurality of images are displayed in the interface,
wherein the avatar highlighting module comprises:
a related information determining submodule, configured to determine related information corresponding to the avatar related to the image contained in the interface; and
an avatar determining submodule, configured to highlight the avatar corresponding to the related information when the related information matches the user's historical behavior.
8. The apparatus of claim 7, wherein the interface movement judging module comprises:
a judging submodule, configured to judge whether the movement speed of the interface exceeds a speed threshold; and
a determining submodule, configured to determine that the movement satisfies the trigger condition when the movement speed of the interface exceeds the speed threshold.
9. The apparatus of claim 7, wherein the avatar highlighting module comprises:
a first display submodule, configured to cover the image with a floating layer, wherein the region of the floating layer corresponding to the avatar is transparent and the remaining regions are opaque or semi-transparent.
10. The apparatus of claim 7, wherein the avatar highlighting module comprises:
a second display submodule, configured to perform at least one of the following on the regions of the image other than the region where the avatar is located: adding a mosaic, drawing a frame, cropping, deleting, blurring, graying, darkening, and blackening.
11. The apparatus of claim 7 or 10, wherein the avatar highlighting module comprises:
a third display submodule, configured to perform at least one of the following on the region of the image where the avatar is located: increasing the brightness of the region, increasing its chroma, displaying it in the middle of the image, displaying it dynamically, and displaying it enlarged within the image.
12. The apparatus of claim 7, wherein the avatar related to the image comprises:
an avatar derived from an analysis of the user's historical behavior.
13. An interface display apparatus, comprising:
a processor; and
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the method of any one of claims 1 to 6.
14. A non-transitory computer-readable storage medium having computer program instructions stored thereon, wherein the computer program instructions, when executed by a processor, implement the method of any one of claims 1 to 6.
CN201711121352.2A 2017-11-14 2017-11-14 Interface display method and device Active CN107908324B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201711121352.2A CN107908324B (en) 2017-11-14 2017-11-14 Interface display method and device
PCT/CN2018/105375 WO2019095815A1 (en) 2017-11-14 2018-09-13 Interface display method and apparatus
TW107138523A TWI673644B (en) 2017-11-14 2018-10-31 Interface display method, interface display device and non-volatile computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711121352.2A CN107908324B (en) 2017-11-14 2017-11-14 Interface display method and device

Publications (2)

Publication Number Publication Date
CN107908324A CN107908324A (en) 2018-04-13
CN107908324B true CN107908324B (en) 2020-07-14

Family

ID=61845275

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711121352.2A Active CN107908324B (en) 2017-11-14 2017-11-14 Interface display method and device

Country Status (3)

Country Link
CN (1) CN107908324B (en)
TW (1) TWI673644B (en)
WO (1) WO2019095815A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107908324B (en) * 2017-11-14 2020-07-14 阿里巴巴(中国)有限公司 Interface display method and device
CN107908325B (en) * 2017-11-14 2020-06-12 阿里巴巴(中国)有限公司 Interface display method and device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105407385A (en) * 2015-10-29 2016-03-16 桂林创研科技有限公司 Video recommendation method and system
CN106777116A (en) * 2016-12-15 2017-05-31 腾讯科技(深圳)有限公司 A kind of content acquisition method, subscription client, server and system
CN106970973A (en) * 2017-03-24 2017-07-21 联想(北京)有限公司 A kind of information processing method, device and electronic equipment

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100138648A (en) * 2009-06-25 2010-12-31 삼성전자주식회사 Image processing apparatus and method
US8818025B2 (en) * 2010-08-23 2014-08-26 Nokia Corporation Method and apparatus for recognizing objects in media content
US10255227B2 (en) * 2012-05-21 2019-04-09 Oath Inc. Computerized system and method for authoring, editing, and delivering an interactive social media video
CN103577479B (en) * 2012-08-06 2015-08-19 腾讯科技(深圳)有限公司 Web page contents display packing and system
CN103686223B (en) * 2012-09-11 2018-05-01 上海聚力传媒技术有限公司 A kind of method and apparatus that video access service is provided according to field feedback
US9542090B2 (en) * 2013-05-10 2017-01-10 Egalax_Empia Technology Inc. Electronic device, processing module, and method for detecting touch trace starting beyond touch area
KR20140133362A (en) * 2013-05-10 2014-11-19 삼성전자주식회사 display apparatus and user interface screen providing method thereof
US20150159402A1 (en) * 2013-12-06 2015-06-11 Itai Yahav Self-contained electronic cylinder lock alternatively operable by a key
CN104978115B (en) * 2014-04-02 2019-09-20 腾讯科技(深圳)有限公司 Content display method and device
WO2015171600A1 (en) * 2014-05-06 2015-11-12 Brewer Science Inc. User interface, method, and computer program for displaying data
CN104461236A (en) * 2014-11-07 2015-03-25 小米科技有限责任公司 Method and device for displaying application icons
CN105786878B (en) * 2014-12-24 2020-11-03 深圳市腾讯计算机系统有限公司 Display method and device of browsing object
CN105786352B (en) * 2014-12-26 2019-08-06 阿里巴巴集团控股有限公司 The method, device and mobile terminal of quick positioning webpage content
CN105117463B (en) * 2015-08-24 2019-08-06 北京旷视科技有限公司 Information processing method and information processing unit
CN111784615A (en) * 2016-03-25 2020-10-16 北京三星通信技术研究有限公司 Method and device for processing multimedia information
CN106231378A (en) * 2016-07-28 2016-12-14 北京小米移动软件有限公司 The display packing of direct broadcasting room, Apparatus and system
CN106067992B (en) * 2016-08-18 2019-07-26 北京奇虎科技有限公司 A kind of information recommendation method and device based on user behavior
CN107908324B (en) * 2017-11-14 2020-07-14 阿里巴巴(中国)有限公司 Interface display method and device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105407385A (en) * 2015-10-29 2016-03-16 桂林创研科技有限公司 Video recommendation method and system
CN106777116A (en) * 2016-12-15 2017-05-31 腾讯科技(深圳)有限公司 A kind of content acquisition method, subscription client, server and system
CN106970973A (en) * 2017-03-24 2017-07-21 联想(北京)有限公司 A kind of information processing method, device and electronic equipment

Also Published As

Publication number Publication date
TWI673644B (en) 2019-10-01
TW201918852A (en) 2019-05-16
CN107908324A (en) 2018-04-13
WO2019095815A1 (en) 2019-05-23

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code (ref country code: HK; ref legal event code: DE; ref document number: 1253069)
TA01 Transfer of patent application right (effective date of registration: 20200428)
Address after: 310052 room 508, floor 5, building 4, No. 699, Wangshang Road, Changhe street, Binjiang District, Hangzhou City, Zhejiang Province
Applicant after: Alibaba (China) Co., Ltd.
Address before: 100080 Beijing Haidian District city Haidian street A Sinosteel International Plaza No. 8 block 5 layer A, C
Applicant before: Youku network technology (Beijing) Co., Ltd.
GR01 Patent grant