CN113535310B - Chat bubble display control method and device, electronic equipment and storage medium - Google Patents

Chat bubble display control method and device, electronic equipment and storage medium

Info

Publication number
CN113535310B
Authority
CN
China
Prior art keywords
target
color value
chat
user
determining
Prior art date
Legal status
Active
Application number
CN202110858048.6A
Other languages
Chinese (zh)
Other versions
CN113535310A (en)
Inventor
王帅
赵作通
Current Assignee
Beijing Dajia Internet Information Technology Co Ltd
Original Assignee
Beijing Dajia Internet Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Dajia Internet Information Technology Co Ltd
Priority to CN202110858048.6A
Publication of CN113535310A
Application granted
Publication of CN113535310B
Legal status: Active


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 - Arrangements for program control, e.g. control units
    • G06F 9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 - Arrangements for executing specific programs
    • G06F 9/451 - Execution arrangements for user interfaces
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 - Geometric image transformations in the plane of the image
    • G06T 3/04 - Context-preserving transformations, e.g. by using an importance map

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure relates to a chat bubble display control method and device, an electronic device, and a storage medium. The method includes: acquiring an avatar of a first user; determining style information of a target chat bubble in a target chat interface according to the avatar, where the target chat interface represents a chat interface including the first user, and the target chat bubble represents a chat bubble corresponding to a message sent by a target user in the target chat interface; and controlling display of the target chat bubble according to the style information.

Description

Chat bubble display control method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of Internet technology, and in particular to a chat bubble display control method and device, an electronic device, and a storage medium.
Background
With the development of communication technology, terminal devices such as smart phones and tablet computers are becoming more and more popular. Through the terminal device, people can perform various activities such as shopping, entertainment, communication, and the like. Among them, communication through a terminal device has become one of the most important communication modes for people. In a chat interface, messages sent by users are typically carried by chat bubbles.
Disclosure of Invention
The present disclosure provides a technical solution for controlling the display of chat bubbles.
According to an aspect of the present disclosure, there is provided a display control method of chat bubbles, including:
acquiring an avatar of a first user;
determining style information of a target chat bubble in a target chat interface according to the avatar, wherein the target chat interface represents a chat interface including the first user, and the target chat bubble represents a chat bubble corresponding to a message sent by a target user in the target chat interface;
and controlling display of the target chat bubble according to the style information.
In the above method, the avatar of the first user is acquired, the style information of the target chat bubble in the target chat interface is determined according to the avatar, and the target chat bubble is displayed according to the style information, where the target chat interface represents a chat interface including the first user and the target chat bubble represents a chat bubble corresponding to a message sent by the target user in the target chat interface. The style in which chat bubbles are displayed can therefore be controlled based on the user's avatar, which improves the flexibility of chat bubble display control. That is, when the user changes the avatar, the chat bubble style changes accordingly, so that a chat bubble style matching the style of the user's avatar can be obtained, the chat bubble is better coordinated with the user's avatar, and the chat interface is richer. In addition, differentiated and personalized chat bubbles can be realized, thereby improving the user's chat experience.
In one possible implementation, the first user comprises a peer user of the target chat interface.
In this implementation, the avatar of the peer user of the target chat interface is acquired, the style information of the target chat bubble in the target chat interface is determined according to the peer user's avatar, and the target chat bubble is displayed according to the style information. In the chat interface between the home user and the peer user, the style of the chat bubbles can therefore be controlled based on the peer user's avatar, so that the chat bubbles in that chat interface are dynamically linked to the peer user's avatar. That is, the style (for example, the color) of the chat bubbles changes as the peer user's avatar changes, which adds interest, gives the home user a stronger sense of who the current chat is with, and helps the home user quickly perceive who they are chatting with, thereby saving the home user time.
In one possible implementation, the target user includes a home user of the target chat interface or the first user.
In this implementation, the avatar of the peer user in the target chat interface is acquired, the style information of the chat bubbles corresponding to the messages sent by the home user in the target chat interface is determined according to the peer user's avatar, and those chat bubbles are displayed according to the style information. In the chat interface between the home user and the peer user, the style of the chat bubbles corresponding to the messages sent by the home user can therefore be controlled based on the peer user's avatar, so that these chat bubbles are dynamically linked to the peer user's avatar; that is, their style changes as the peer user's avatar changes, which helps the home user quickly know to whom the messages are currently being sent.
In one possible implementation, the style information includes a color value of the target chat bubble;
the determining the style information of the target chat bubble in the target chat interface according to the head portrait comprises the following steps:
determining a color value of the avatar;
and determining the color value of the target chat bubble in the target chat interface according to the color value of the head portrait.
In this embodiment, the color value of the chat bubble to be displayed is controlled according to the color value of the first user's avatar, whereby the color of the chat bubble can be controlled and displayed based on the color of the user's avatar. The color of the chat bubble can be changed along with the head portrait replacement of the user, so that the color of the chat bubble which is consistent with the color of the head portrait of the user can be obtained, the chat bubble and the head portrait of the user can be more coordinated, and the chat interface can be richer.
In one possible implementation, the determining the color value of the avatar includes:
performing blur processing on the avatar to obtain a blurred avatar;
and determining the color value of the avatar according to the blurred avatar.
In this implementation, by determining the color value of the avatar from the blurred avatar, the determined color value can more accurately reflect the dominant hue of the avatar. Determining the color value of the target chat bubble based on the color value determined in this way enables the target chat bubble to be better coordinated with the avatar.
In one possible implementation manner, the determining the color value of the target chat bubble in the target chat interface according to the color value of the head portrait includes:
determining a first target color value adjacent to the color value of the head portrait from a plurality of preset target color values;
and determining the color value of the target chat bubble in the target chat interface according to the first target color value.
In this implementation manner, the color value of the target chat bubble in the target chat interface is determined according to the first target color value adjacent to the color value of the head portrait from the preset multiple target color values, so that the color which is more suitable for the chat bubble can be selected on the basis of the color of the head portrait of the user, and the user chat experience can be improved.
In one possible implementation manner, the determining the color value of the target chat bubble in the target chat interface according to the first target color value includes:
adjusting the first target color value to obtain a second target color value;
and determining the color value of the target chat bubble in the target chat interface according to the first target color value and the second target color value.
By adopting the implementation mode, the target chat bubble in the target chat interface achieves the effect of gradual change of color.
In one possible implementation manner, the determining the color value of the target chat bubble in the target chat interface according to the first target color value and the second target color value includes:
acquiring the position information of the target chat bubble in the target chat interface;
and determining the color value of the target chat bubble according to the first target color value, the second target color value and the position information, wherein the color value of the target chat bubble is between the first target color value and the second target color value.
According to the implementation mode, the target chat bubbles in the target chat interface can achieve a more natural gradient color effect.
According to an aspect of the present disclosure, there is provided a display control device of chat bubbles, including:
the acquisition module is used for acquiring the head portrait of the first user;
the determining module is used for determining style information of target chat bubbles in a target chat interface according to the head portrait, wherein the target chat interface represents the chat interface comprising the first user, and the target chat bubbles represent chat bubbles corresponding to messages sent by the target user in the target chat interface;
And the display control module is used for controlling and displaying the target chat bubbles according to the style information.
In one possible implementation, the first user comprises a peer user of the target chat interface.
In one possible implementation, the target user includes a home user of the target chat interface or the first user.
In one possible implementation, the style information includes a color value of the target chat bubble;
the determining module is used for:
determining a color value of the avatar;
and determining the color value of the target chat bubble in the target chat interface according to the color value of the head portrait.
In one possible implementation, the determining module is configured to:
performing blur processing on the avatar to obtain a blurred avatar;
and determining the color value of the avatar according to the blurred avatar.
In one possible implementation, the determining module is configured to:
determining a first target color value adjacent to the color value of the head portrait from a plurality of preset target color values;
and determining the color value of the target chat bubble in the target chat interface according to the first target color value.
In one possible implementation, the determining module is configured to:
adjusting the first target color value to obtain a second target color value;
and determining the color value of the target chat bubble in the target chat interface according to the first target color value and the second target color value.
In one possible implementation, the determining module is configured to:
acquiring the position information of the target chat bubble in the target chat interface;
and determining the color value of the target chat bubble according to the first target color value, the second target color value and the position information, wherein the color value of the target chat bubble is between the first target color value and the second target color value.
According to an aspect of the present disclosure, there is provided an electronic apparatus including: one or more processors; a memory for storing executable instructions; wherein the one or more processors are configured to invoke the executable instructions stored by the memory to perform the above-described method.
According to an aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the above-described method.
In the embodiment of the disclosure, the avatar of the first user is acquired, the style information of the target chat bubble in the target chat interface is determined according to the avatar, and the target chat bubble is displayed according to the style information, where the target chat interface represents a chat interface including the first user and the target chat bubble represents a chat bubble corresponding to a message sent by the target user in the target chat interface. The style in which chat bubbles are displayed can therefore be controlled based on the user's avatar, which improves the flexibility of chat bubble display control. That is, when the user changes the avatar, the chat bubble style changes accordingly, so that a chat bubble style matching the style of the user's avatar can be obtained, the chat bubble is better coordinated with the user's avatar, and the chat interface is richer. In addition, the embodiment of the disclosure can also realize differentiated and personalized chat bubbles, thereby improving the user's chat experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Other features and aspects of the present disclosure will become apparent from the following detailed description of exemplary embodiments, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the technical aspects of the disclosure.
Fig. 1 shows a flowchart of a chat bubble display control method provided by an embodiment of the present disclosure.
Fig. 2 is a schematic diagram of a layer of chat bubbles in a chat bubble display control method according to an embodiment of the disclosure.
Fig. 3 illustrates a schematic diagram of a chat bubble display control method according to an embodiment of the disclosure.
Fig. 4 illustrates another schematic diagram of a chat bubble display control method provided by an embodiment of the present disclosure.
Fig. 5 shows a block diagram of a display control apparatus for chat bubbles provided by an embodiment of the present disclosure.
Fig. 6 shows a block diagram of an electronic device 800 provided by an embodiment of the present disclosure.
Fig. 7 shows a block diagram of an electronic device 1900 provided by an embodiment of the disclosure.
Detailed Description
Various exemplary embodiments, features and aspects of the disclosure will be described in detail below with reference to the drawings. In the drawings, like reference numbers indicate identical or functionally similar elements. Although various aspects of the embodiments are illustrated in the accompanying drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
The word "exemplary" is used herein to mean "serving as an example, embodiment, or illustration. Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
The term "and/or" is herein merely an association relationship describing an associated object, meaning that there may be three relationships, e.g., a and/or B, may represent: a exists alone, A and B exist together, and B exists alone. In addition, the term "at least one" herein means any one of a plurality or any combination of at least two of a plurality, for example, including at least one of A, B, C, and may mean including any one or more elements selected from the group consisting of A, B and C.
Furthermore, numerous specific details are set forth in the following detailed description in order to provide a better understanding of the present disclosure. It will be understood by those skilled in the art that the present disclosure may be practiced without some of these specific details. In some instances, methods, means, elements, and circuits well known to those skilled in the art have not been described in detail in order not to obscure the present disclosure.
The chat bubble display control method and device, the electronic device, and the storage medium provided by the embodiments of the present disclosure can be applied to any software and/or website with a chat function. The software with a chat function may be mobile phone software with a chat function (also referred to as an APP, application program), tablet computer software with a chat function, PC (Personal Computer) software, or the like, which is not limited herein. For example, the mobile phone software with a chat function may include instant messaging software, short message software, a short video APP with a private message function, and so on. The website with a chat function may be a chat room, a social platform website with a private message function, etc., which is not limited herein.
The method provided by the embodiments of the present disclosure is described in detail below with reference to the accompanying drawings. Fig. 1 shows a flowchart of a chat bubble display control method provided by an embodiment of the present disclosure. In one possible implementation, the chat bubble display control method may be executed by a terminal device and/or a server. The terminal device may be user equipment (UE), a mobile device, a user terminal, a cellular phone, a cordless phone, a personal digital assistant (PDA), a handheld device, a computing device, a vehicle-mounted device, a wearable device, or the like. In some possible implementations, the chat bubble display control method may be implemented by a processor calling computer-readable instructions stored in a memory. As shown in fig. 1, the chat bubble display control method includes steps S11 to S13.
In step S11, an avatar of the first user is acquired.
In step S12, style information of a target chat bubble in a target chat interface is determined according to the avatar, where the target chat interface represents a chat interface including the first user, and the target chat bubble represents a chat bubble corresponding to a message sent by a target user in the target chat interface;
In step S13, the target chat bubble is controlled to be displayed according to the style information.
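As an illustrative, non-limiting sketch (not part of the claimed method), steps S11 to S13 could be organized on an iOS client roughly as follows. The type and method names here (ChatBubbleStyle, ChatBubbleStyleController, and so on) are assumptions introduced for illustration only; the disclosure does not prescribe any particular code structure.

```swift
import UIKit

// Hypothetical container for the style information determined in step S12.
struct ChatBubbleStyle {
    var bubbleColor: UIColor
}

final class ChatBubbleStyleController {

    // S11: acquire the avatar of the first user (e.g. from a cache or the server).
    func fetchAvatar(completion: @escaping (UIImage?) -> Void) {
        completion(UIImage(systemName: "person.circle"))  // placeholder avatar
    }

    // S12: determine style information of the target chat bubble from the avatar.
    // Here the style is reduced to a single color; later sections refine this into
    // dominant-color extraction and gradient colors.
    func style(forAvatar avatar: UIImage?) -> ChatBubbleStyle {
        guard avatar != nil else { return ChatBubbleStyle(bubbleColor: .systemGray6) }
        return ChatBubbleStyle(bubbleColor: .systemBlue)  // stand-in for the avatar-derived color
    }

    // S13: control display of the target chat bubble according to the style information.
    func apply(_ style: ChatBubbleStyle, to bubbleView: UIView) {
        bubbleView.backgroundColor = style.bubbleColor
        bubbleView.layer.cornerRadius = 12
    }
}
```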
In the disclosed embodiments, the first user may represent any user participating in a chat. The first user may comprise a peer user and/or a home user of the targeted chat interface. The avatar may represent an image on the social platform for identification. The avatar of the first user may be a custom avatar or a default avatar. The first user-defined head portrait can be a photograph of the first user, an avatar, or any image of a star character, a cartoon character, a scenic map, etc. that the first user likes. In the case where the first user does not customize the avatar, the first user's avatar may be a default avatar. In the chat interface of the first user with other users, an avatar of the first user may be displayed. For example, if the first user is the opposite end user of the target chat interface, the head portrait of the first user may be displayed on the left side, the upper side, or the inside of the chat bubble corresponding to the message sent by the first user. For another example, if the first user is the home terminal user of the target chat interface, the head portrait of the first user may be displayed on the right side, above or in the chat bubble corresponding to the message sent by the first user. The number of the opposite end users of the target chat interface can be one or more than two.
In the embodiment of the disclosure, the chat interface may be a communication interface of instant messaging software, a short message page, a private message page, and the like. The target chat interface includes a first user and may also include one or more other users. The target user may comprise any user in the target chat interface. The target users may include some or all of the users in the target chat interface. The number of target users may be one or more than two. For example, the target user may be one of the users of the target chat interface. In one possible implementation, the target user may comprise a first user. In another possible implementation, the target user may include some or all of the users in the target chat interface other than the first user.
In the embodiment of the disclosure, style information of chat bubbles corresponding to a message sent by a target user in a target chat interface is determined according to an avatar of a first user. Wherein, chat bubbles can represent a carrier of messages sent by users in the chat interface. That is, a message sent by the user may be displayed in the chat bubble. For example, chat bubbles corresponding to messages sent by the target users can represent carriers of messages sent by the target users in the target chat interface. In some application scenarios, chat bubbles may also be referred to as message bubbles, private letter bubbles, and the like, without limitation herein. In the embodiment of the present disclosure, the style information of the chat bubble may include at least one of a color value, a shape, a pattern, and the like of the chat bubble.
In the embodiment of the disclosure, the avatar of the first user is acquired, the style information of the target chat bubble in the target chat interface is determined according to the avatar, and the target chat bubble is displayed according to the style information, where the target chat interface represents a chat interface including the first user and the target chat bubble represents a chat bubble corresponding to a message sent by the target user in the target chat interface. The style in which chat bubbles are displayed can therefore be controlled based on the user's avatar, which improves the flexibility of chat bubble display control. That is, when the user changes the avatar, the chat bubble style changes accordingly, so that a chat bubble style matching the style of the user's avatar can be obtained, the chat bubble is better coordinated with the user's avatar, and the chat interface is richer. In addition, the embodiment of the disclosure can also realize differentiated and personalized chat bubbles, thereby improving the user's chat experience.
In one possible implementation, the first user comprises a peer user of the target chat interface. In this implementation, the avatar of the peer user of the target chat interface is acquired, the style information of the target chat bubble in the target chat interface is determined according to the peer user's avatar, and the target chat bubble is displayed according to the style information. In the chat interface between the home user and the peer user, the style of the chat bubbles can therefore be controlled based on the peer user's avatar, so that the chat bubbles in that chat interface are dynamically linked to the peer user's avatar. That is, the style (for example, the color) of the chat bubbles changes as the peer user's avatar changes, which adds interest, gives the home user a stronger sense of who the current chat is with, and helps the home user quickly perceive who they are chatting with, thereby saving the home user time.
As an example of this implementation, the target user includes the home user of the target chat interface. In this example, the avatar of the peer user in the target chat interface is acquired, the style information of the chat bubbles corresponding to the messages sent by the home user in the target chat interface is determined according to the peer user's avatar, and those chat bubbles are displayed according to the style information. In the chat interface between the home user and the peer user, the style of the chat bubbles corresponding to the messages sent by the home user can therefore be controlled based on the peer user's avatar, so that these chat bubbles are dynamically linked to the peer user's avatar; that is, their style changes as the peer user's avatar changes, which helps the home user quickly know to whom the messages are currently being sent.
As another example of this implementation, the target user includes the first user. In this example, the avatar of the peer user in the target chat interface is acquired, the style information of the chat bubbles corresponding to the messages sent by the peer user in the target chat interface is determined according to the peer user's avatar, and those chat bubbles are displayed according to the style information. In the chat interface between the home user and the peer user, the style of the chat bubbles corresponding to the messages sent by the peer user can therefore be controlled based on the peer user's avatar, so that these chat bubbles are dynamically linked to the peer user's avatar; that is, their style changes as the peer user's avatar changes, which helps the home user quickly recognize who is currently sending messages.
In another possible implementation, the first user comprises a home user of the target chat interface.
In another possible implementation, the first user includes a peer user and a home user of the target chat interface.
In one possible implementation, the style information includes a color value of the target chat bubble; the determining the style information of the target chat bubble in the target chat interface according to the head portrait comprises the following steps: determining a color value of the avatar; and determining the color value of the target chat bubble in the target chat interface according to the color value of the head portrait. In this implementation, the color value may represent a color value to which the color corresponds in the color mode. The color value of the head portrait can comprise one or more than two color values, and the color value of the target chat bubble can also comprise one or more than two color values. In this embodiment, the color value of the chat bubble to be displayed is controlled according to the color value of the first user's avatar, whereby the color of the chat bubble can be controlled and displayed based on the color of the user's avatar. The color of the chat bubble can be changed along with the head portrait replacement of the user, so that the color of the chat bubble which is consistent with the color of the head portrait of the user can be obtained, the chat bubble and the head portrait of the user can be more coordinated, and the chat interface can be richer.
As an example of this implementation, the determining the color value of the avatar includes: performing blur processing on the avatar to obtain a blurred avatar; and determining the color value of the avatar according to the blurred avatar. In this example, the avatar may be blurred by mean blurring, Gaussian blurring, or the like, to obtain a blurred image. By determining the color value of the avatar according to the blurred avatar, the determined color value can more accurately reflect the dominant hue of the avatar. Determining the color value of the target chat bubble based on the color value determined in this way enables the target chat bubble to be better coordinated with the avatar.
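For illustration only, a minimal Swift sketch of the blurring step is given below, using Core Image's CIGaussianBlur as one possible stand-in for the mean blur or Gaussian blur mentioned above; the use of Core Image and the blur radius are assumptions not fixed by the disclosure.

```swift
import UIKit
import CoreImage

// Blur the avatar so that the subsequently extracted color value better reflects
// its dominant hue.
func blurredAvatar(_ avatar: UIImage, radius: Double = 10) -> UIImage? {
    guard let cgImage = avatar.cgImage else { return nil }
    let input = CIImage(cgImage: cgImage)

    guard let filter = CIFilter(name: "CIGaussianBlur") else { return nil }
    filter.setValue(input, forKey: kCIInputImageKey)
    filter.setValue(radius, forKey: kCIInputRadiusKey)

    guard let output = filter.outputImage else { return nil }
    // Crop back to the original extent, since the Gaussian blur expands the image edges.
    let context = CIContext()
    guard let result = context.createCGImage(output, from: input.extent) else { return nil }
    return UIImage(cgImage: result)
}
```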
In one example, the determining the color value of the head portrait according to the head portrait after the blurring process includes: determining the area occupied by at least one color in the head portrait after the blurring process; and determining the color value of the head portrait according to the at least one color and the occupied area of the at least one color. For example, the areas occupied by the respective colors in the head portrait after the blurring process may be determined, and the color values of the head portrait may be determined according to the color value of one or more colors with the largest occupied area. The color value of the head portrait is determined according to at least one color and the occupied area of the at least one color in the head portrait after the blurring processing, and the determined color value of the head portrait can reflect the dominant hue of the head portrait more accurately. Determining the color value of the target chat bubble based on the color value of the avatar thus determined enables the target chat bubble to be more coordinated with the avatar.
For example, the determining the color value of the head portrait according to the at least one color and the area occupied by the at least one color includes: and determining the color value of the color with the largest occupied area in the head portrait after the blurring processing as the color value of the head portrait. In this example, by determining the color value of the color with the largest area occupied in the blurred head portrait as the color value of the head portrait, the determined color value of the head portrait can accurately reflect not only the dominant hue of the head portrait but also reduce the computational complexity.
As another example, the determining the color value of the head portrait according to the at least one color and the area occupied by the at least one color includes: and determining a weighted average value of the color values of M colors with the largest occupied area in the head portrait after the blurring processing as the color value of the head portrait, wherein M is an integer greater than 1. The weight corresponding to any color can be positively correlated with the area occupied by the color in the head portrait.
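For illustration only, the following Swift sketch shows one way to determine a color value from the areas occupied by colors in the (blurred) avatar: the image is downsampled into a small bitmap, the pixels are quantized into coarse color buckets, and the color value is taken from the bucket covering the largest area. The sample size and quantization step are assumptions; a weighted average of the top M buckets, weighted by their areas, could be computed analogously.

```swift
import UIKit

// Determine the avatar's color value as the quantized color occupying the largest area.
func dominantColor(of image: UIImage, sampleSize: Int = 32) -> UIColor? {
    guard let cgImage = image.cgImage else { return nil }

    var pixels = [UInt8](repeating: 0, count: sampleSize * sampleSize * 4)
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let drewSuccessfully = pixels.withUnsafeMutableBytes { (buffer) -> Bool in
        guard let context = CGContext(data: buffer.baseAddress,
                                      width: sampleSize, height: sampleSize,
                                      bitsPerComponent: 8, bytesPerRow: sampleSize * 4,
                                      space: colorSpace,
                                      bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
        else { return false }
        // Downsample the avatar into the small bitmap.
        context.draw(cgImage, in: CGRect(x: 0, y: 0, width: sampleSize, height: sampleSize))
        return true
    }
    guard drewSuccessfully else { return nil }

    // Count pixels per quantized color (each RGB channel reduced to 16 levels).
    var counts: [Int: Int] = [:]
    for i in stride(from: 0, to: pixels.count, by: 4) {
        let r = Int(pixels[i]) >> 4, g = Int(pixels[i + 1]) >> 4, b = Int(pixels[i + 2]) >> 4
        counts[(r << 8) | (g << 4) | b, default: 0] += 1
    }

    // The color occupying the largest area is used as the avatar's color value.
    guard let (key, _) = counts.max(by: { $0.value < $1.value }) else { return nil }
    let r = CGFloat((key >> 8) & 0xF) / 15.0
    let g = CGFloat((key >> 4) & 0xF) / 15.0
    let b = CGFloat(key & 0xF) / 15.0
    return UIColor(red: r, green: g, blue: b, alpha: 1)
}
```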
In another example, the determining the color value of the avatar according to the blurred avatar includes: determining the area proportion occupied by at least one color in the blurred avatar; and determining the color value of the avatar according to the at least one color and the area proportion occupied by the at least one color. In the process of determining the area proportions of the colors in the blurred avatar, if the area proportion of one color is determined to be more than 50%, the area proportions of the other colors need not be determined.
As another example of this implementation, the avatar may not need to be blurred. For example, the color value of the avatar may be determined directly from the area occupied by at least one color in the avatar and the at least one color. For example, the color value of the color having the largest area occupied in the head portrait may be determined as the color value of the head portrait. For another example, a weighted average of the color values of M colors with the largest area occupied in the head portrait may be determined as the color value of the head portrait.
As an example of this implementation, the determining, according to the color value of the avatar, the color value of the target chat bubble in the target chat interface includes: determining a first target color value adjacent to the color value of the head portrait from a plurality of preset target color values; and determining the color value of the target chat bubble in the target chat interface according to the first target color value. In this example, the preset plurality of target color values may be preset plurality of cleaner color values, which is not limited herein. The first target color value represents a target color value adjacent to the color value of the head portrait in a plurality of preset target color values. The color value of the target chat bubble in the target chat interface is determined according to the first target color value adjacent to the color value of the head portrait from the preset target color values, so that the color which is more suitable for the chat bubble can be selected on the basis of the color of the head portrait of the user, and the chat experience of the user can be improved.
In one example, the preset plurality of target color values include a preset plurality of first class target color values, where saturation values corresponding to the plurality of first class target color values are not 0; the determining a first target color value adjacent to the color value of the head portrait from a plurality of preset target color values comprises the following steps: and determining the adjacent color value of the head portrait in the anticlockwise direction as a first target color value from a hue circle formed by the plurality of first target color values under the condition that the saturation value corresponding to the color value of the head portrait does not belong to a preset interval, wherein the left boundary value of the preset interval is 0. In this example, the plurality of first class target color values may each be a color value. For example, the number of first class target color values may be 36, i.e., the first class target color values may form a 36-color ring. Of course, the number of the first class target color values can be flexibly set by a person skilled in the art according to the actual application scene requirement, and the method is not limited herein. For example, the number of first class target color values may also be 24, 48, etc. In this example, in a case where a saturation value corresponding to a color value of the head portrait does not belong to a preset interval, it may be determined that the head portrait has a color tendency; and under the condition that the saturation value corresponding to the color value of the head portrait belongs to a preset interval, determining that the head portrait does not have color tendency. For example, the preset interval may be [0,0]. Of course, a person skilled in the art can flexibly set the right boundary value of the preset interval according to the actual application scene requirement. For example, the right boundary value of the preset interval may be slightly greater than 0. In this example, in the case where the avatar has a color tendency, a color value close to the color value of the avatar may be selected from a plurality of preset color values as the first target color value, and thus a color more suitable for chat bubbles may be selected, and the color value of the target chat bubble thus determined may reflect the color tendency of the avatar.
In another example, the preset plurality of target color values include a preset plurality of first class target color values, where saturation values corresponding to the plurality of first class target color values are not 0; the determining a first target color value adjacent to the color value of the head portrait from a plurality of preset target color values comprises the following steps: and determining the adjacent color value in the clockwise direction of the color value of the head portrait as a first target color value from a hue circle formed by a plurality of first target color values under the condition that the saturation value corresponding to the color value of the head portrait does not belong to a preset interval, wherein the left boundary value of the preset interval is 0.
In another example, the preset plurality of target color values includes a preset plurality of second class target color values, wherein saturation values of the plurality of second class target color values are all 0; the determining a first target color value adjacent to the color value of the head portrait from a plurality of preset target color values comprises the following steps: and under the condition that the saturation value corresponding to the color value of the head portrait belongs to a preset interval, determining the color value closest to the color value of the head portrait in the plurality of second class target color values as a first target color value, wherein the left boundary value of the preset interval is 0. In this example, the plurality of second class target color values may each be a gray value. The number of second class target color values may be two or more. For example, the plurality of second class target color values includes #f7f7fa and #333333. In this example, in a case where a saturation value corresponding to a color value of the head portrait belongs to a preset section, it may be determined that the head portrait does not have a color tendency. In this example, in the case where the avatar does not have a color tendency, a gray value close to the color value of the avatar may be selected from a plurality of preset gray values as the first target color value, and thus a color more suitable for chat bubbles may be selected, and the color value of the target chat bubble thus determined may reflect the gray of the avatar.
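For illustration only, the following Swift sketch shows one way to select the first target color value: if the avatar's color value has a color tendency (its saturation falls outside the preset interval, assumed here to be [0, 0]), its hue is snapped to an adjacent entry on a preset 36-color hue ring; otherwise the closer of the preset grays #F7F7FA and #333333 is chosen. The ring orientation, the ring's saturation and brightness, and the brightness threshold for choosing between the grays are assumptions not fixed by the disclosure.

```swift
import UIKit

// Select the first target color value from preset target color values based on the
// avatar's color value.
func firstTargetColor(forAvatarColor avatarColor: UIColor) -> UIColor {
    var hue: CGFloat = 0, saturation: CGFloat = 0, brightness: CGFloat = 0, alpha: CGFloat = 0
    _ = avatarColor.getHue(&hue, saturation: &saturation, brightness: &brightness, alpha: &alpha)

    if saturation > 0 {
        // Hue ring of 36 evenly spaced colors; take the adjacent (lower-hue) entry,
        // standing in for the "adjacent color value in one direction of the ring".
        let steps: CGFloat = 36
        let snappedHue = (hue * steps).rounded(.down) / steps
        return UIColor(hue: snappedHue, saturation: 0.6, brightness: 0.95, alpha: 1)
    } else {
        // No color tendency: choose between the preset grays by brightness.
        let lightGray = UIColor(red: 0xF7 / 255.0, green: 0xF7 / 255.0, blue: 0xFA / 255.0, alpha: 1)
        let darkGray  = UIColor(red: 0x33 / 255.0, green: 0x33 / 255.0, blue: 0x33 / 255.0, alpha: 1)
        return brightness > 0.5 ? lightGray : darkGray
    }
}
```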
In one example, the determining the color value of the target chat bubble in the target chat interface according to the first target color value includes: adjusting the first target color value to obtain a second target color value; and determining the color value of the target chat bubble in the target chat interface according to the first target color value and the second target color value. In this example, the first target color value may be adjusted based on an HSB (Hue-Saturation-Brightness) color mode or an RGB (Red-Green-Blue) color mode, or the like, resulting in the second target color value. For example, at least one of hue, saturation, and brightness of the first target color value may be adjusted based on the HSB color mode to obtain the second target color value. For another example, at least one of the red channel value, the green channel value, and the blue channel value of the first target color value may be adjusted based on the RGB color pattern to obtain the second target color value. Wherein the second target color value is different from the first target color value. By employing this example, the target chat bubble in the target chat interface is thereby made to achieve an effect of gradual color change. Wherein, different target chat bubbles can have different colors, so that the effect of gradual change of color can be realized through a plurality of target chat bubbles in the target chat interface, and the effect of gradual change of color can also be realized in a single target chat bubble.
In one example, the adjusting the first target color value to obtain a second target color value includes: subtracting a preset value from the brightness of the first target color value to obtain a second target color value. For example, the preset value may be 20. Of course, the size of the preset value can be flexibly set by a person skilled in the art according to the actual application scene requirement, and the method is not limited herein. For example, if the first target color value is an RGB value, the first target color value may be converted to an HSB color mode to obtain a first HSB value, the brightness of the first HSB value is subtracted by 20 to obtain a second HSB value, and then the second HSB value is converted back to the RGB color mode to obtain a second target color value. In this example, the brightness of the first target color value is subtracted by a preset value to obtain a second target color value, and the color value of the target chat bubble in the target chat interface is determined according to the first target color value and the second target color value, so that the target chat bubble in the target chat interface can realize the effect of gradual change from bright to dark.
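For illustration only, a Swift sketch of this brightness adjustment follows. It assumes the example value of 20 refers to a 0 to 100 brightness scale, i.e. 0.2 in UIColor's 0 to 1 HSB representation; the disclosure does not fix the scale.

```swift
import UIKit

// Derive the second target color value by lowering the brightness of the first
// target color value in the HSB color mode.
func secondTargetColor(from firstTarget: UIColor, brightnessDrop: CGFloat = 0.2) -> UIColor {
    var hue: CGFloat = 0, saturation: CGFloat = 0, brightness: CGFloat = 0, alpha: CGFloat = 0
    _ = firstTarget.getHue(&hue, saturation: &saturation, brightness: &brightness, alpha: &alpha)
    return UIColor(hue: hue,
                   saturation: saturation,
                   brightness: max(0, brightness - brightnessDrop),  // darken, clamped at 0
                   alpha: alpha)
}
```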
In one example, the determining the color value of the target chat bubble in the target chat interface according to the first target color value and the second target color value includes: acquiring the position information of the target chat bubble in the target chat interface; and determining the color value of the target chat bubble according to the first target color value, the second target color value and the position information, wherein the color value of the target chat bubble is between the first target color value and the second target color value. The position information of the target chat bubble in the target chat interface may include position information of any point or any side of the target chat bubble in the target chat interface. For example, the location information of the target chat bubble in the target chat interface may include location information of at least one of a geometric center, an upper boundary, a lower boundary, etc. of the target chat bubble in the target chat interface. According to this example, a more natural gradient color effect can be achieved for the target chat bubble in the target chat interface.
In one example, the location information includes first location information of the upper boundary of the target chat bubble in the target chat interface and second location information of the lower boundary of the target chat bubble in the target chat interface; the color value of the target chat bubble comprises an initial value and a termination value of the gradient color of the target chat bubble; and the determining the color value of the target chat bubble according to the first target color value, the second target color value and the location information comprises: determining the initial value of the gradient color of the target chat bubble according to the first target color value, the second target color value and the first location information; and determining the termination value of the gradient color of the target chat bubble according to the first target color value, the second target color value and the second location information. For example, a color gradient from the second target color value to the first target color value from bottom to top may be implemented in the target chat interface. For another example, a color gradient from the first target color value to the second target color value from bottom to top may be implemented in the target chat interface.
The first location information may be an ordinate of an upper boundary of the target chat bubble, or may be a distance between the upper boundary of the target chat bubble and an upper boundary of the target chat interface, or may be a distance between the upper boundary of the target chat bubble and a lower boundary of the target chat interface, or the like. The second location information may be an ordinate of a lower boundary of the target chat bubble, or may be a distance between the lower boundary of the target chat bubble and an upper boundary of the target chat interface, or may be a distance between the lower boundary of the target chat bubble and the lower boundary of the target chat interface, or the like.
For example, a UITableView component may be employed to implement the function of presenting the chat interface. The method may be re-invoked from scrollViewDidScroll in response to the target chat interface sliding (e.g., a new chat bubble appears in the target chat interface or the user slides the target chat interface). For example, the target chat interface includes N target chat bubbles. Then, for any target chat bubble, the distance h1 between the upper boundary of the target chat bubble and the upper boundary of the target chat interface can be determined, and the distance h2 between the lower boundary of the target chat bubble and the upper boundary of the target chat interface can be determined. For example, if the height of the target chat interface is h0, the first target color value is C1, the second target color value is C2, and C2 - C1 = C0, then C1 + C0 × h1 / h0 can be determined as the initial value of the gradient color of the target chat bubble, and C1 + C0 × h2 / h0 can be determined as the termination value of the gradient color of the target chat bubble.
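For illustration only, the following Swift sketch performs the per-channel interpolation C1 + (C2 - C1) × h / h0 described above for the upper and lower boundaries of a bubble; recomputing it inside scrollViewDidScroll(_:) of a UITableView-based chat list is one possible integration point. The function names and the clamping to [0, 1] are assumptions.

```swift
import UIKit

// Interpolate between two colors per RGB channel.
func lerp(_ c1: UIColor, _ c2: UIColor, fraction t: CGFloat) -> UIColor {
    var r1: CGFloat = 0, g1: CGFloat = 0, b1: CGFloat = 0, a1: CGFloat = 0
    var r2: CGFloat = 0, g2: CGFloat = 0, b2: CGFloat = 0, a2: CGFloat = 0
    _ = c1.getRed(&r1, green: &g1, blue: &b1, alpha: &a1)
    _ = c2.getRed(&r2, green: &g2, blue: &b2, alpha: &a2)
    let t = min(max(t, 0), 1)
    return UIColor(red: r1 + (r2 - r1) * t,
                   green: g1 + (g2 - g1) * t,
                   blue: b1 + (b2 - b1) * t,
                   alpha: a1 + (a2 - a1) * t)
}

// Initial and termination values of a bubble's gradient, given the distances h1 and h2
// of its upper and lower boundaries from the top of a chat interface of height h0.
func gradientColors(firstTarget c1: UIColor, secondTarget c2: UIColor,
                    bubbleTop h1: CGFloat, bubbleBottom h2: CGFloat,
                    interfaceHeight h0: CGFloat) -> (start: UIColor, end: UIColor) {
    (start: lerp(c1, c2, fraction: h1 / h0),
     end:   lerp(c1, c2, fraction: h2 / h0))
}
```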
For example, for any target chat bubble, a gradient layer corresponding to the target chat bubble may be created using the CAGradientLayer provided by the QuartzCore library. The colors attribute of the gradient layer may be set to the first target color value and the second target color value. The locations attribute of the gradient layer may be set to an array of 0 and 1, indicating that the gradient spans from the beginning to the end of the gradient layer. The startPoint attribute of the gradient layer may be set to (0.5, 0), indicating that the gradient starts from the upper edge of the gradient layer; the endPoint attribute of the gradient layer may be set to (0.5, 1), indicating that the gradient ends at the lower edge of the gradient layer. The frame value of the gradient layer may be set to the size of the chat bubble.
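For illustration only, a Swift sketch of the CAGradientLayer configuration described above; the function name and parameters are assumptions.

```swift
import UIKit
import QuartzCore

// Build a vertical gradient layer for one chat bubble.
func makeBubbleGradientLayer(startColor: UIColor, endColor: UIColor,
                             bubbleBounds: CGRect) -> CAGradientLayer {
    let gradient = CAGradientLayer()
    gradient.colors = [startColor.cgColor, endColor.cgColor]  // first and second target colors
    gradient.locations = [0, 1]                               // gradient spans the whole layer
    gradient.startPoint = CGPoint(x: 0.5, y: 0)               // start at the upper edge
    gradient.endPoint = CGPoint(x: 0.5, y: 1)                 // end at the lower edge
    gradient.frame = bubbleBounds                             // match the chat bubble's size
    return gradient
}
```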
In the above example, by determining the initial value of the gradual change of the target chat bubble according to the first target color value, the second target color value and the first position information of the upper boundary of the target chat bubble in the target chat interface, and determining the final value of the gradual change of the target chat bubble according to the first target color value, the second target color value and the second position information of the lower boundary of the target chat bubble in the target chat interface, the gradual change effect can be realized in each target chat bubble of the target chat interface, and different colors can be presented along with the change of the position of the target chat bubble in the target chat interface.
Fig. 2 is a schematic diagram of the layers of a chat bubble in a chat bubble display control method according to an embodiment of the disclosure. As shown in fig. 2, the chat bubble may include a main layer 21, a gradient layer 22, and a text layer 23. In one example, the gradient layer 22 of the chat bubble may be added to the main layer 21 of the chat bubble by the addSublayer method of CALayer, and the gradient layer 22 is placed below the text layer 23. The gradient layer 22 and the main layer 21 may have the same size.
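For illustration only, a Swift sketch of the layer composition of Fig. 2, assuming the bubble is a UIView whose text is shown by a UILabel subview; these view types are assumptions.

```swift
import UIKit

// Attach the gradient layer to the bubble's main layer, keeping it below the text layer
// and giving it the same size as the main layer.
func attachGradient(_ gradient: CAGradientLayer, to bubbleView: UIView, below textLabel: UILabel) {
    gradient.frame = bubbleView.bounds                                  // same size as the main layer
    gradient.cornerRadius = bubbleView.layer.cornerRadius               // follow the bubble's rounding
    bubbleView.layer.insertSublayer(gradient, below: textLabel.layer)   // keep the text on top
}
```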
In another example, the determining the color value of the target chat bubble in the target chat interface according to the first target color value includes: and determining the first target color value as the color value of the target chat bubble. In this example, the target chat bubble may not be a gradient color. For example, when the saturation value corresponding to the color value of the avatar belongs to a preset interval, that is, when the avatar does not have a color tendency, the first target color value may be determined as the color value of the target chat bubble, so that the color value of the target chat bubble may be quickly determined.
As another example of this implementation, the color value of the target chat bubble includes a color value of a background of the target chat bubble and a color value of a character in the target chat bubble; the determining the color value of the target chat bubble in the target chat interface according to the color value of the head portrait comprises the following steps: determining the color value of the background of the target chat bubble in the target chat interface according to the color value of the head portrait; and determining the color value of the character in the target chat bubble according to the color value of the background of the target chat bubble. In this example, by determining the color value of the background of the target chat bubble in the target chat interface from the color value of the avatar, the color of the background of the chat bubble can be controlled based on the color of the avatar of the user. The color of the background of the chat bubble can be changed along with the replacement of the head portrait of the user, so that the background color of the chat bubble which is consistent with the color of the head portrait of the user can be obtained, the background of the chat bubble can be more coordinated with the head portrait of the user, and the chat interface can be richer. In addition, in this example, by determining the color value of the character in the target chat bubble from the color value of the background of the target chat bubble, the color of the character of the chat bubble can be more coordinated with the background of the chat bubble, and the character in the chat bubble can be made clearer.
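For illustration only, the following Swift sketch shows one way to derive the character color from the background color, using light text on dark backgrounds and dark text on light backgrounds; the brightness threshold is an assumption, since the disclosure only states that the character color is determined according to the background color.

```swift
import UIKit

// Choose a text color that stays legible against the bubble's background color.
func textColor(forBubbleBackground background: UIColor) -> UIColor {
    var white: CGFloat = 0, alpha: CGFloat = 0
    if background.getWhite(&white, alpha: &alpha) {
        return white > 0.6 ? .black : .white
    }
    // Fall back to HSB brightness for colors that cannot be expressed in grayscale.
    var hue: CGFloat = 0, saturation: CGFloat = 0, brightness: CGFloat = 0
    _ = background.getHue(&hue, saturation: &saturation, brightness: &brightness, alpha: &alpha)
    return brightness > 0.6 ? .black : .white
}
```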
As another example of this implementation, the color value of the target chat bubble includes a color value of a background of the target chat bubble and a color value of a character in the target chat bubble; the determining the color value of the target chat bubble in the target chat interface according to the color value of the head portrait comprises the following steps: determining the color value of the character in the target chat bubble in the target chat interface according to the color value of the head portrait; and determining the color value of the background of the target chat bubble according to the color value of the character in the target chat bubble.
As another example of this implementation, the determining the color value of the target chat bubble in the target chat interface according to the color value of the avatar includes: and determining the color value of the head portrait as the color value of the target chat bubble in the target chat interface.
As another example of this implementation, the color value of the avatar may be determined if the avatar is the avatar customized by the first user, and the color value of the target chat bubble in the target chat interface may be determined according to the color value of the avatar. In the case where the avatar is a default avatar (i.e., the avatar is not a first user-defined avatar), the color value of the target chat bubble may be determined to be a default color value. For example, the default color value may be #f7f7fa.
In another possible implementation, the style information includes a shape of the target chat bubble; the determining the style information of the target chat bubble in the target chat interface according to the head portrait comprises the following steps: and determining the shape of the target chat bubble in the target chat interface according to the head portrait. For example, the avatar may be input into a pre-trained neural network, a style of the avatar may be determined through the neural network, and a shape of a target chat bubble corresponding to the style of the avatar may be determined according to a preset first correspondence, where the first correspondence represents a correspondence between the style of the avatar and the shape of the chat bubble.
In another possible implementation, the style information includes a pattern of the target chat bubble; the determining the style information of the target chat bubble in the target chat interface according to the head portrait comprises the following steps: and determining the pattern of the target chat bubble in the target chat interface according to the head portrait. For example, the avatar may be input into a pre-trained neural network, a style of the avatar may be determined through the neural network, and a pattern of the target chat bubble corresponding to the style of the avatar may be determined according to a preset second correspondence, where the second correspondence represents a correspondence between the style of the avatar and the pattern of the chat bubble.
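For illustration only, the following Swift sketch represents the two correspondences (avatar style to chat bubble shape, and avatar style to chat bubble pattern) as lookup tables. The style names, shapes, pattern identifiers, and the classifier stub standing in for the pre-trained neural network are all assumptions; the disclosure does not specify them.

```swift
import UIKit

// Hypothetical avatar styles and bubble shapes.
enum AvatarStyle: String { case cute, business, sporty }
enum BubbleShape { case roundedRectangle, cloud, capsule }

// First correspondence: avatar style -> bubble shape.
let shapeForStyle: [AvatarStyle: BubbleShape] = [
    .cute: .cloud, .business: .roundedRectangle, .sporty: .capsule
]

// Second correspondence: avatar style -> bubble pattern (asset name).
let patternForStyle: [AvatarStyle: String] = [
    .cute: "bubble_pattern_hearts", .business: "bubble_pattern_plain", .sporty: "bubble_pattern_stripes"
]

// Hypothetical classifier; in practice this would run the pre-trained neural network.
func classifyAvatarStyle(_ avatar: UIImage) -> AvatarStyle {
    return .business
}

// Determine the shape and pattern of the target chat bubble from the avatar.
func bubbleShapeAndPattern(for avatar: UIImage) -> (shape: BubbleShape?, pattern: String?) {
    let style = classifyAvatarStyle(avatar)
    return (shapeForStyle[style], patternForStyle[style])
}
```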
In one possible implementation manner, the determining style information of the target chat bubble in the target chat interface according to the avatar includes: and determining style information of the target chat bubble in the target chat interface according to the head portrait and preset festival activity information. For example, the preset festival activity information may be christmas, american cups, etc., whereby the atmosphere of the festival and/or the activity can be enhanced.
In an embodiment of the present disclosure, the method for controlling display of chat bubbles may be used to control display of target chat bubbles in a target chat interface, where the target chat interface may include a home end user and a peer end user, that is, a user involved in the target chat interface may include the home end user and the peer end user, where the number of peer end users may be one or more than two. The execution main body of the chat bubble display control method can comprise at least one of a server, terminal equipment corresponding to a local end user and terminal equipment corresponding to an opposite end user. The terminal device corresponding to the home terminal user may represent a terminal device capable of being operated by the home terminal user, and the terminal device corresponding to the peer terminal user may represent a terminal device capable of being operated by the peer terminal user.
In one possible implementation, steps S11 to S13 may be performed by the server. In this implementation, step S13 may include: the server generates control information for controlling display of the target chat bubble according to the style information, and sends the control information to the terminal device corresponding to the home end user. The terminal device corresponding to the home end user may then display the target chat bubble in the target chat interface according to the control information.
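As an illustration of this server-side split, the sketch below packages style information into control information on the server and applies it on the home-end terminal. The field names, the JSON transport, and the render_bubble callback are assumptions made for the sake of the example, not details specified by the disclosure.

```python
# Sketch of the server/terminal split; field names and the JSON transport
# are assumptions made for this example.
import json


def build_control_info(chat_id: str, message_id: str, style_info: dict) -> str:
    """Server side: wrap the style information into control information."""
    return json.dumps({
        "chat_id": chat_id,
        "message_id": message_id,
        "bubble_style": style_info,  # e.g. a color value, shape, or pattern
    })


def apply_control_info(payload: str, render_bubble) -> None:
    """Home-end terminal: display the target chat bubble per the control info."""
    info = json.loads(payload)
    render_bubble(info["chat_id"], info["message_id"], info["bubble_style"])
```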
In another possible implementation, steps S11 to S13 may be performed by the terminal device corresponding to the home end user.
In another possible implementation, the server may acquire the avatar of the first user, determine the color value of the avatar, and send the color value of the avatar to the terminal device corresponding to the home end user. The terminal device corresponding to the home end user may determine the color value of the target chat bubble in the target chat interface according to the color value of the avatar, and control and display the target chat bubble according to the color value of the target chat bubble.
In another possible implementation, the terminal device corresponding to a peer end user may acquire the avatar of the first user, determine the color value of the avatar, and send the color value of the avatar to the terminal device corresponding to the home end user. The terminal device corresponding to the home end user may determine the color value of the target chat bubble in the target chat interface according to the color value of the avatar, and control and display the target chat bubble according to the color value of the target chat bubble.
It should be noted that, although the above four implementations describe the execution bodies of several steps in the embodiments of the present disclosure, those skilled in the art can understand that the present disclosure should not be limited thereto. Those skilled in the art may flexibly select the execution subject of each step according to the actual application scene requirement and/or personal preference, which is not limited herein.
The chat bubble display control method provided by the embodiment of the disclosure is described below through four specific application scenarios.
Application scenario one: the target chat interface includes a user A and a user B, where user A is the peer end user of the target chat interface and user B is the home end user of the target chat interface. The avatar of user A may be acquired, the style information B of the chat bubbles corresponding to messages sent by user B in the target chat interface may be determined according to the avatar of user A, and the chat bubbles corresponding to messages sent by user B in the target chat interface may be controlled and displayed according to the style information B. The chat bubbles corresponding to messages sent by user A in the target chat interface may be controlled and displayed according to default style information. For example, the color value B of the chat bubbles corresponding to messages sent by user B in the target chat interface may be determined according to the color value of the avatar of user A, and the chat bubbles corresponding to messages sent by user B may be controlled and displayed according to the color value B. The chat bubbles corresponding to messages sent by user A in the target chat interface may be controlled and displayed according to a default color value (for example, light gray).
Fig. 3 illustrates a schematic diagram of a chat bubble display control method according to an embodiment of the disclosure.
The target chat interface on the left side of fig. 3 includes a user A and a user B, where user A is the peer end user of the target chat interface and user B is the home end user of the target chat interface. The avatar of user A may be acquired and blurred to obtain a blurred avatar, and the color value of the avatar of user A may be determined from the blurred avatar. Here, the color value of the avatar of user A has an H value (hue) of 200, an S value (saturation) of 90, and a B value (brightness) of 65. Since the saturation value of the color value of the avatar of user A is not 0, the color value adjacent to the color value of the avatar of user A in the counterclockwise direction may be determined as the first target color value from the hue circle composed of the plurality of first target color values. The first target color value may be converted to the HSB color mode to obtain a first HSB value, where the first HSB value has an H value of 220, an S value of 100, and a B value of 100. Subtracting 20 from the S value of the first HSB value yields a second HSB value, whose H value is 220, S value is 80, and B value is 100. The second HSB value may be converted back to the RGB color mode to obtain the second target color value. For any target chat bubble corresponding to a message sent by user B in the target chat interface, C1 + C0 × h1/h0 may be determined as the initial value of the gradient color of the target chat bubble, and C1 + C0 × h2/h0 may be determined as the final value of the gradient color of the target chat bubble, where C1 represents the first target color value, C0 represents the difference between the second target color value and the first target color value, h0 represents the height of the target chat interface, h1 represents the distance between the upper boundary of the target chat bubble and the upper boundary of the target chat interface, and h2 represents the distance between the lower boundary of the target chat bubble and the upper boundary of the target chat interface.
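For illustration only, the following sketch expresses the color derivation above in Python, using the standard-library colorsys module (with HSV standing in for the HSB color mode). The blurring of the avatar and the hue-circle lookup that yield the first target color value are assumed to have been performed already, so the first target color value is simply passed in; the example heights at the end are arbitrary. This is a minimal sketch of the computation, not the implementation of the disclosure.

```python
# Minimal sketch of the color derivation above; not the implementation of the
# disclosure. HSV from the standard library is used as the HSB color mode.
import colorsys


def derive_second_target(rgb, s_delta=-20):
    """Convert an (R, G, B) color in 0-255 to HSB, add s_delta to S (0-100 scale),
    and convert back to RGB; used to obtain the second target color value."""
    r, g, b = (c / 255.0 for c in rgb)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)        # all components in [0, 1]
    s = min(max(s + s_delta / 100.0, 0.0), 1.0)   # e.g. S 100 -> 80
    return tuple(round(c * 255) for c in colorsys.hsv_to_rgb(h, s, v))


def gradient_endpoints(c1, c2, h0, h1, h2):
    """Per-channel start/end colors of one bubble's gradient.

    c1: first target color value, c2: second target color value (RGB, 0-255);
    h0: height of the target chat interface;
    h1 / h2: distance from the bubble's upper / lower boundary to the top of
    the interface.  Start = C1 + C0*h1/h0, end = C1 + C0*h2/h0, with C0 = c2 - c1.
    """
    c0 = [b - a for a, b in zip(c1, c2)]
    start = tuple(round(a + d * h1 / h0) for a, d in zip(c1, c0))
    end = tuple(round(a + d * h2 / h0) for a, d in zip(c1, c0))
    return start, end


# Example: a first target color with H=220, S=100, B=100, as in the left interface.
first_target = tuple(round(c * 255) for c in colorsys.hsv_to_rgb(220 / 360, 1.0, 1.0))
second_target = derive_second_target(first_target)   # S reduced from 100 to 80
print(gradient_endpoints(first_target, second_target, h0=2000, h1=300, h2=420))
```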
The target chat interface in the middle of fig. 3 includes a user A and a user B, where user A is the peer end user of the target chat interface and user B is the home end user of the target chat interface. The avatar of user A may be acquired and blurred to obtain a blurred avatar, and the color value of the avatar of user A may be determined from the blurred avatar. The color value of the avatar of user A has an H value of 265, an S value of 75, and a B value of 70. Since the saturation value of the color value of the avatar of user A is not 0, the color value adjacent to the color value of the avatar of user A in the counterclockwise direction may be determined as the first target color value from the hue circle composed of the plurality of first target color values. The first target color value may be converted to the HSB color mode to obtain a first HSB value, where the first HSB value has an H value of 260, an S value of 90, and a B value of 100. Subtracting 20 from the S value of the first HSB value yields a second HSB value, whose H value is 260, S value is 70, and B value is 100. The second HSB value may be converted back to the RGB color mode to obtain the second target color value. For any target chat bubble corresponding to a message sent by user B in the target chat interface, C1 + C0 × h1/h0 may be determined as the initial value of the gradient color of the target chat bubble, and C1 + C0 × h2/h0 may be determined as the final value of the gradient color of the target chat bubble, where C1, C0, h0, h1, and h2 are as defined above.
The target chat interface on the right side of fig. 3 includes a user A and a user B, where user A is the peer end user of the target chat interface and user B is the home end user of the target chat interface. The avatar of user A may be acquired and blurred to obtain a blurred avatar, and the color value of the avatar of user A may be determined from the blurred avatar. The color value of the avatar of user A has an H value of 7, an S value of 70, and a B value of 70. Since the saturation value of the color value of the avatar of user A is not 0, the color value adjacent to the color value of the avatar of user A in the counterclockwise direction may be determined as the first target color value from the hue circle composed of the plurality of first target color values. The first target color value may be converted to the HSB color mode to obtain a first HSB value, where the first HSB value has an H value of 20, an S value of 100, and a B value of 100. Subtracting 20 from the S value of the first HSB value yields a second HSB value, whose H value is 20, S value is 80, and B value is 100. The second HSB value may be converted back to the RGB color mode to obtain the second target color value. For any target chat bubble corresponding to a message sent by user B in the target chat interface, C1 + C0 × h1/h0 may be determined as the initial value of the gradient color of the target chat bubble, and C1 + C0 × h2/h0 may be determined as the final value of the gradient color of the target chat bubble, where C1, C0, h0, h1, and h2 are as defined above.
Fig. 4 illustrates another schematic diagram of a chat bubble display control method provided by an embodiment of the present disclosure.
The target chat interface on the left side of fig. 4 includes a user A and a user B, where user A is the peer end user of the target chat interface and user B is the home end user of the target chat interface. The avatar of user A may be acquired and the color value of the avatar of user A determined. In the case where the saturation value of the color value of the avatar of user A is 0, from the two second-class target color values #f7f7fa and #333333, the value #f7f7fa, which is closest to the color value of the avatar of user A, may be determined as the first target color value, and #f7f7fa may be determined as the color value of the chat bubbles corresponding to messages sent by user B in the target chat interface.
The target chat interface on the right side of fig. 4 includes a user A and a user B, where user A is the peer end user of the target chat interface and user B is the home end user of the target chat interface. The avatar of user A may be acquired and the color value of the avatar of user A determined. In the case where the saturation value of the color value of the avatar of user A is 0, from the two second-class target color values #f7f7fa and #333333, the value #333333, which is closest to the color value of the avatar of user A, may be determined as the first target color value, and #333333 may be determined as the color value of the chat bubbles corresponding to messages sent by user B in the target chat interface.
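For the zero-saturation branch shown in fig. 4, a small sketch of picking between the two second-class target color values follows. Measuring closeness by the B (brightness) component is an assumption made for this example; the disclosure does not fix the distance metric in this passage.

```python
# Sketch of the zero-saturation branch; the brightness-based notion of
# "closest" is an assumption made for this example.
import colorsys

SECOND_CLASS_TARGETS = ("#f7f7fa", "#333333")  # light and dark target color values


def hex_to_rgb(value: str):
    value = value.lstrip("#")
    return tuple(int(value[i:i + 2], 16) for i in (0, 2, 4))


def brightness(rgb):
    """B (brightness) component of the HSB model, on a 0-100 scale."""
    return colorsys.rgb_to_hsv(*(c / 255.0 for c in rgb))[2] * 100


def closest_second_class_target(avatar_rgb):
    """Pick the second-class target color whose brightness is closest to the avatar's."""
    b = brightness(avatar_rgb)
    return min(SECOND_CLASS_TARGETS, key=lambda t: abs(brightness(hex_to_rgb(t)) - b))


# A near-white, desaturated avatar picks the light bubble color value.
print(closest_second_class_target((230, 230, 232)))   # -> "#f7f7fa"
```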
Application scenario two: the target chat interface includes a user A and a user B, where user A is the peer end user of the target chat interface and user B is the home end user of the target chat interface. The avatar of user A may be acquired, the style information a of the chat bubbles corresponding to messages sent by user A in the target chat interface may be determined according to the avatar of user A, and the chat bubbles corresponding to messages sent by user A in the target chat interface may be controlled and displayed according to the style information a. The chat bubbles corresponding to messages sent by user B in the target chat interface may be controlled and displayed according to default style information. For example, the color value a of the chat bubbles corresponding to messages sent by user A may be determined according to the color value of the avatar of user A, and the chat bubbles corresponding to messages sent by user A may be controlled and displayed according to the color value a. The chat bubbles corresponding to messages sent by user B in the target chat interface may be controlled and displayed according to a default color value.
Application scenario three: the target chat interface includes a user A and a user B, where user A is the peer end user of the target chat interface and user B is the home end user of the target chat interface. The avatar of user A may be acquired, and the style information a of the chat bubbles corresponding to messages sent by user A in the target chat interface may be determined according to the avatar of user A; the avatar of user B may be acquired, and the style information B of the chat bubbles corresponding to messages sent by user B in the target chat interface may be determined according to the avatar of user B. The chat bubbles corresponding to messages sent by user A may be controlled and displayed according to the style information a, and the chat bubbles corresponding to messages sent by user B may be controlled and displayed according to the style information B. For example, the color value a of the chat bubbles corresponding to messages sent by user A may be determined according to the color value of the avatar of user A, and the color value B of the chat bubbles corresponding to messages sent by user B may be determined according to the color value of the avatar of user B. The chat bubbles corresponding to messages sent by user A may be controlled and displayed according to the color value a, and the chat bubbles corresponding to messages sent by user B may be controlled and displayed according to the color value B.
Application scenario four: the target chat interface includes a user A1, a user A2, and a user B, where user A1 and user A2 are peer end users of the target chat interface, and user B is the home end user of the target chat interface. The avatar of user A1 may be acquired, and the style information a1 of the chat bubbles corresponding to messages sent by user A1 in the target chat interface may be determined according to the avatar of user A1; the avatar of user A2 may be acquired, and the style information a2 of the chat bubbles corresponding to messages sent by user A2 in the target chat interface may be determined according to the avatar of user A2; the avatar of user B may be acquired, and the style information B of the chat bubbles corresponding to messages sent by user B in the target chat interface may be determined according to the avatar of user B. The chat bubbles corresponding to messages sent by user A1 may be controlled and displayed according to the style information a1; the chat bubbles corresponding to messages sent by user A2 may be controlled and displayed according to the style information a2; and the chat bubbles corresponding to messages sent by user B may be controlled and displayed according to the style information B. For example, the color value a1 of the chat bubbles corresponding to messages sent by user A1 may be determined according to the color value of the avatar of user A1; the color value a2 of the chat bubbles corresponding to messages sent by user A2 may be determined according to the color value of the avatar of user A2; and the color value B of the chat bubbles corresponding to messages sent by user B may be determined according to the color value of the avatar of user B. The chat bubbles corresponding to messages sent by user A1 may be controlled and displayed according to the color value a1; the chat bubbles corresponding to messages sent by user A2 may be controlled and displayed according to the color value a2; and the chat bubbles corresponding to messages sent by user B may be controlled and displayed according to the color value B.
It will be appreciated that the above-mentioned method embodiments of the present disclosure may be combined with each other to form combined embodiments without departing from the principle and logic, which are not described in detail here due to space limitations. It will be appreciated by those skilled in the art that, in the above methods of the embodiments, the specific execution order of the steps should be determined by their functions and possible inherent logic.
In addition, the disclosure further provides a chat bubble display control device, an electronic device, a computer readable storage medium, and a program, where the foregoing may be used to implement any of the chat bubble display control methods provided in the disclosure, and the corresponding technical schemes and technical effects may be referred to the corresponding descriptions of the method parts and are not repeated.
Fig. 5 shows a block diagram of a display control apparatus for chat bubbles provided by an embodiment of the present disclosure. As shown in fig. 5, the chat bubble display control device includes:
an acquiring module 51, configured to acquire an avatar of the first user;
a determining module 52, configured to determine style information of a target chat bubble in a target chat interface according to the avatar, where the target chat interface represents a chat interface including the first user, and the target chat bubble represents a chat bubble corresponding to a message sent by a target user in the target chat interface;
and the display control module 53 is configured to control and display the target chat bubble according to the style information.
In one possible implementation, the first user comprises a peer user of the targeted chat interface.
In one possible implementation, the target user includes a home user of the target chat interface or the first user.
In one possible implementation, the style information includes a color value of the target chat bubble;
the determining module 52 is configured to:
determining a color value of the avatar;
and determining the color value of the target chat bubble in the target chat interface according to the color value of the head portrait.
In one possible implementation, the determining module 52 is configured to:
performing blurring processing on the head portrait to obtain a head portrait after the blurring processing;
and determining the color value of the head portrait according to the head portrait after the blurring processing.
In one possible implementation, the determining module 52 is configured to:
determining a first target color value adjacent to the color value of the head portrait from a plurality of preset target color values;
and determining the color value of the target chat bubble in the target chat interface according to the first target color value.
In one possible implementation, the determining module 52 is configured to:
adjusting the first target color value to obtain a second target color value;
and determining the color value of the target chat bubble in the target chat interface according to the first target color value and the second target color value.
In one possible implementation, the determining module 52 is configured to:
acquiring the position information of the target chat bubble in the target chat interface;
and determining the color value of the target chat bubble according to the first target color value, the second target color value and the position information, wherein the color value of the target chat bubble is between the first target color value and the second target color value.
In the embodiment of the disclosure, the style information of the target chat bubble in the target chat interface is determined by acquiring the head portrait of the first user, and the target chat bubble is controlled to be displayed according to the style information, wherein the target chat interface represents the chat interface comprising the first user, and the target chat bubble represents the chat bubble corresponding to the message sent by the target user in the target chat interface, so that the style of the chat bubble can be controlled to be displayed based on the head portrait of the user, and the flexibility of the display control of the chat bubble is improved. That is, as the user changes the avatar, the chat bubble pattern can be changed accordingly, so that the chat bubble pattern conforming to the style of the user avatar can be obtained, the chat bubble can be more coordinated with the avatar of the user, and the chat interface can be richer. In addition, the embodiment of the disclosure can also realize differentiated and personalized chat bubbles, so that the chat experience of the user can be improved.
In some embodiments, functions or modules included in an apparatus provided by the embodiments of the present disclosure may be used to perform a method described in the foregoing method embodiments, and specific implementation and technical effects of the functions or modules may refer to the descriptions of the foregoing method embodiments, which are not repeated herein for brevity.
The disclosed embodiments also provide a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the above-described method. Wherein the computer readable storage medium may be a non-volatile computer readable storage medium or may be a volatile computer readable storage medium.
The disclosed embodiments also propose a computer program comprising computer readable code which, when run in an electronic device, causes a processor in the electronic device to carry out the above method.
Embodiments of the present disclosure also provide a computer program product comprising computer readable code, or a non-transitory computer readable storage medium carrying computer readable code, which when run in an electronic device, causes a processor in the electronic device to perform the above method.
The embodiment of the disclosure also provides an electronic device, including: one or more processors; a memory for storing executable instructions; wherein the one or more processors are configured to invoke the executable instructions stored by the memory to perform the above-described method.
The electronic device may be provided as a terminal, server or other form of device.
Fig. 6 shows a block diagram of an electronic device 800 provided by an embodiment of the present disclosure. For example, electronic device 800 may be a mobile phone, computer, digital broadcast terminal, messaging device, game console, tablet device, medical device, exercise device, personal digital assistant, or the like.
Referring to fig. 6, an electronic device 800 may include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input/output (I/O) interface 812, a sensor component 814, and a communication component 816.
The processing component 802 generally controls overall operation of the electronic device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 802 may include one or more processors 820 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interactions between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the electronic device 800. Examples of such data include instructions for any application or method operating on the electronic device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power supply component 806 provides power to the various components of the electronic device 800. The power components 806 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the electronic device 800.
The multimedia component 808 includes a screen between the electronic device 800 and the user that provides an output interface. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from a user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or slide action, but also the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front camera and/or a rear camera. When the electronic device 800 is in an operational mode, such as a shooting mode or a video mode, the front camera and/or the rear camera may receive external multimedia data. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capabilities.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the electronic device 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 further includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be a keyboard, click wheel, buttons, etc. These buttons may include, but are not limited to: homepage button, volume button, start button, and lock button.
The sensor assembly 814 includes one or more sensors for providing status assessment of various aspects of the electronic device 800. For example, the sensor assembly 814 may detect an on/off state of the electronic device 800 and the relative positioning of components, such as the display and keypad of the electronic device 800. The sensor assembly 814 may also detect a change in position of the electronic device 800 or a component of the electronic device 800, the presence or absence of a user's contact with the electronic device 800, an orientation or acceleration/deceleration of the electronic device 800, and a change in temperature of the electronic device 800. The sensor assembly 814 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 814 may also include a photosensor, such as a Complementary Metal Oxide Semiconductor (CMOS) or Charge Coupled Device (CCD) image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscopic sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate communication between the electronic device 800 and other devices, either wired or wireless. The electronic device 800 may access a wireless network based on a communication standard, such as a wireless network (Wi-Fi), a second generation mobile communication technology (2G), a third generation mobile communication technology (3G), a fourth generation mobile communication technology (4G)/long term evolution of universal mobile communication technology (LTE), a fifth generation mobile communication technology (5G), or a combination thereof. In one exemplary embodiment, the communication component 816 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, ultra Wideband (UWB) technology, bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the electronic device 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), digital Signal Processors (DSPs), digital Signal Processing Devices (DSPDs), programmable Logic Devices (PLDs), field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements for executing the methods described above.
In an exemplary embodiment, a non-transitory computer readable storage medium is also provided, such as memory 804 including computer program instructions executable by processor 820 of electronic device 800 to perform the above-described methods.
Fig. 7 shows a block diagram of an electronic device 1900 provided by an embodiment of the disclosure. For example, electronic device 1900 may be provided as a server. Referring to FIG. 7, electronic device 1900 includes a processing component 1922 that further includes one or more processors and memory resources represented by memory 1932 for storing instructions, such as application programs, that can be executed by processing component 1922. The application programs stored in memory 1932 may include one or more modules each corresponding to a set of instructions. Further, processing component 1922 is configured to execute instructions to perform the methods described above.
The electronic device 1900 may further include a power component 1926 configured to perform power management of the electronic device 1900, a wired or wireless network interface 1950 configured to connect the electronic device 1900 to a network, and an input/output (I/O) interface 1958. The electronic device 1900 may operate based on an operating system stored in the memory 1932, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
In an exemplary embodiment, a non-transitory computer readable storage medium is also provided, such as memory 1932, including computer program instructions executable by processing component 1922 of electronic device 1900 to perform the methods described above.
The present disclosure may be a system, method, and/or computer program product. The computer program product may include a computer readable storage medium having computer readable program instructions embodied thereon for causing a processor to implement aspects of the present disclosure.
The computer readable storage medium may be a tangible device that can hold and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a static random access memory (SRAM), a portable compact disk read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanical coding device such as a punch card or a raised structure in a groove having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media, as used herein, are not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through waveguides or other transmission media (e.g., optical pulses through fiber optic cables), or electrical signals transmitted through wires.
The computer readable program instructions described herein may be downloaded from a computer readable storage medium to a respective computing/processing device or to an external computer or external storage device over a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmissions, wireless transmissions, routers, firewalls, switches, gateway computers and/or edge servers. The network interface card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium in the respective computing/processing device.
Computer program instructions for performing the operations of the present disclosure can be assembly instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk or C++ and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer readable program instructions may be executed entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, aspects of the present disclosure are implemented by personalizing electronic circuitry, such as programmable logic circuitry, Field Programmable Gate Arrays (FPGAs), or Programmable Logic Arrays (PLAs), with state information of computer readable program instructions, which can execute the computer readable program instructions.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable medium having the instructions stored therein includes an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The computer program product may be realized in particular by means of hardware, software or a combination thereof. In an alternative embodiment, the computer program product is embodied as a computer storage medium, and in another alternative embodiment, the computer program product is embodied as a software product, such as a software development kit (Software Development Kit, SDK), or the like.
The foregoing description of the embodiments of the present disclosure has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the various embodiments described. The terminology used herein was chosen in order to best explain the principles of the embodiments, the practical application, or the improvement of technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (11)

1. A display control method of chat bubbles, comprising:
acquiring an avatar of a first user;
determining style information of target chat bubbles in a target chat interface according to the head portraits, wherein the target chat interface represents the chat interface comprising the first user, and the target chat bubbles represent chat bubbles corresponding to messages sent by the target users in the target chat interface;
And controlling and displaying the target chat bubble according to the style information.
2. The method of claim 1, wherein the first user comprises a peer user of the targeted chat interface.
3. The method of claim 2, wherein the target user comprises a home end user of the target chat interface or the first user.
4. A method according to any one of claims 1 to 3, wherein the style information comprises a color value of the target chat bubble;
the determining the style information of the target chat bubble in the target chat interface according to the head portrait comprises the following steps:
determining a color value of the avatar;
and determining the color value of the target chat bubble in the target chat interface according to the color value of the head portrait.
5. The method of claim 4, wherein said determining the color value of the avatar comprises:
performing blurring processing on the head portrait to obtain a head portrait after the blurring processing;
and determining the color value of the head portrait according to the head portrait after the blurring processing.
6. The method of claim 4 or 5, wherein determining the color value of the target chat bubble in the target chat interface based on the color value of the avatar comprises:
Determining a first target color value adjacent to the color value of the head portrait from a plurality of preset target color values;
determining the color value of the target chat bubble in the target chat interface according to the first target color value;
the preset target color values comprise preset target color values of a plurality of first types, wherein saturation values corresponding to the target color values of the plurality of first types are not 0; the determining a first target color value adjacent to the color value of the head portrait from a plurality of preset target color values comprises the following steps: determining adjacent color values in the anticlockwise direction or the clockwise direction of the color value of the head portrait as first target color values in a hue circle formed by the plurality of first target color values under the condition that the saturation value corresponding to the color value of the head portrait does not belong to a preset interval, wherein the left boundary value of the preset interval is 0;
or,
the preset target color values comprise preset target color values of a plurality of second types, wherein the saturation values of the target color values of the plurality of second types are all 0; the determining a first target color value adjacent to the color value of the head portrait from a plurality of preset target color values comprises the following steps: and under the condition that the saturation value corresponding to the color value of the head portrait belongs to a preset interval, determining the color value closest to the color value of the head portrait in the plurality of second class target color values as a first target color value, wherein the left boundary value of the preset interval is 0.
7. The method of claim 6, wherein determining the color value of the target chat bubble in the target chat interface based on the first target color value comprises:
adjusting the first target color value to obtain a second target color value;
and determining the color value of the target chat bubble in the target chat interface according to the first target color value and the second target color value.
8. The method of claim 7, wherein determining the color value of the target chat bubble in the target chat interface based on the first target color value and the second target color value comprises:
acquiring the position information of the target chat bubble in the target chat interface;
and determining the color value of the target chat bubble according to the first target color value, the second target color value and the position information, wherein the color value of the target chat bubble is between the first target color value and the second target color value.
9. A display control device for chat bubbles, comprising:
the acquisition module is used for acquiring the head portrait of the first user;
the determining module is used for determining style information of target chat bubbles in a target chat interface according to the head portrait, wherein the target chat interface represents the chat interface comprising the first user, and the target chat bubbles represent chat bubbles corresponding to messages sent by the target user in the target chat interface;
And the display control module is used for controlling and displaying the target chat bubbles according to the style information.
10. An electronic device, comprising:
one or more processors;
a memory for storing executable instructions;
wherein the one or more processors are configured to invoke the memory-stored executable instructions to perform the method of any of claims 1 to 8.
11. A computer readable storage medium having stored thereon computer program instructions, which when executed by a processor, implement the method of any of claims 1 to 8.
CN202110858048.6A 2021-07-28 2021-07-28 Chat bubble display control method and device, electronic equipment and storage medium Active CN113535310B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110858048.6A CN113535310B (en) 2021-07-28 2021-07-28 Chat bubble display control method and device, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN113535310A CN113535310A (en) 2021-10-22
CN113535310B (en) 2024-04-09

Family

ID=78121252

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110858048.6A Active CN113535310B (en) 2021-07-28 2021-07-28 Chat bubble display control method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113535310B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104317869A (en) * 2014-10-17 2015-01-28 小米科技有限责任公司 Instant-messaging-based background setting method and device
FR3026880A1 (en) * 2014-10-01 2016-04-08 Le Tiroir Jaune METHOD FOR PRODUCING PHOTOGRAPHIC RECITS
CN106095403A (en) * 2016-05-30 2016-11-09 努比亚技术有限公司 The exhibiting device of chat message and method
CN112667118A (en) * 2020-12-29 2021-04-16 上海掌门科技有限公司 Method, apparatus and computer readable medium for displaying historical chat messages
CN112866084A (en) * 2020-12-31 2021-05-28 上海掌门科技有限公司 Virtual resource processing method, equipment and computer readable medium for chat group

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9965132B2 (en) * 2007-06-08 2018-05-08 Apple Inc. Presenting text messages

Also Published As

Publication number Publication date
CN113535310A (en) 2021-10-22

Similar Documents

Publication Publication Date Title
US10152207B2 (en) Method and device for changing emoticons in a chat interface
EP3125547A1 (en) Method and device for switching color gamut mode
CN112801916A (en) Image processing method and device, electronic equipment and storage medium
CN104317869B (en) Background setting method and device based on instant messaging
CN112767285A (en) Image processing method and device, electronic equipment and storage medium
US10963149B2 (en) Parameter adjustment method, apparatus and storage medium
CN110944230B (en) Video special effect adding method and device, electronic equipment and storage medium
CN110619610B (en) Image processing method and device
CN112465843A (en) Image segmentation method and device, electronic equipment and storage medium
CN108122195B (en) Picture processing method and device
CN107967459B (en) Convolution processing method, convolution processing device and storage medium
CN111815750A (en) Method and device for polishing image, electronic equipment and storage medium
US20140235295A1 (en) Incoming call processing method of mobile terminal, mobile terminal and storage medium
CN113194254A (en) Image shooting method and device, electronic equipment and storage medium
CN110415258B (en) Image processing method and device, electronic equipment and storage medium
CN105677352B (en) Method and device for setting application icon color
CN107527072B (en) Method and device for determining similar head portrait and electronic equipment
CN111935418B (en) Video processing method and device, electronic equipment and storage medium
CN113450431B (en) Virtual hair dyeing method, device, electronic equipment and storage medium
CN111861942A (en) Noise reduction method and device, electronic equipment and storage medium
CN112669233A (en) Image processing method, image processing apparatus, electronic device, storage medium, and program product
CN113535310B (en) Chat bubble display control method and device, electronic equipment and storage medium
WO2014127652A1 (en) Incoming call processing method of mobile terminal, mobile terminal and storage medium
CN111583144B (en) Image noise reduction method and device, electronic equipment and storage medium
CN113763287A (en) Image processing method and device, electronic equipment and storage medium

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant