WO2022042774A1 - Avatar display method and electronic device - Google Patents

Avatar display method and electronic device

Info

Publication number
WO2022042774A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
chat group
avatar
display area
activity
Prior art date
Application number
PCT/CN2021/125916
Other languages
English (en)
Chinese (zh)
Inventor
王龙
Original Assignee
荣耀终端有限公司
Priority date
Filing date
Publication date
Application filed by 荣耀终端有限公司 filed Critical 荣耀终端有限公司
Publication of WO2022042774A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00 User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/04 Real-time or near real-time messaging, e.g. instant messaging [IM]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00 Data switching networks
    • H04L12/02 Details
    • H04L12/16 Arrangements for providing special services to substations
    • H04L12/18 Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00 Data switching networks
    • H04L12/02 Details
    • H04L12/16 Arrangements for providing special services to substations
    • H04L12/18 Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
    • H04L12/1813 Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/02 Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]

Definitions

  • the present application relates to the technical field of electronic devices, and in particular, to an avatar display method and electronic device.
  • the instant messaging tool supports the establishment of a chat group, and multiple users in the chat group can conduct multi-person chat in the chat group.
  • a chat group in an instant messaging tool is usually displayed as a conversation list including avatars and texts, and the avatar of the chat group is composed of user avatars of users in the chat group.
  • the conversation list in the instant messenger includes the conversation list of the chat group and the conversation list of the individual chat.
  • the avatar of the chat group can only be used to quickly distinguish the chat group from the individual chat, and the avatar of the chat group cannot be used for quick identification between multiple chat groups. In this way, the recognition efficiency between chat groups is low, and the user experience is poor.
  • the present application provides an avatar display method and electronic device, which improve the identification efficiency between chat groups and improve user experience.
  • the present application provides a method for displaying an avatar, including: acquiring the activity of a first user of a chat group in an instant messaging tool in the chat group; determining, according to the activity, the size of the display area of the user avatar of the first user in the chat group; and displaying, according to the size of the display area, the user avatar of the first user in the chat group.
  • the activity degree of the user in the chat group is obtained, and the size of the display area of the user's avatar in the chat group is determined according to the activity degree, thereby realizing intelligent adjustment of the display area size of the user avatars in the chat group.
  • the sizes of the display areas of the user avatars determined according to the activity in different chat groups are different, so the avatars of the chat groups differ considerably, and users can use the avatars of the chat groups to quickly identify the chat group they are looking for. In this way, the identification efficiency between chat groups is improved, and the user experience is improved.
  • the activity of the first user in the chat group includes the frequency of the first user sending messages in the chat group and/or the interaction frequency of the first user and the second user in the chat group, wherein the second user is an application user of the instant messaging tool.
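To make the activity notion above concrete, here is a minimal, illustrative sketch (not part of the publication) of how such a score could be derived from the two signals named in this paragraph; the class, field names, and equal weighting are assumptions. The document's examples below use Kotlin.

```kotlin
// Hypothetical sketch: one way to derive an activity score for a chat-group member
// from the two signals named above (the member's own message frequency and the
// interaction frequency with the application user). Names and weights are assumptions.
data class MemberStats(
    val userId: String,
    val messagesSent: Int,            // messages the member sent in the chat group
    val interactionsWithAppUser: Int  // replies/mentions exchanged with the app user
)

fun activityScore(stats: MemberStats, periodDays: Int): Double {
    val messageFrequency = stats.messagesSent.toDouble() / periodDays
    val interactionFrequency = stats.interactionsWithAppUser.toDouble() / periodDays
    // Either signal alone, or a combination, matches the "and/or" wording in the text.
    return 0.5 * messageFrequency + 0.5 * interactionFrequency
}
```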
  • the determining, according to the activity, the size of the display area of the user avatar of the first user in the chat group includes: when the activity of the first user in the chat group is the highest, determining that the display area of the user avatar of the first user in the chat group is the largest.
  • the determining, according to the activity, the size of the display area of the user avatar of the first user in the chat group further includes: when the activity of the first user in the chat group is not the highest, determining that the display area of the user avatar of the first user in the chat group is the same size as the display area of the user avatar of a third user in the chat group, wherein the third user is another user in the chat group other than the user with the highest activity.
  • the determining, according to the activity, the size of the display area of the user avatar of the first user in the chat group further includes: when the activity of the first user in the chat group is not the highest, determining that the display area of the user avatar of the first user in the chat group is larger than the display area of the user avatar of a fourth user in the chat group, wherein the fourth user is a user whose activity in the chat group is lower than that of the first user.
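As a non-authoritative illustration of the size rule described in the preceding paragraphs, the sketch below gives the most active member the largest tile and every other member an equal, smaller tile; the concrete dp values are assumptions.

```kotlin
// Hypothetical sketch of the size rule above: the most active member gets the
// largest tile, all other members get an equal, smaller tile. Sizes are illustrative.
fun avatarSizesDp(scores: Map<String, Double>): Map<String, Int> {
    // Find the member with the highest activity score; an empty group yields no sizes.
    val topUser = scores.entries.maxByOrNull { it.value }?.key ?: return emptyMap()
    return scores.mapValues { (userId, _) ->
        if (userId == topUser) 48 else 24   // most active member gets the largest area
    }
}
```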
  • the method further includes: determining, according to the activity, an arrangement order of the user avatar of the first user in the chat group; and displaying, according to the arrangement order, the user avatar of the first user in the chat group.
  • the determining, according to the activity, the arrangement order of the user avatar of the first user in the chat group includes: when the activity of the first user in the chat group is the highest, determining that the user avatar of the first user in the chat group is arranged first.
  • before the determining, according to the activity, the size of the display area of the user avatar of the first user in the chat group, the method further includes: confirming that the activity of the first user in the chat group is not less than a preset activity threshold.
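The optional ordering and threshold steps above could look roughly like the following sketch; the threshold value, and ordering all members by descending activity beyond "most active first", are assumptions.

```kotlin
// Hypothetical sketch combining the two optional steps above: drop members whose
// activity is below a preset threshold, then order the remaining avatars by
// descending activity so the most active member is placed first.
fun orderedAvatarUsers(scores: Map<String, Double>, threshold: Double = 0.1): List<String> =
    scores.filterValues { it >= threshold }
          .entries
          .sortedByDescending { it.value }
          .map { it.key }
```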
  • the present application provides an electronic device, comprising one or more processors and a memory; the memory is coupled to the one or more processors and is used for storing computer program code, the computer program code includes computer instructions, and the one or more processors execute the computer instructions to perform: obtaining the activity of a first user of a chat group in an instant messaging tool in the chat group; determining, according to the activity, the size of the display area of the user avatar of the first user in the chat group; and displaying, according to the size of the display area, the user avatar of the first user in the chat group.
  • the activity degree of the user in the chat group is obtained, and the size of the display area of the user's avatar in the chat group is determined according to the activity degree, thereby realizing intelligent adjustment of the display area size of the user avatars in the chat group.
  • the sizes of the display areas of the user avatars determined according to the activity in different chat groups are different, so the avatars of the chat groups differ considerably, and users can use the avatars of the chat groups to quickly identify the chat group they are looking for. In this way, the identification efficiency between chat groups is improved, and the user experience is improved.
  • the activity of the first user in the chat group includes the frequency with which the first user sends messages in the chat group and/or the interaction frequency of the first user and the second user in the chat group, wherein the second user is an application user of the instant messaging tool.
  • when the processor is configured to determine, according to the activity, the size of the display area of the user avatar of the first user in the chat group, the processor is specifically configured to: when the activity of the first user in the chat group is the highest, determine that the display area of the user avatar of the first user in the chat group is the largest.
  • when the processor is configured to determine, according to the activity, the size of the display area of the user avatar of the first user in the chat group, the processor is further configured to: when the activity of the first user in the chat group is not the highest, determine that the display area of the user avatar of the first user in the chat group is the same size as the display area of the user avatar of a third user in the chat group, wherein the third user is another user in the chat group other than the user with the highest activity.
  • when the processor is configured to determine, according to the activity, the size of the display area of the user avatar of the first user in the chat group, the processor is further configured to: when the activity of the first user in the chat group is not the highest, determine that the display area of the user avatar of the first user in the chat group is larger than the display area of the user avatar of a fourth user in the chat group, wherein the fourth user is a user whose activity in the chat group is lower than that of the first user.
  • the processor is further configured to: determine, according to the activity, an arrangement order of the user avatar of the first user in the chat group; and display, according to the arrangement order, the user avatar of the first user in the chat group.
  • when the processor is configured to determine, according to the activity, the arrangement order of the user avatar of the first user in the chat group, the processor is specifically configured to: when the activity of the first user in the chat group is the highest, determine that the user avatar of the first user in the chat group is arranged first.
  • before the processor is configured to determine, according to the activity, the size of the display area of the user avatar of the first user in the chat group, the processor is further configured to: confirm that the activity of the first user in the chat group is not less than a preset activity threshold.
  • the present application provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and the computer program includes program instructions; when the program instructions are executed on an electronic device, the electronic device performs the method in the first aspect or any possible implementation of the first aspect.
  • the present application provides a computer program product that, when the computer program product runs on an electronic device, causes the electronic device to perform the method in the first aspect or any possible implementation manner of the first aspect.
  • FIG. 1 is a schematic diagram of a user interface of an instant messaging tool.
  • FIG. 2 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 3 is a block diagram of a software structure of an electronic device provided by an embodiment of the present invention.
  • FIG. 4 is a schematic diagram of an avatar display method provided by an embodiment of the present application.
  • FIG. 5 is a schematic diagram of a user interface of an instant messaging tool provided by an embodiment of the present application.
  • FIG. 6a is a schematic diagram of user avatars of users in a chat group provided by an embodiment of the present application.
  • FIG. 6b is a schematic diagram of other user avatars of users in a chat group provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of another avatar display method provided by an embodiment of the present application.
  • FIG. 8 is a schematic diagram of a user interface of another instant messaging tool provided by an embodiment of the present application.
  • FIG. 9 is a schematic diagram of other user avatars in a chat group provided by an embodiment of the present application.
  • FIG. 10 is a schematic diagram of another avatar display method provided by an embodiment of the present application.
  • FIG. 11a is a schematic diagram of other user avatars in a chat group provided by an embodiment of the present application.
  • FIG. 11b is a schematic diagram of another user avatar of a user in a chat group provided by an embodiment of the present application.
  • the electronic devices involved in the embodiments of the present application may be mobile phones, tablet computers, desktop computers, laptop computers, notebook computers, ultra-mobile personal computers (UMPCs), handheld computers, netbooks, personal digital assistants (PDAs), wearable electronic devices, virtual reality devices, and the like.
  • the "at least one” involved in the embodiments of the present application refers to one or more, and the “plurality” refers to two or more.
  • “And/or”, which describes the association relationship of the associated objects, indicates that there can be three kinds of relationships, for example, A and/or B, which can indicate: the existence of A alone, the existence of A and B at the same time, and the existence of B alone, where A, B can be singular or plural.
  • the character “/” generally indicates that the associated objects are an “or” relationship.
  • "At least one item(s) below” or similar expressions thereof refer to any combination of these items, including any combination of single item(s) or plural items(s). For example, at least one item (a) of a, b, or c may represent: a, b, c, ab, ac, bc, or abc, where a, b, and c may be single or multiple .
  • the ordinal numbers such as “first” and “second” mentioned in the embodiments of the present application are used to distinguish multiple objects, and are not used to limit the order, sequence, priority or priority of multiple objects. Importance.
  • the first information and the second information are only for distinguishing different information, and do not indicate the difference in content, priority, transmission order, or importance of the two kinds of information.
  • Instant messaging refers to services that can instantly send and receive Internet messages. Users can send and receive messages through instant messaging tools, and a conversation list including avatars and texts is a common composition pattern in instant messaging tools, and is used to carry each message window.
  • the avatar of the chat group in the instant messaging tool is composed of the user avatars of the users in the chat group, and the composition method of the user avatars in the chat group is usually determined according to the order in which the users in the chat group join.
  • the instant communication tool may be, for example, WeChat and the like.
  • FIG. 1 is a schematic diagram of a user interface of an instant messaging tool.
  • the user interface 100 includes conversation lists of chat groups and conversation lists of individual chats, wherein the avatar of a chat group is composed of the user avatars of the users in the chat group, and the avatar of an individual chat is the user avatar of the user being chatted with individually.
  • the user interface includes many conversation lists, including the conversation lists of chat group 1, chat group 2, and chat group 3, as well as the conversation lists of the individual chats with user 1, user 2, user 3, and user 4.
  • the avatar of the chat group can only be used to quickly distinguish the chat group from the individual chat, and the avatars of the chat group cannot be used for quick identification among multiple chat groups. In this way, the recognition efficiency between chat groups is low, and the user experience is poor.
  • FIG. 2 shows a schematic structural diagram of an electronic device 200 .
  • the electronic device 200 may include a processor 210, an external memory interface 220, an internal memory 221, a universal serial bus (USB) interface 230, a charging management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, a headphone jack 270D, a sensor module 280, buttons 290, a motor 291, an indicator 292, a camera 293, a display screen 294, a subscriber identification module (SIM) card interface 295, and so on.
  • the sensor module 280 may include a pressure sensor 280A, a gyroscope sensor 280B, an air pressure sensor 280C, a magnetic sensor 280D, an acceleration sensor 280E, a distance sensor 280F, a proximity light sensor 280G, a fingerprint sensor 280H, a temperature sensor 280J, a touch sensor 280K, an ambient light sensor 280L, a bone conduction sensor 280M, and the like.
  • the structures illustrated in the embodiments of the present invention do not constitute a specific limitation on the electronic device 200 .
  • the electronic device 200 may include more or less components than shown, or combine some components, or separate some components, or arrange different components.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 210 may include one or more processing units; for example, the processor 210 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 210 for storing instructions and data.
  • the memory in processor 210 is cache memory.
  • the memory may hold instructions or data that have just been used or recycled by the processor 210 . If the processor 210 needs to use the instruction or data again, it can be called directly from the memory. Repeated accesses are avoided, and the waiting time of the processor 210 is reduced, thereby improving the efficiency of the system.
  • the processor 210 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 210 may contain multiple sets of I2C buses.
  • the processor 210 can be respectively coupled to the touch sensor 280K, the charger, the flash, the camera 293 and the like through different I2C bus interfaces.
  • the processor 210 can couple the touch sensor 280K through the I2C interface, so that the processor 210 communicates with the touch sensor 280K through the I2C bus interface, so as to realize the touch function of the electronic device 200 .
  • the I2S interface can be used for audio communication.
  • the processor 210 may contain multiple sets of I2S buses.
  • the processor 210 may be coupled with the audio module 270 through an I2S bus to implement communication between the processor 210 and the audio module 270.
  • the audio module 270 can transmit audio signals to the wireless communication module 260 through the I2S interface, so as to realize the function of answering calls through the Bluetooth headset.
  • the PCM interface can also be used for audio communications, sampling, quantizing and encoding analog signals.
  • the audio module 270 and the wireless communication module 260 may be coupled through a PCM bus interface.
  • the audio module 270 can also transmit audio signals to the wireless communication module 260 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is typically used to connect the processor 210 with the wireless communication module 260 .
  • the processor 210 communicates with the Bluetooth module in the wireless communication module 260 through the UART interface to implement the Bluetooth function.
  • the audio module 270 can transmit audio signals to the wireless communication module 260 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 210 with peripheral devices such as the display screen 294 and the camera 293 .
  • MIPI interfaces include camera serial interface (CSI), display serial interface (DSI), etc.
  • the processor 210 communicates with the camera 293 through a CSI interface, so as to implement the photographing function of the electronic device 200 .
  • the processor 210 communicates with the display screen 294 through the DSI interface to implement the display function of the electronic device 200 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface may be used to connect the processor 210 with the camera 293, the display screen 294, the wireless communication module 260, the audio module 270, the sensor module 280, and the like.
  • the GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 230 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 230 can be used to connect a charger to charge the electronic device 200, and can also be used to transmit data between the electronic device 200 and peripheral devices. It can also be used to connect headphones to play audio through the headphones.
  • the interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiment of the present invention is only a schematic illustration, and does not constitute a structural limitation of the electronic device 200 .
  • the electronic device 200 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 240 is used to receive charging input from the charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 240 may receive charging input from the wired charger through the USB interface 230 .
  • the charging management module 240 may receive wireless charging input through the wireless charging coil of the electronic device 200 . While the charging management module 240 charges the battery 242 , the power management module 241 can also supply power to the electronic device.
  • the power management module 241 is used to connect the battery 242 , the charging management module 240 and the processor 210 .
  • the power management module 241 receives input from the battery 242 and/or the charging management module 240, and supplies power to the processor 210, the internal memory 221, the display screen 294, the camera 293, and the wireless communication module 260.
  • the power management module 241 can also be used to monitor parameters such as battery capacity, battery cycle times, battery health status (leakage, impedance).
  • the power management module 241 may also be provided in the processor 210 .
  • the power management module 241 and the charging management module 240 may also be provided in the same device.
  • the wireless communication function of the electronic device 200 may be implemented by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 200 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 250 may provide a wireless communication solution including 2G/3G/4G/5G, etc. applied on the electronic device 200 .
  • the mobile communication module 250 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and the like.
  • the mobile communication module 250 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 250 can also amplify the signal modulated by the modulation and demodulation processor, and then convert it into electromagnetic waves for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 250 may be provided in the processor 210 .
  • at least part of the functional modules of the mobile communication module 250 may be provided in the same device as at least part of the modules of the processor 210 .
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs sound signals through audio devices (not limited to the speaker 270A, the receiver 270B, etc.), or displays images or videos through the display screen 294 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 210, and may be provided in the same device as the mobile communication module 250 or other functional modules.
  • the wireless communication module 260 can provide wireless communication solutions applied on the electronic device 200, including wireless local area network (WLAN) (such as wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and the like.
  • the wireless communication module 260 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 260 receives electromagnetic waves via the antenna 2 , modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 210 .
  • the wireless communication module 260 can also receive the signal to be sent from the processor 210 , perform frequency modulation on the signal, amplify the signal, and then convert it into an electromagnetic wave for radiation through the antenna 2 .
  • the antenna 1 of the electronic device 200 is coupled with the mobile communication module 250, and the antenna 2 is coupled with the wireless communication module 260, so that the electronic device 200 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
  • the electronic device 200 implements a display function through a GPU, a display screen 294, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 294 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 210 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 294 is used to display images, videos, and the like.
  • Display screen 294 includes a display panel.
  • the display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
  • the electronic device 200 may include one or N display screens 294 , where N is a positive integer greater than one.
  • the electronic device 200 can realize the shooting function through the ISP, the camera 293, the video codec, the GPU, the display screen 294 and the application processor.
  • the ISP is used to process the data fed back by the camera 293 .
  • when the shutter is opened, light is transmitted to the camera photosensitive element through the lens, and the light signal is converted into an electrical signal; the camera photosensitive element transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 293 .
  • Camera 293 is used to capture still images or video.
  • the object is projected through the lens to generate an optical image onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
  • the electronic device 200 may include 1 or N cameras 293 , where N is a positive integer greater than 1.
  • a digital signal processor is used to process digital signals, in addition to processing digital image signals, it can also process other digital signals. For example, when the electronic device 200 selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy, and the like.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 200 may support one or more video codecs.
  • the electronic device 200 can play or record videos in various encoding formats, such as: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 200 can be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 220 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 200.
  • the external memory card communicates with the processor 210 through the external memory interface 220 to realize the data storage function, for example, to save files such as music and videos in the external memory card.
  • Internal memory 221 may be used to store computer executable program code, which includes instructions.
  • the internal memory 221 may include a storage program area and a storage data area.
  • the program storage area may store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area can store data (such as audio data, phone book, etc.) created during the use of the electronic device 200 and the like.
  • the internal memory 221 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the processor 210 executes various functional applications and data processing of the electronic device 200 by executing instructions stored in the internal memory 221 and/or instructions stored in a memory provided in the processor.
  • the electronic device 200 may implement audio functions through an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, an earphone interface 270D, and an application processor. Such as music playback, recording, etc.
  • the audio module 270 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 270 may also be used to encode and decode audio signals. In some embodiments, the audio module 270 may be provided in the processor 210 , or some functional modules of the audio module 270 may be provided in the processor 210 .
  • Speaker 270A also referred to as a "speaker" is used to convert audio electrical signals into sound signals.
  • the electronic device 200 can listen to music through the speaker 270A, or listen to a hands-free call.
  • the receiver 270B also referred to as an "earpiece" is used to convert audio electrical signals into sound signals.
  • the voice can be answered by placing the receiver 270B close to the human ear.
  • the microphone 270C, also called a "mic" or "mike", is used to convert sound signals into electrical signals.
  • the user can make a sound by approaching the microphone 270C through the human mouth, and input the sound signal into the microphone 270C.
  • the electronic device 200 may be provided with at least one microphone 270C. In other embodiments, the electronic device 200 may be provided with two microphones 270C, which may implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 200 may further be provided with three, four or more microphones 270C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
  • the headphone jack 270D is used to connect wired headphones.
  • the earphone interface 270D may be the USB interface 230, or may be a 3.5mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 280A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 280A may be provided on the display screen 294 .
  • the capacitive pressure sensor may be comprised of at least two parallel plates of conductive material. When a force is applied to pressure sensor 280A, the capacitance between the electrodes changes.
  • the electronic device 200 determines the intensity of the pressure according to the change in capacitance. When a touch operation acts on the display screen 294, the electronic device 200 detects the intensity of the touch operation according to the pressure sensor 280A.
  • the electronic device 200 may also calculate the touched position according to the detection signal of the pressure sensor 280A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than the first pressure threshold acts on the short message application icon, the instruction for viewing the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, the instruction to create a new short message is executed.
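A hedged sketch of the pressure-threshold behaviour this paragraph describes, assuming a normalized pressure reading and hypothetical action names that are not taken from the publication:

```kotlin
// Hypothetical sketch: the same touch position triggers different actions depending on
// the measured pressure. The threshold value and the action functions are assumptions.
fun onMessageIconTouch(pressure: Float, firstPressureThreshold: Float = 0.5f) {
    if (pressure < firstPressureThreshold) {
        viewShortMessage()       // light press: execute the instruction to view the message
    } else {
        createNewShortMessage()  // firm press: execute the instruction to create a new message
    }
}

fun viewShortMessage() { /* hypothetical: open the message for reading */ }
fun createNewShortMessage() { /* hypothetical: open the compose screen */ }
```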
  • the gyro sensor 280B may be used to determine the motion attitude of the electronic device 200 .
  • the angular velocity of the electronic device 200 about three axes (i.e., the x, y, and z axes) can be determined by the gyro sensor 280B.
  • the gyro sensor 280B can be used for image stabilization.
  • the gyro sensor 280B detects the shaking angle of the electronic device 200, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to counteract the shaking of the electronic device 200 through reverse motion to achieve anti-shake.
  • the gyro sensor 280B can also be used for navigation and somatosensory game scenarios.
  • Air pressure sensor 280C is used to measure air pressure. In some embodiments, the electronic device 200 calculates the altitude through the air pressure value measured by the air pressure sensor 280C to assist in positioning and navigation.
  • Magnetic sensor 280D includes a Hall sensor.
  • the electronic device 200 can detect the opening and closing of the flip holster using the magnetic sensor 280D.
  • the electronic device 200 can detect the opening and closing of the flip according to the magnetic sensor 280D. Further, according to the detected opening and closing state of the leather case or the opening and closing state of the flip cover, characteristics such as automatic unlocking of the flip cover are set.
  • the acceleration sensor 280E can detect the magnitude of the acceleration of the electronic device 200 in various directions (generally three axes).
  • the magnitude and direction of gravity can be detected when the electronic device 200 is stationary. It can also be used to identify the posture of electronic devices, and can be used in applications such as horizontal and vertical screen switching, pedometers, etc.
  • the electronic device 200 can measure the distance through infrared or laser. In some embodiments, when shooting a scene, the electronic device 200 can use the distance sensor 280F to measure the distance to achieve fast focusing.
  • Proximity light sensor 280G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the electronic device 200 emits infrared light to the outside through the light emitting diode.
  • Electronic device 200 uses photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it may be determined that there is an object near the electronic device 200 . When insufficient reflected light is detected, the electronic device 200 may determine that there is no object near the electronic device 200 .
  • the electronic device 200 can use the proximity light sensor 280G to detect that the user holds the electronic device 200 close to the ear to talk, so as to automatically turn off the screen to save power.
  • the proximity light sensor 280G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 280L is used to sense ambient light brightness.
  • the electronic device 200 can adaptively adjust the brightness of the display screen 294 according to the perceived ambient light brightness.
  • the ambient light sensor 280L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 280L can also cooperate with the proximity light sensor 280G to detect whether the electronic device 200 is in the pocket, so as to prevent accidental touch.
  • the fingerprint sensor 280H is used to collect fingerprints.
  • the electronic device 200 can use the collected fingerprint characteristics to realize fingerprint unlocking, accessing application locks, taking pictures with fingerprints, answering incoming calls with fingerprints, and the like.
  • the temperature sensor 280J is used to detect the temperature.
  • the electronic device 200 utilizes the temperature detected by the temperature sensor 280J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 280J exceeds a threshold value, the electronic device 200 reduces the performance of the processor located near the temperature sensor 280J in order to reduce power consumption and implement thermal protection.
  • the electronic device 200 heats the battery 242 to avoid abnormal shutdown of the electronic device 200 caused by the low temperature.
  • the electronic device 200 boosts the output voltage of the battery 242 to avoid abnormal shutdown caused by low temperature.
  • the touch sensor 280K is also called “touch device”.
  • the touch sensor 280K may be disposed on the display screen 294, and the touch sensor 280K and the display screen 294 form a touch screen, also called a "touch screen”.
  • the touch sensor 280K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 294 .
  • the touch sensor 280K may also be disposed on the surface of the electronic device 200 , which is different from the location where the display screen 294 is located.
  • the bone conduction sensor 280M can acquire vibration signals.
  • the bone conduction sensor 280M may acquire vibration signals of the vibrating bone mass of the human voice.
  • the bone conduction sensor 280M can also contact the pulse of the human body and receive the blood pressure beating signal.
  • the bone conduction sensor 280M can also be disposed in the earphone, combined with the bone conduction earphone.
  • the audio module 270 can analyze the voice signal based on the vibration signal of the vocal vibration bone block obtained by the bone conduction sensor 280M, so as to realize the voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beat signal obtained by the bone conduction sensor 280M, so as to realize the function of heart rate detection.
  • the keys 290 include a power key, a volume key, and the like. The keys 290 may be mechanical keys or touch keys.
  • the electronic device 200 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 200 .
  • Motor 291 can generate vibrating cues.
  • the motor 291 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
  • touch operations acting on different applications can correspond to different vibration feedback effects.
  • the motor 291 can also correspond to different vibration feedback effects for touch operations on different areas of the display screen 294 .
  • different application scenarios (for example, time reminders, receiving messages, alarm clocks, and games) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 292 can be an indicator light, which can be used to indicate the charging status, the change of power, and can also be used to indicate messages, missed calls, notifications, and the like.
  • the SIM card interface 295 is used to connect a SIM card.
  • the SIM card can be contacted and separated from the electronic device 200 by inserting into the SIM card interface 295 or pulling out from the SIM card interface 295 .
  • the electronic device 200 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 295 can support Nano SIM card, Micro SIM card, SIM card and so on.
  • the same SIM card interface 295 can insert multiple cards at the same time.
  • the types of the plurality of cards may be the same or different.
  • the SIM card interface 295 can also be compatible with different types of SIM cards.
  • the SIM card interface 295 is also compatible with external memory cards.
  • the electronic device 200 interacts with the network through the SIM card to realize functions such as call and data communication.
  • the electronic device 200 employs an eSIM, ie: an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 200 and cannot be separated from the electronic device 200.
  • the software system of the electronic device 200 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiment of the present invention takes an Android system with a layered architecture as an example to illustrate the software structure of the electronic device 200.
  • FIG. 3 is a block diagram of a software structure of an electronic device 200 according to an embodiment of the present invention.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, an Android runtime (Android runtime) and a system library, and a kernel layer.
  • the application layer can include a series of application packages.
  • the application package can include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message and so on.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include window managers, content providers, view systems, telephony managers, resource managers, notification managers, and the like.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, etc.
  • Content providers are used to store and retrieve data and make these data accessible to applications.
  • the data may include video, images, audio, calls made and received, browsing history and bookmarks, phone book, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on. View systems can be used to build applications.
  • a display interface can consist of one or more views.
  • the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
  • the phone manager is used to provide the communication function of the electronic device 200 .
  • for example, the management of call status (including connecting, hanging up, etc.).
  • the resource manager provides various resources for the application, such as localization strings, icons, pictures, layout files, video files and so on.
  • the notification manager enables applications to display notification information in the status bar, which can be used to convey notification-type messages, and can disappear automatically after a brief pause without user interaction. For example, the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also display notifications in the status bar at the top of the system in the form of graphs or scroll bar text, such as notifications of applications running in the background, and notifications on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a prompt sound is issued, the electronic device vibrates, and the indicator light flashes.
  • Android Runtime includes core libraries and a virtual machine. Android runtime is responsible for scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functional functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, safety and exception management, and garbage collection.
  • a system library can include multiple functional modules. For example: surface manager (surface manager), media library (Media Libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the Surface Manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display drivers, camera drivers, audio drivers, and sensor drivers.
  • when a touch operation is received by the touch sensor 280K, a corresponding hardware interrupt is sent to the kernel layer.
  • the kernel layer processes touch operations into raw input events (including touch coordinates, timestamps of touch operations, etc.). Raw input events are stored at the kernel layer.
  • the application framework layer obtains the original input event from the kernel layer and identifies the control corresponding to the input event. Taking the touch operation being a tap operation and the control corresponding to the tap operation being the camera application icon as an example, the camera application calls the interface of the application framework layer to start the camera application, and then starts the camera driver by calling the kernel layer.
  • the camera 293 captures still images or video.
  • FIG. 4 is a schematic diagram of a method for displaying an avatar according to an embodiment of the present application.
  • the avatar display method is applied to an electronic device, and the electronic device may be the electronic device 200 shown in FIG. 2 , and the avatar display method includes:
  • S401. Acquire the activity of a first user of a chat group in an instant messaging tool in the chat group.
  • the activity of the first user in the chat group includes the frequency with which the first user sends messages in the chat group and/or the frequency of interaction between the first user and the second user in the chat group, wherein the second user is an application user of the instant messaging tool.
  • the electronic device may acquire the activity level of the first user in the chat group within a preset time period in the chat group.
  • the preset time period may be preset, for example, it may be a week or a month, and the electronic device automatically acquires the activity of the first user in the chat group when the preset time period arrives.
  • the first user may be any one or more users in the chat group, including the application user of the instant messaging tool.
  • when the activity of the first user in the chat group is the interaction frequency between the first user and the second user in the chat group, the second user is the application user of the instant messaging tool, and the first user is one or more users in the chat group other than the second user.
  • S402. Determine, according to the activity degree, the size of the display area of the user avatar of the first user in the chat group.
  • when the activity of the first user in the chat group is the highest, it is determined that the display area of the user avatar of the first user in the chat group is the largest. When the activity of the first user in the chat group is not the highest, it is determined that the display area of the user avatar of the first user in the chat group and the display area of the user avatar of a third user in the chat group are the same size, wherein the third user is another user in the chat group other than the user with the highest activity.
  • in this way, when the activity of the first user in the chat group is the highest, the first user can be distinguished from the other users in the chat group. When the activity of the first user in the chat group is not the highest, the user with the highest activity has the largest display area of the user avatar in the chat group, and the display areas of the user avatars of the other users are the same size, so that the most active user in the chat group can be distinguished from the other users.
  • alternatively, the display area of the user avatar in the chat group is adjusted according to the activity, wherein the higher the activity, the larger the display area. In this way, not only can the most active user in the chat group be distinguished from the other users, but the other users in the chat group can also be distinguished from one another. The sizing rules of both implementations are illustrated in the sketch below.
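  • the two sizing implementations above can be expressed compactly in code. The following Java sketch is purely illustrative and is not taken from the patent: the ChatGroupMember type, its field names, and the pixel values are assumptions introduced here for the example.

        import java.util.List;

        // Hypothetical member record: "activity" is the activity obtained for the preset
        // time period; "avatarSizePx" is the edge length of the display area to render.
        class ChatGroupMember {
            String userId;
            double activity;
            int avatarSizePx;

            ChatGroupMember(String userId, double activity) {
                this.userId = userId;
                this.activity = activity;
            }
        }

        class AvatarSizing {
            // Implementation 1: the most active member gets the largest avatar,
            // and all other members share the same, smaller size.
            static void sizeLargestVersusEqual(List<ChatGroupMember> members,
                                               int largestPx, int normalPx) {
                double maxActivity = members.stream().mapToDouble(m -> m.activity).max().orElse(0);
                for (ChatGroupMember m : members) {
                    m.avatarSizePx = (m.activity == maxActivity) ? largestPx : normalPx;
                }
            }

            // Implementation 2: the display area grows with activity; members with equal
            // activity get equal sizes (linear interpolation between minPx and maxPx).
            static void sizeProportional(List<ChatGroupMember> members, int minPx, int maxPx) {
                double max = members.stream().mapToDouble(m -> m.activity).max().orElse(1);
                double min = members.stream().mapToDouble(m -> m.activity).min().orElse(0);
                double span = Math.max(max - min, 1e-9);   // avoid division by zero when all are equal
                for (ChatGroupMember m : members) {
                    double ratio = (m.activity - min) / span;   // 0.0 .. 1.0
                    m.avatarSizePx = (int) Math.round(minPx + ratio * (maxPx - minPx));
                }
            }
        }

  • in this sketch, sizeLargestVersusEqual corresponds to the first implementation (one largest avatar, the rest equal), and sizeProportional corresponds to the second implementation (a larger display area for a higher activity).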
  • according to the size of the display area, the user avatar of the first user in the chat group is displayed. That is, for the application user of the instant messaging tool, on the user interface of the instant messaging tool, the user with the largest display area of the user avatar in the chat group is the user with the highest activity in the chat group. For the same chat group, on the user interface of the instant messaging tool of every user in the chat group, the user with the largest display area of the user avatar in the chat group is the most active user in the chat group. For different chat groups, the most active users may be different. Therefore, on the user interface of the instant messaging tool, the users with the largest display area of the user avatars in different chat groups are also different, so different chat groups can be quickly distinguished.
  • when the activity of the first user in the chat group is the frequency with which the first user sends messages in the chat group, for the same chat group, on the user interface of the instant messaging tool of every user in the chat group, the user with the largest display area of the user avatar in the chat group is the same user, namely the user with the highest frequency of sending messages in the chat group.
  • when the activity of the first user in the chat group is the interaction frequency between the first user and the second user in the chat group, the second user is the application user of the instant messaging tool. That is, for the same chat group, on the user interface of the instant messaging tool of one of the users in the chat group, the user with the largest display area of the user avatar in the chat group is the user with the highest interaction frequency with that user. For different users in the chat group, the user with the highest interaction frequency may be different, so on the user interfaces of the instant messaging tools of the users in the chat group, the users with the largest display area of the user avatar in the chat group may also be different.
  • the activity degree of the user in the chat group is obtained, and according to the activity degree, the display area size of the user's avatar in the chat group is determined, thereby realizing intelligent adjustment of the display area size of the user avatar in the chat group.
  • when the instant messaging tool has multiple chat groups, the sizes of the display areas of the user avatars determined according to the activity are different in different chat groups, and the avatars of the chat groups differ significantly. Users can use the avatars of the chat groups to quickly identify the chat group they are looking for. In this way, the identification efficiency between chat groups is improved, and the user experience is improved.
  • FIG. 5 is a schematic diagram of a user interface of an instant messaging tool provided by an embodiment of the present application.
  • the user interface 500 includes a conversation list of chat groups and a conversation list of individual chats.
  • the user interface includes many conversation lists, specifically including the conversation lists of chat group 1 and chat group 2, and the conversation lists of individual chats with user 1, user 2, user 3, user 4 and user 5.
  • the avatar of chat group 1 is composed of the user avatars of 6 users; the user avatar of one user has the largest display area, the user avatars of the other five users have display areas of the same size, and the user whose user avatar has the largest display area is the most active user in chat group 1.
  • the avatar of chat group 2 consists of the user avatars of 3 users, and the display area of the user avatar of one user is the largest, and the display area of the user avatars of the other two users is the same size.
  • the user with the largest display area of the user avatar is the most active user in chat group 2.
  • the avatar of the chat group can not only be used to quickly distinguish the chat group from an individual chat, but can also be used to quickly distinguish between chat group 1 and chat group 2. In this way, the identification efficiency between chat groups is improved, and the user experience is improved.
  • FIG. 6a is a schematic diagram of a user avatar of a user in a chat group provided by an embodiment of the present application, and FIG. 6b is another schematic diagram of a user avatar of a user in a chat group provided by an embodiment of the present application. As shown in FIG. 6a and FIG. 6b, the avatar of the chat group is composed of the user avatars of user A, user B, user C, and user D. Among them, user A has the highest activity in the chat group, user B's activity in the chat group ranks second, and user C and user D have the same activity in the chat group. In FIG. 6a, user A's user avatar in the chat group has the largest display area.
  • in FIG. 6b, user A's user avatar in the chat group has the largest display area, and the display area of user B's user avatar in the chat group is smaller than the display area of user A's user avatar in the chat group, but is larger than the display areas of the user avatars of user C and user D in the chat group.
  • FIG. 7 is a schematic diagram of another avatar display method provided by an embodiment of the present application.
  • the avatar display method is applied to an electronic device, and the electronic device may be the electronic device 200 shown in FIG. 2 , and the avatar display method includes:
  • the electronic device can acquire the frequency of sending messages in the chat group by the first user in the chat group within a preset time period.
  • the preset time period may be preset, for example, may be a week or a month, and the electronic device automatically obtains the frequency of the first user sending messages in the chat group when the preset time period arrives.
  • the first user may be any one or more users in the chat group, including an application user of an instant messaging tool.
  • S702. Determine the size of the display area of the user avatar of the first user in the chat group according to the frequency of the first user sending messages in the chat group.
  • when the frequency of the first user sending messages in the chat group is the highest, it is determined that the display area of the user avatar of the first user in the chat group is the largest. When the frequency of the first user sending messages in the chat group is not the highest, it is determined that the display area of the user avatar of the first user in the chat group and the display area of the user avatar of a third user in the chat group are the same size, wherein the third user is another user in the chat group other than the user with the highest frequency of sending messages.
  • for example, a chat group includes user A, user B, and user C, where user A sends messages most frequently in the chat group, and user B and user C send messages at different frequencies that are both lower than user A's. Then user A's user avatar in the chat group has the largest display area, and regardless of whether user B sends messages in the chat group more frequently than user C, the display areas of user B's and user C's user avatars in the chat group are the same size.
  • alternatively, when the frequency of the first user sending messages in the chat group is the highest, it is determined that the display area of the user avatar of the first user in the chat group is the largest. When the frequency of the first user sending messages in the chat group is not the highest, it is determined that the display area of the user avatar of the first user in the chat group is larger than the display area of the user avatar of a fourth user in the chat group, wherein the fourth user is a user whose frequency of sending messages in the chat group is lower than the frequency of sending messages in the chat group by the first user.
  • for example, a chat group includes user A, user B, and user C, where user A sends messages most frequently in the chat group, user B and user C send messages in the chat group less frequently than user A, and user B sends messages more frequently than user C in the chat group. Then the display area of user A's user avatar in the chat group is the largest, and the display area of user B's user avatar in the chat group is smaller than the display area of user A's user avatar in the chat group, but larger than the display area of user C's user avatar in the chat group.
  • in this way, when the frequency of the first user sending messages in the chat group is the highest, the display area of the user avatar of the first user in the chat group is the largest, so that the first user can be distinguished from the other users in the chat group. When the frequency of the first user sending messages in the chat group is not the highest, the user with the highest frequency of sending messages has the largest display area of the user avatar in the chat group, and the display areas of the user avatars of the other users are the same size, so that the user who sends messages most frequently in the chat group can be distinguished from the other users.
  • alternatively, the display area of the user avatar in the chat group is sized according to the frequency of sending messages, wherein the higher the frequency, the larger the display area. In this way, not only can the user who sends messages most frequently in the chat group be distinguished from the other users, but the other users in the chat group can also be distinguished from one another.
  • the user avatar of the first user in the chat group is displayed. That is, for the application user of the instant messaging tool, on the user interface of the instant messaging tool, the user with the largest display area of the user's avatar in the chat group is the user with the highest frequency of sending messages in the chat group.
  • for the same chat group, on the user interface of the instant messaging tool of every user in the chat group, the user with the largest display area of the user avatar in the chat group is the same user, namely the user with the highest frequency of sending messages in the chat group. For different chat groups, the users with the highest frequency of sending messages may be different. Therefore, on the user interface of the instant messaging tool, the users with the largest display area of the user avatars in different chat groups are also different, so different chat groups can be quickly distinguished.
  • S704. Determine the arrangement order of the user avatars of the first user in the chat group according to the frequency of the first user sending messages in the chat group.
  • when the frequency of the first user sending messages in the chat group is the highest, it is determined that the user avatar of the first user in the chat group is arranged first. When the frequency of the first user sending messages in the chat group is not the highest, the relative arrangement order of the user avatar of the first user in the chat group and the user avatar of a third user in the chat group is not limited, wherein the third user is another user in the chat group other than the user with the highest frequency of sending messages.
  • for example, a chat group includes user A, user B, and user C, where user A sends messages most frequently in the chat group, and user B and user C send messages at different frequencies that are both lower than user A's. Then user A's user avatar in the chat group is arranged first, and regardless of whether user B sends messages in the chat group more frequently than user C, user B's user avatar in the chat group may be arranged either before or after user C's user avatar in the chat group.
  • alternatively, when the frequency of the first user sending messages in the chat group is the highest, it is determined that the user avatar of the first user in the chat group is arranged first. When the frequency of the first user sending messages in the chat group is not the highest, it is determined that the user avatar of the first user in the chat group is arranged before the user avatar of the fourth user in the chat group, wherein the fourth user is a user whose frequency of sending messages in the chat group is lower than that of the first user.
  • for example, a chat group includes user A, user B, and user C, where user A sends messages most frequently in the chat group, user B and user C send messages in the chat group less frequently than user A, and user B sends messages more frequently than user C in the chat group. Then user A's user avatar in the chat group is arranged first, and user B's user avatar in the chat group is arranged before user C's user avatar in the chat group.
  • according to the arrangement order, the user avatar of the first user in the chat group is displayed. That is to say, for an application user of the instant messaging tool, on the user interface of the instant messaging tool, the user whose user avatar is arranged first in the chat group is the user with the highest frequency of sending messages in the chat group.
  • for the same chat group, on the user interface of the instant messaging tool of every user in the chat group, the user whose user avatar is arranged first in the chat group is the same user, namely the user with the highest frequency of sending messages in the chat group. For different chat groups, the users with the highest frequency of sending messages may be different. Therefore, on the user interface of the instant messaging tool, the users whose user avatars are arranged first in different chat groups are also different, so different chat groups can be quickly distinguished. A sketch of this ordering rule is given below.
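  • the ordering rule can be sketched in a few lines of Java. This is an illustrative sketch only, reusing the hypothetical ChatGroupMember type from the earlier sizing example and interpreting its activity field as the frequency of sending messages; it is not the patent's implementation.

        import java.util.Comparator;
        import java.util.List;

        class AvatarOrdering {
            // Sort members by descending send frequency so that the most frequent sender
            // is arranged first. List.sort is stable, so members with equal frequency keep
            // their existing relative order, matching the "order not limited" case above.
            static void orderBySendFrequency(List<ChatGroupMember> members) {
                members.sort(
                        Comparator.comparingDouble((ChatGroupMember m) -> m.activity).reversed());
            }
        }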
  • in a possible implementation, the frequency of the first user sending messages in the chat group is compared with a preset frequency threshold. When the frequency of the first user sending messages in the chat group is not less than the preset frequency threshold, the size and/or arrangement order of the display area of the user avatar of the first user in the chat group is determined, and then the user avatar of the first user in the chat group is displayed. When the frequency of the first user sending messages in the chat group is less than the preset frequency threshold, the user avatar of the first user in the chat group is not displayed. A sketch of this filtering step follows.
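  • a minimal Java sketch of the threshold filtering, again reusing the hypothetical ChatGroupMember type and treating its activity field as the send frequency; the method name and threshold parameter are assumptions made for the example.

        import java.util.List;
        import java.util.stream.Collectors;

        class AvatarFiltering {
            // Only members whose send frequency reaches the preset threshold keep a user
            // avatar in the chat group avatar; the remaining members are not displayed.
            static List<ChatGroupMember> filterByThreshold(List<ChatGroupMember> members,
                                                           double presetFrequencyThreshold) {
                return members.stream()
                        .filter(m -> m.activity >= presetFrequencyThreshold)
                        .collect(Collectors.toList());
            }
        }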
  • the frequency of messages sent by users in the chat group is obtained, and according to the frequency of sending messages, the size and arrangement order of the display area of the user avatars in the chat group are determined, so as to realize intelligent adjustment of the size and arrangement order of the display area of the user avatars in the chat group. The sizes and arrangement orders of the user avatars determined according to the frequency of sending messages are different in different chat groups, and the avatars of the chat groups differ significantly. Users can use the avatar of a chat group to quickly identify the chat group they want to find. In this way, the identification efficiency between chat groups is improved, and the user experience is improved.
  • FIG. 8 is a schematic diagram of a user interface of another instant messaging tool provided by an embodiment of the present application.
  • the user interface 800 includes a conversation list of chat groups and a conversation list of individual chats.
  • the user interface includes many conversation lists, specifically including the conversation lists of chat group 1 and chat group 2, and the conversation lists of individual chats with user 1, user 2, user 3, user 4 and user 5.
  • the avatar of chat group 1 is composed of the user avatars of four users; the user avatar of one of the users has the largest display area and is arranged first, the display areas of the user avatars of the other three users are equal in size, and the user whose user avatar has the largest display area and is arranged first is the user with the highest frequency of sending messages in chat group 1.
  • the avatar of chat group 2 is composed of the user avatars of 6 users; the user avatar of one of the users has the largest display area and is arranged first, the display areas of the user avatars of two other users are of the same size and are arranged next, and the user avatars of the remaining three users have the smallest display areas and are arranged last. The user whose user avatar has the largest display area and is arranged first is the user with the highest frequency of sending messages in chat group 2.
  • the avatar of the chat group can not only be used to quickly distinguish the chat group from an individual chat, but can also be used to quickly distinguish between chat group 1 and chat group 2. In this way, the identification efficiency between chat groups is improved, and the user experience is improved.
  • FIG. 9 is a schematic diagram of another user avatar of a user in a chat group provided by an embodiment of the present application.
  • the avatar of the chat group is composed of the user avatars of user A, user B, user C, user D, user E and user F, wherein user A and user B send messages in the chat group with the highest frequency.
  • the display areas of the user avatars of user A and user B in the chat group are larger than the display areas of the user avatars of the other users in the chat group, and they are arranged in the front position. Although the frequencies of sending messages by user C, user D, user E and user F are not exactly the same, the display areas of their user avatars in the chat group are of equal size and they are arranged in a later position.
  • FIG. 10 is a schematic diagram of another avatar display method provided by an embodiment of the present application.
  • the avatar display method is applied to an electronic device, and the electronic device may be the electronic device 200 shown in FIG. 2 , and the avatar display method includes:
  • the second user is an application user of an instant messaging tool.
  • the electronic device can obtain the interaction frequency between the first user and the second user in the chat group within a preset time period. The preset time period may be preset, for example, it may be a week or a month, and the electronic device automatically acquires the interaction frequency of the first user and the second user in the chat group when the preset time period arrives.
  • the first user is one or more other users in the chat group except the second user.
  • the interaction frequency of the first user and the second user in the chat group includes at least one of the following: the frequency with which the second user replies to messages of the first user in the chat group, the frequency with which the second user forwards messages sent by the first user in the chat group to outside the chat group, the frequency with which the second user designates the first user to chat with in the chat group, and the frequency with which the second user quotes messages sent by the first user in the chat group.
  • the frequency of interaction between the first user and the second user outside the chat group may also be acquired, for example, the frequency of the second user sending messages to the first user alone, and the like.
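  • one way to turn these signals into a single interaction frequency is sketched below in Java. The counter fields, the equal weighting, and the per-day normalization are assumptions introduced for this illustration; the patent does not prescribe a particular formula.

        // Hypothetical per-member counters collected over the preset time period for the
        // interactions between the second user (the application user) and one first user.
        class InteractionStats {
            int repliesToMember;       // second user replied to this member's messages in the group
            int forwardsOfMemberMsgs;  // second user forwarded this member's messages outside the group
            int designatedChats;       // second user designated this member to chat with in the group
            int quotesOfMemberMsgs;    // second user quoted this member's messages in the group
            int directMessages;        // optional: messages the second user sent to this member alone

            // Equal-weight sum normalized to interactions per day over the preset period.
            double interactionFrequency(double periodInDays) {
                int total = repliesToMember + forwardsOfMemberMsgs
                        + designatedChats + quotesOfMemberMsgs + directMessages;
                return total / periodInDays;
            }
        }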
  • S1002. Determine the size of the display area of the user avatar of the first user in the chat group according to the interaction frequency of the first user and the second user in the chat group.
  • when the interaction frequency between the first user and the second user in the chat group is the highest, it is determined that the display area of the user avatar of the first user in the chat group is the largest. When the interaction frequency of the first user and the second user in the chat group is not the highest, it is determined that the display area of the user avatar of the first user in the chat group and the display area of the user avatar of a third user in the chat group are the same size, wherein the third user is a user in the chat group other than the user with the highest interaction frequency with the second user in the chat group.
  • for example, a chat group includes user A, user B, and user C, wherein user A interacts most frequently with user B in the chat group, and user B interacts most frequently with user C in the chat group. When user A is the application user of the instant messaging tool, on the user interface of user A's instant messaging tool, the display area of user B's user avatar in the chat group is the largest. When user B is the application user of the instant messaging tool, on the user interface of user B's instant messaging tool, the display area of user C's user avatar in the chat group is the largest.
  • alternatively, when the interaction frequency between the first user and the second user in the chat group is the highest, it is determined that the display area of the user avatar of the first user in the chat group is the largest. When the interaction frequency of the first user and the second user in the chat group is not the highest, it is determined that the display area of the user avatar of the first user in the chat group is larger than the display area of the user avatar of a fourth user in the chat group, wherein the fourth user is a user whose interaction frequency with the second user in the chat group is lower than the interaction frequency of the first user with the second user in the chat group.
  • a chat group includes user A, user B, user C, and user D.
  • user A interacts most frequently with user B in the chat group, and the interaction frequency between user A and user C in the chat group is higher than the interaction frequency between user A and user D in the chat group. User B interacts most frequently with user C in the chat group, and the interaction frequency of user B and user A in the chat group is higher than the interaction frequency of user B and user D in the chat group.
  • when user A is the application user of the instant messaging tool, on the user interface of user A's instant messaging tool, user B's user avatar in the chat group has the largest display area, and the display area of user C's user avatar in the chat group is larger than the display area of user D's user avatar in the chat group.
  • when user B is the application user of the instant messaging tool, on the user interface of user B's instant messaging tool, user C's user avatar in the chat group has the largest display area, and the display area of user A's user avatar in the chat group is larger than the display area of user D's user avatar in the chat group.
  • according to the size of the display area, the user avatar of the first user in the chat group is displayed. That is to say, for the application user of the instant messaging tool, on the user interface of the instant messaging tool, the user with the largest display area of the user avatar in the chat group is the user in the chat group with the highest interaction frequency with the second user. For the same chat group, on the user interface of the instant messaging tool of one of the users in the chat group, the user with the largest display area of the user avatar in the chat group is the user with the highest interaction frequency with that user.
  • for different users in the chat group, the user with the highest interaction frequency may be different, so on the user interfaces of the instant messaging tools of the users in the chat group, the users with the largest display area of the user avatar in the chat group may also be different. A sketch of this viewer-specific sizing is given below.
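  • because the sizing here depends on who is looking at the group avatar, each member can see a different member enlarged. The following Java sketch illustrates this viewer-specific sizing under the same assumptions as the earlier examples (the ChatGroupMember type is hypothetical, and the interaction frequencies are assumed to be precomputed per viewer).

        import java.util.List;
        import java.util.Map;

        class ViewerSpecificSizing {
            // "interactionWithViewer" maps a member's userId to that member's interaction
            // frequency with the viewing user (the second user). The member with the highest
            // frequency gets the largest avatar; everyone else shares the normal size.
            static void sizeForViewer(List<ChatGroupMember> members,
                                      Map<String, Double> interactionWithViewer,
                                      int largestPx, int normalPx) {
                double max = members.stream()
                        .mapToDouble(m -> interactionWithViewer.getOrDefault(m.userId, 0.0))
                        .max().orElse(0);
                for (ChatGroupMember m : members) {
                    double f = interactionWithViewer.getOrDefault(m.userId, 0.0);
                    m.avatarSizePx = (f == max) ? largestPx : normalPx;
                }
            }
        }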
  • S1004. Determine the arrangement order of the user avatars of the first user in the chat group according to the interaction frequency of the first user and the second user in the chat group.
  • when the interaction frequency between the first user and the second user in the chat group is the highest, it is determined that the user avatar of the first user in the chat group is arranged first. When the interaction frequency of the first user and the second user in the chat group is not the highest, the relative arrangement order of the user avatar of the first user in the chat group and the user avatar of a third user in the chat group is not limited, wherein the third user is a user in the chat group other than the user with the highest interaction frequency with the second user in the chat group.
  • alternatively, when the interaction frequency between the first user and the second user in the chat group is the highest, it is determined that the user avatar of the first user in the chat group is arranged first. When the interaction frequency of the first user and the second user in the chat group is not the highest, it is determined that the user avatar of the first user in the chat group is arranged before the user avatar of the fourth user in the chat group, wherein the interaction frequency of the fourth user and the second user in the chat group is lower than the interaction frequency of the first user and the second user in the chat group.
  • a chat group includes user A, user B, user C, and user D.
  • user A interacts most frequently with user B in the chat group, and the interaction frequency between user A and user C in the chat group is higher than the interaction frequency between user A and user D in the chat group. User B interacts most frequently with user C in the chat group, and the interaction frequency of user B and user A in the chat group is higher than the interaction frequency of user B and user D in the chat group.
  • when user A is the application user, on the user interface of user A's instant messaging tool, user B's user avatar in the chat group is arranged first, and user C's user avatar in the chat group is arranged before user D's user avatar in the chat group.
  • when user B is the application user, on the user interface of user B's instant messaging tool, user C's user avatar in the chat group is arranged first, and user A's user avatar in the chat group is arranged before user D's user avatar in the chat group.
  • according to the arrangement order, the user avatar of the first user in the chat group is displayed. That is to say, for an application user of the instant messaging tool, on the user interface of the instant messaging tool, the user whose user avatar is arranged first in the chat group is the user who interacts most frequently with the second user in the chat group. This viewer-specific ordering is sketched below.
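  • a short Java sketch of the viewer-specific ordering, under the same assumptions as before (hypothetical ChatGroupMember type, precomputed per-viewer interaction frequencies); it is only an illustration of the rule described above.

        import java.util.Comparator;
        import java.util.List;
        import java.util.Map;

        class ViewerSpecificOrdering {
            // Arrange the member with the highest interaction frequency with the viewer first;
            // the rest follow in descending frequency. The sort is stable, so members with
            // equal frequency keep their existing relative order.
            static void orderForViewer(List<ChatGroupMember> members,
                                       Map<String, Double> interactionWithViewer) {
                members.sort(Comparator.comparingDouble(
                        (ChatGroupMember m) -> interactionWithViewer.getOrDefault(m.userId, 0.0))
                        .reversed());
            }
        }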
  • in a possible implementation, the interaction frequency of the first user and the second user in the chat group is compared with a preset frequency threshold. When the interaction frequency of the first user and the second user in the chat group is not less than the preset frequency threshold, the size and/or arrangement order of the display area of the user avatar of the first user in the chat group is determined, and then the user avatar of the first user in the chat group is displayed. When the interaction frequency of the first user and the second user in the chat group is less than the preset frequency threshold, the user avatar of the first user in the chat group is not displayed.
  • the interaction frequency between the user in the chat group and the application user of the instant messaging tool in the chat group is obtained, and according to the interaction frequency, the size and arrangement order of the display area of the user's avatar in the chat group are determined, thereby realizing intelligent adjustment of the display area size and arrangement order of user avatars in the chat group.
  • the sizes and arrangement orders of the user avatars determined according to the interaction frequency are different in different chat groups, and the avatars of the chat groups differ significantly. Users can use the avatar of a chat group to quickly identify the chat group they need to find. In this way, the identification efficiency between chat groups is improved, and the user experience is improved.
  • FIG. 11a is a schematic diagram of another user avatar of a user in a chat group provided by an embodiment of the present application, and FIG. 11b is another schematic diagram of a user avatar of a user in a chat group provided by an embodiment of the present application.
  • the avatars of the chat group are composed of user avatars of user A, user B, user C, and user D.
  • among them, user A interacts most frequently with user B in the chat group, and user A's interaction frequency with user C in the chat group is higher than the interaction frequency between user A and user D in the chat group. User B interacts most frequently with user C in the chat group, and the interaction frequency of user B and user A in the chat group is higher than the interaction frequency of user B and user D in the chat group.
  • when user A is the application user of the instant messaging tool, on the user interface of user A's instant messaging tool, user B's user avatar in the chat group has the largest display area, and the display areas of the user avatars of the other users in the chat group are the same size.
  • Embodiments of the present application also provide a computer-readable storage medium. All or part of the processes in the above method embodiments may be completed by a computer program instructing relevant hardware, the program may be stored in the above computer storage medium, and when executed, the program may include the processes in the above method embodiments.
  • the computer-readable storage medium includes: read-only memory (ROM) or random access memory (random access memory, RAM), magnetic disk or optical disk and other media that can store program codes.
  • the computer program product includes one or more computer instructions.
  • the computer may be a general purpose computer, special purpose computer, computer network, or other programmable device.
  • the computer instructions may be stored in or transmitted over a computer-readable storage medium.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server, data center, etc. that includes an integration of one or more available media.
  • the usable media may be magnetic media (eg, floppy disks, hard disks, magnetic tapes), optical media (eg, DVDs), or semiconductor media (eg, solid state disks (SSDs)), and the like.
  • the modules in the apparatus of the embodiment of the present application may be combined, divided and deleted according to actual needs.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses an avatar display method and an electronic device. The avatar display method includes: obtaining, in a chat group of an instant messaging tool, an activity level of a first user in the chat group; determining, according to the activity level, a display area size of a user avatar of the first user in the chat group; and displaying, according to the display area size, the user avatar of the first user in the chat group. In this way, when the instant messaging tool has multiple chat groups, different chat groups have different user-avatar display area sizes determined according to the activity levels, and the avatars of the chat groups differ significantly. A user can use the avatars of the chat groups to quickly identify the chat group to be found, which improves the recognition efficiency between chat groups and improves the user experience.
PCT/CN2021/125916 2020-08-31 2021-10-22 Procédé d'affichage d'image de profil et dispositif électronique WO2022042774A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010899499.X 2020-08-31
CN202010899499.XA CN114205318B (zh) 2020-08-31 2020-08-31 头像显示方法及电子设备

Publications (1)

Publication Number Publication Date
WO2022042774A1 true WO2022042774A1 (fr) 2022-03-03

Family

ID=80352711

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/125916 WO2022042774A1 (fr) 2020-08-31 2021-10-22 Procédé d'affichage d'image de profil et dispositif électronique

Country Status (2)

Country Link
CN (1) CN114205318B (fr)
WO (1) WO2022042774A1 (fr)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090254859A1 (en) * 2008-04-03 2009-10-08 Nokia Corporation Automated selection of avatar characteristics for groups
US20110271209A1 (en) * 2010-04-30 2011-11-03 American Teleconferncing Services Ltd. Systems, Methods, and Computer Programs for Providing a Conference User Interface
CN105681057A (zh) * 2016-02-22 2016-06-15 北京橙鑫数据科技有限公司 群头像更新方法及装置
CN105991405A (zh) * 2015-02-11 2016-10-05 腾讯科技(深圳)有限公司 一种即时通信的建立方法及装置
CN106209574A (zh) * 2016-06-17 2016-12-07 广州爱九游信息技术有限公司 基于即时通信软件的群头像显示系统、方法及电子设备
CN106534485A (zh) * 2016-10-12 2017-03-22 乐视控股(北京)有限公司 群组头像的设置方法及装置
CN108173742A (zh) * 2017-12-08 2018-06-15 腾讯科技(深圳)有限公司 一种图像数据处理方法、装置
CN109656656A (zh) * 2018-12-10 2019-04-19 上海掌门科技有限公司 用于生成群聊头像的方法和设备

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6772195B1 (en) * 1999-10-29 2004-08-03 Electronic Arts, Inc. Chat clusters for a virtual world application
US7433895B2 (en) * 2005-06-24 2008-10-07 Microsoft Corporation Adding dominant media elements to search results
US9619100B2 (en) * 2010-08-30 2017-04-11 Nokia Technologies Oy Method, apparatus, and computer program product for adapting a content segment based on an importance level
CN105430473A (zh) * 2015-11-30 2016-03-23 天脉聚源(北京)科技有限公司 显示支持者头像的方法和装置
CN105787982B (zh) * 2016-02-29 2018-11-09 腾讯科技(北京)有限公司 一种制作电子书的方法和装置
CN108196751A (zh) * 2018-01-08 2018-06-22 深圳天珑无线科技有限公司 群聊头像的更新方法、终端和计算机可读存储介质
CN110634168B (zh) * 2018-06-21 2023-09-12 钉钉控股(开曼)有限公司 群头像的生成方法及装置
CN110691027A (zh) * 2019-08-29 2020-01-14 维沃移动通信有限公司 一种信息处理方法、装置、电子设备及介质

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090254859A1 (en) * 2008-04-03 2009-10-08 Nokia Corporation Automated selection of avatar characteristics for groups
US20110271209A1 (en) * 2010-04-30 2011-11-03 American Teleconferncing Services Ltd. Systems, Methods, and Computer Programs for Providing a Conference User Interface
CN105991405A (zh) * 2015-02-11 2016-10-05 腾讯科技(深圳)有限公司 一种即时通信的建立方法及装置
CN105681057A (zh) * 2016-02-22 2016-06-15 北京橙鑫数据科技有限公司 群头像更新方法及装置
CN106209574A (zh) * 2016-06-17 2016-12-07 广州爱九游信息技术有限公司 基于即时通信软件的群头像显示系统、方法及电子设备
CN106534485A (zh) * 2016-10-12 2017-03-22 乐视控股(北京)有限公司 群组头像的设置方法及装置
CN108173742A (zh) * 2017-12-08 2018-06-15 腾讯科技(深圳)有限公司 一种图像数据处理方法、装置
CN109656656A (zh) * 2018-12-10 2019-04-19 上海掌门科技有限公司 用于生成群聊头像的方法和设备

Also Published As

Publication number Publication date
CN114205318B (zh) 2023-12-08
CN114205318A (zh) 2022-03-18

Similar Documents

Publication Publication Date Title
WO2020259452A1 (fr) Procédé d'affichage plein écran pour terminal mobile et appareil
US11567623B2 (en) Displaying interfaces in different display areas based on activities
WO2021017889A1 (fr) Procédé d'affichage d'appel vidéo appliqué à un dispositif électronique et appareil associé
CN110114747B (zh) 一种通知处理方法及电子设备
WO2021213164A1 (fr) Procédé d'interaction entre des interfaces d'application, dispositif électronique et support de stockage lisible par ordinateur
WO2021000807A1 (fr) Procédé et appareil de traitement pour un scénario d'attente dans une application
WO2020134869A1 (fr) Procédé de fonctionnement d'un dispositif électronique et dispositif électronique
CN110825469A (zh) 语音助手显示方法及装置
WO2021036770A1 (fr) Procédé de traitement d'écran partagé et dispositif terminal
US20230351048A1 (en) Application Permission Management Method and Apparatus, and Electronic Device
WO2022033320A1 (fr) Procédé de communication bluetooth, équipement terminal et support d'enregistrement lisible par ordinateur
US11775135B2 (en) Application icon displaying method and terminal
WO2021052139A1 (fr) Procédé d'entrée de geste et dispositif électronique
WO2022037726A1 (fr) Procédé d'affichage à écran partagé et dispositif électronique
WO2022042770A1 (fr) Procédé de commande d'état de service de communication, dispositif terminal et support de stockage lisible
WO2022001258A1 (fr) Procédé et appareil d'affichage à écrans multiples, dispositif terminal et support de stockage
WO2022143180A1 (fr) Procédé d'affichage collaboratif, dispositif terminal et support de stockage lisible par ordinateur
US20240098354A1 (en) Connection establishment method and electronic device
CN114115770A (zh) 显示控制的方法及相关装置
CN114995715B (zh) 悬浮球的控制方法和相关装置
WO2024045801A1 (fr) Procédé de capture d'écran, dispositif électronique, support et produit programme
WO2022166435A1 (fr) Procédé de partage d'image et dispositif électronique
WO2022048453A1 (fr) Procédé de déverrouillage et dispositif électronique
WO2021052388A1 (fr) Procédé de communication vidéo et appareil de communication vidéo
WO2022062902A1 (fr) Procédé de transfert de fichier et dispositif électronique

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21860606

Country of ref document: EP

Kind code of ref document: A1