CN112433996A - Data processing method and device and electronic equipment - Google Patents

Data processing method and device and electronic equipment

Info

Publication number
CN112433996A
CN112433996A
Authority
CN
China
Prior art keywords
target
data
user
level
compression
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011319628.XA
Other languages
Chinese (zh)
Inventor
郑坤 (Zheng Kun)
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202011319628.XA
Publication of CN112433996A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/10 File systems; File servers
    • G06F16/17 Details of further file system functions
    • G06F16/1737 Details of further file system functions for reducing power consumption or coping with limited storage space, e.g. in mobile devices
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F15/00 Digital computers in general; Data processing equipment in general
    • G06F15/76 Architectures of general purpose stored program computers
    • G06F15/78 Architectures of general purpose stored program computers comprising a single central processing unit
    • G06F15/7807 System on chip, i.e. computer system on a single chip; System in package, i.e. computer system on one or more chips in a single package
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/10 File systems; File servers
    • G06F16/17 Details of further file system functions
    • G06F16/174 Redundancy elimination performed by the file system
    • G06F16/1744 Redundancy elimination performed by the file system using compression, e.g. sparse files

Abstract

The application discloses a data processing method, a data processing apparatus, and an electronic device. The method includes: acquiring a user association level of a target user; determining a first compression level for the target user according to the user association level; and performing target processing on first interactive data according to the first compression level, where the first interactive data is the interactive data corresponding to the target user. Embodiments of the application enable flexible management of interactive data and help optimize the storage space of the device.

Description

Data processing method and device and electronic equipment
Technical Field
The application belongs to the technical field of communication, and particularly relates to a data processing method and device and electronic equipment.
Background
When using software on an electronic device, such as social software, a large amount of text, pictures (including emoticons), video, audio, and files is generally generated, and this data is usually stored in the memory of the electronic device.
At present, electronic devices usually store data directly in the order it is generated, so the occupied storage space grows over time, causing the device to run slowly or freeze; when the data exceeds the maximum storage space of the device, new data cannot be stored at all. The data processing methods in the prior art therefore occupy a large amount of storage space.
Disclosure of Invention
The embodiments of the present application aim to provide a data processing method, a data processing apparatus, and an electronic device that can solve the prior-art problem of data processing occupying a large amount of storage space.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present application provides a data processing method, including:
acquiring a user association level of a target user;
determining a first compression level of the target user according to the user association level;
and performing target processing on first interactive data according to the first compression level, wherein the first interactive data is interactive data corresponding to the target user.
In a second aspect, an embodiment of the present application provides a data processing apparatus, including:
the acquisition module is used for acquiring the user association level of the target user;
the determining module is used for determining a first compression level of the target user according to the user association level;
and the processing module is used for carrying out target processing on first interactive data according to the first compression level, wherein the first interactive data is interactive data corresponding to the target user.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, and when executed by the processor, the program or instructions implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In the embodiment of the application, the user association level of the target user is obtained; determining a first compression level of the target user according to the user association level; and performing target processing on first interactive data according to the first compression level, wherein the first interactive data is interactive data corresponding to the target user. Therefore, the compression level corresponding to the target user can be set according to the user association level of the target user, and the target processing can be performed on the first interactive data according to the first compression level, so that the flexible management on the interactive data can be realized, and the optimization of the storage space of the equipment is facilitated.
Drawings
Fig. 1 is a flowchart of a data processing method provided in an embodiment of the present application;
FIG. 2 is a diagram illustrating a functional mapping relationship between a single influencing factor and a user association level;
FIG. 3 is a diagram illustrating a mapping relationship between a user association level and a compression level;
FIG. 4 is a schematic view of a browsing display of picture data;
FIG. 5 is a view of a page display of picture data;
FIG. 6 is a schematic diagram of a display after the picture data is completely decompressed;
fig. 7 is a block diagram of a data processing apparatus according to an embodiment of the present application;
fig. 8 is a block diagram of an electronic device provided in an embodiment of the present application;
fig. 9 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar elements and do not necessarily describe a particular sequence or chronological order. It should be appreciated that data so used may be interchanged under appropriate circumstances, so that the embodiments of the application may be practiced in sequences other than those illustrated or described herein. Moreover, the terms "first", "second", and the like do not limit the number of objects; for example, a first object may be one or more than one. In addition, "and/or" in the specification and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the objects before and after it.
The data processing method provided by the embodiments of the present application will be described in detail below through specific embodiments and application scenarios, with reference to the accompanying drawings.
Referring to fig. 1, fig. 1 is a flowchart of a data processing method provided in an embodiment of the present application, and as shown in fig. 1, the method includes the following steps:
step 101, obtaining a user association level of a target user.
The data processing method of the present embodiment is applied to a data processing apparatus or a control module for executing the data processing method in the data processing apparatus.
In this step, the target user may be a user who has interacted with a first user, that is, the target user may be an interactive user of the first user. The first user may be the user logged in to software on the data processing apparatus, such as social software. In an optional embodiment, the target user may be a user associated with the first user, that is, a user in the first user's friend list in the social software.
The user association level of the target user may characterize the relative distance of the target user to the first user.
For example, if the target user is in a special group of the first user, or is remarked as an interpersonal relationship important to the first user, the user association level of the target user may be relatively high; to a certain extent this indicates that the target user's relationship with the first user is relatively close, among the people the first user relatively prefers. If the target user is in other groups of the first user, the user association level may be relatively low, indicating to some extent that the relationship is relatively distant, among people the first user is less familiar with.
For another example, if the number of interactions between the first user and the target user is large, or many interaction modes are used, or the chat with the target user is pinned to the top, or the first user's dynamic data is open only to the target user, the user association level of the target user may be relatively high, indicating a relatively close relationship and a group the first user relatively prefers. If the number of interactions is small, or few interaction modes are used, or the target user has been blacklisted or blocked, or the first user's dynamic data is not open to the target user, the user association level may be low, indicating to some extent a relatively distant relationship and a group the first user is not interested in.
The obtaining mode of the user association level of the target user can be various.
For example, for the associated users of the first user, the user association level of each associated user may be stored in advance according to information such as each user's group or remark label. The user association levels may include multiple levels, for example from level 1 to level 5 from high to low, where level 1 indicates the closest relationship with the first user and level 5 the most distant. Accordingly, the user association level of the target user is queried from the pre-stored user association levels of all associated users.
For another example, the user association level of the target user may be determined from the interaction events between the first user and the target user: either from all interaction events since the first user began interacting with the target user, or from the interaction events within a preset time period, which is not specifically limited here.
In an optional embodiment, since the interaction frequency and interaction modes between the first user and the target user may differ from one time period to another, a time period may be set, for example one month; after it elapses, the user association level of the target user can be updated and re-labeled according to the interaction events in the previous time period. Of course, if there is much interaction in the current time period, the user association level may also be updated in time according to the interaction events of the current period, and the label marked accordingly.
The labels of the user association levels from high to low can be denoted: important people p1, friendly people p2, general people p3, strangers p4, and disliked people p5.
Of course, the user association level of the target user may also be determined by combining the interaction events between the first user and the target user with information such as the target user's group or remark label, as described in detail below.
Specifically, the user association level can be abstractly expressed as a multivariate function of the first user's interactions with the user, as shown in formula (1):
p = F(interactive operation type, interaction frequency, degree of attention, degree of importance determined from the user relationship)   (1)
Denoting the above influencing factors by x_i and assuming they are mutually independent and do not interfere with each other, formula (1) can be simplified as shown in formula (2).
p = f(x1, x2, x3, x4) = α1·f(x1) + α2·f(x2) + α3·f(x3) + α4·f(x4)   (2)
where α_i is the weight of the i-th factor, and the value range of p is [0, 1].
For a single influencing factor x_i, the mapping to the user association-level function p can be represented as shown in fig. 2. Combined with actual usage, the value schemes of the factors can be detailed as follows:
1) x1 represents the type of interactive operations with the user, i.e. the interaction modes between the first user and the user, which may include text chat, voice and video chat, phone calls, likes and comments, and so on. The more interaction modes used between the first user and the user during the last time period, the larger the value of x1; otherwise, the smaller.
2) x2 represents the interaction frequency with the user, i.e. how close the contact between the first user and the user is. The interaction frequency can be determined by the number of contacts with the user in the last time period: the more contacts, the larger the value of x2, and vice versa. The number of contacts may be the total count of text chats, voice and video chats, phone calls, and comment replies between the first user and the user.
3) x3 represents the degree of attention, i.e. the attention the first user pays to the user. The items that influence x3 may include: pinning the chat, blacklisting, blocking, and whether content is open only to the user. Positive attention, such as pinning the chat or opening content only to the user, increases the value of x3; negative attention, such as blacklisting, blocking, or not opening content to the user, decreases it.
4) x4 represents the degree of importance determined from the user relationship, i.e. how important the user is to the first user. This importance can generally be identified from the grouping and remark labels the first user assigns to the user: for remarks indicating relatives such as parents or children, or interpersonal relationships important in work such as a boss or key clients, x4 takes a larger value; otherwise the value of x4 is relatively small.
In addition, the values of the weights α_i are also important. For the four influencing factors above, the base weight can be set to 0.25 to ensure that the value of the association-level function p falls in [0, 1]. On this basis, small adjustments can be made according to the relative importance of the factors, as long as the α_i finally sum to 1.
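As a concrete illustration of formulas (1) and (2), the weighted scoring step can be sketched as follows. This is a minimal sketch under stated assumptions: the per-factor scores f(x_i) are taken as already normalized to [0, 1], and the example score values are invented for illustration; the document itself only fixes the base weight 0.25 and the constraint that the weights sum to 1.

```python
# Sketch of formula (2): p = sum(alpha_i * f(x_i)) over the four factors
# (interaction-mode variety, interaction frequency, attention, importance).

def association_score(factor_scores, weights=(0.25, 0.25, 0.25, 0.25)):
    """Return p in [0, 1] given four per-factor scores f(x_i) in [0, 1]."""
    if abs(sum(weights) - 1.0) > 1e-9:
        raise ValueError("weights must sum to 1")
    return sum(a * f for a, f in zip(weights, factor_scores))

# Hypothetical example: many interaction modes (0.9), frequent contact (0.8),
# pinned chat (0.7), remarked as family (1.0).
p = association_score((0.9, 0.8, 0.7, 1.0))  # 0.85
```

A score of 0.85 would land this user in the highest association band of the classification described below.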
Finally, the function value of the user association level of the target user is calculated by combining the interaction events between the first user and the target user with information such as the target user's group or remark label.
Then, the function values of the user association levels of the target users can be classified according to the following formula (3) and labeled.
[Formula (3) appears only as an image in the original. It classifies the computed value p into the five association levels and assigns each a representative value; per the examples in this document, 0.6 ≤ p < 0.8 maps to level 2 with p2 = 0.75, 0.4 ≤ p < 0.6 maps to level 3 with p3 = 0.5, and level 4 has the representative value p4 = 0.25.]
For example, if the calculated function value of the user association level of the target user is between 0.6 and 0.8, the target user may be classified into the friendly group, the user association level is level 2, the label is p2, and the function value of the user association level for this group may be defined as 0.75.
For another example, if the calculated function value is between 0.4 and 0.6, the target user may be classified into the general group, the user association level is level 3, the label is p3, and the function value for this group may be defined as 0.5.
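The classification step of formula (3) can be sketched as below. Since the original renders formula (3) only as an image, the interval boundaries at 0.8 and 0.2 and the representative values for levels 1 and 5 are assumptions; the text only confirms the representative values 0.75, 0.5, and 0.25 for levels 2–4.

```python
# Hedged sketch of formula (3): map an association score p in [0, 1]
# to (level number, label, representative function value).

def classify(p):
    if p >= 0.8:
        return 1, "p1 (important)", 0.9   # representative value assumed
    if p >= 0.6:
        return 2, "p2 (friendly)", 0.75   # confirmed by the text
    if p >= 0.4:
        return 3, "p3 (general)", 0.5     # confirmed by the text
    if p >= 0.2:
        return 4, "p4 (stranger)", 0.25   # confirmed by the text
    return 5, "p5 (disliked)", 0.1        # representative value assumed
```

For instance, `classify(0.7)` yields level 2 with label p2 and representative value 0.75, matching the friendly-group example above.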
And 102, determining a first compression level of the target user according to the user association level.
In this step, the first compression level of the target user may refer to the compression levels of all interactive data corresponding to the target user, and the interactive data corresponding to the target user may be the interactive data between the first user and the target user.
The compression level (denoted y) characterizes the degree to which the interactive data corresponding to the target user is compressed; different compression levels mean different degrees of compression. The compression levels can be divided into level 1 to level 5 as the degree of compression increases, with the corresponding degrees being zero compression, light compression, moderate compression, heavy compression, and deletion: the lower the level number, the smaller the degree of compression.
Zero compression, denoted y1, means that no compression operation is performed on the interactive data. Light compression, denoted y2, means the interactive data is lightly compressed, so the storage space it occupies is reduced compared with the uncompressed data. For example, picture data may be lightly compressed by extracting feature values or reducing pixel density; a lightly compressed picture can still be enlarged and viewed, but loses many details compared with the uncompressed picture, and its size is generally reduced by an order of magnitude.
Moderate compression, denoted y3, means the interactive data is moderately compressed; the occupied storage space is further reduced compared with light compression. For example, moderately compressed picture data keeps only the preview, becoming preview-only picture data, and all details of the picture are lost once it is enlarged.
Heavy compression, denoted y4, means the interactive data is heavily compressed; the occupied storage space is further reduced relative to moderate compression. For example, heavily compressed picture data does not even keep the preview: the picture data becomes just a picture identifier, reducing memory usage to the maximum extent.
Deletion, denoted y5, means the interactive data is deleted entirely.
The higher the degree of compression, the smaller the memory footprint of the compressed interactive data. If picture data occupies on the order of 1 MB, then a lightly compressed picture occupies on the order of 100 KB, a moderately compressed preview picture on the order of 10 KB, and after heavy compression no preview remains and only a picture identifier on the order of 1 KB is kept, compressing the picture data to the maximum extent.
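The five compression levels and their rough effect on a 1 MB picture, as described above, can be summarized in a small sketch. The byte figures are the document's order-of-magnitude examples, not outputs of any particular codec, and the dictionary layout is an illustrative assumption.

```python
# Sketch of the five compression levels (y1..y5) and the approximate
# memory footprint each leaves for a picture of ~1 MB, per the text above.

COMPRESSION_LEVELS = {
    1: ("zero compression", 1_000_000),   # y1: stored as-is, ~1 MB
    2: ("light compression", 100_000),    # y2: ~100 KB, details lost on zoom
    3: ("moderate compression", 10_000),  # y3: ~10 KB, preview only
    4: ("heavy compression", 1_000),      # y4: ~1 KB, identifier only
    5: ("delete", 0),                     # y5: data removed entirely
}

def bytes_after(level, original=1_000_000):
    """Approximate footprint after applying the given compression level."""
    _label, size = COMPRESSION_LEVELS[level]
    return min(size, original)
```

For example, `bytes_after(4)` gives the ~1 KB identifier-only footprint of heavy compression.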
The first compression level may be a set of compression levels, and the set may include one or more levels. For example, the interactive data corresponding to the target user may be processed according to a single compression level regardless of the interaction time of the data; in that case, the first compression level includes only one compression level.
In this embodiment, the mapping between the user association level and the compression-level function value y may be as shown in formula (4):
[Formula (4) appears only as an image in the original. It maps each user association level to a compression level for the interactive data; per the examples below, level 1 (important people) maps to no compression, and level 2 (friendly people) maps to a moderate overall degree of compression.]
As can be seen from formula (4), the compression level of the interactive data differs for different user association levels. For example, if the user association level of the target user is level 1, that is, the target user belongs to the important group for the first user, the first compression level is the level with the lowest degree of compression, and the data is not compressed. For another example, if the user association level is level 2, that is, a friendly group relative to the first user, the first compression level is a level with a moderate degree of compression.
In addition, from a practical point of view, for friendly and important people, that is, users with higher user association levels, the first user may never want the interactive data with them to be deleted. So for users with higher association levels, formula (4) may be modified, as shown in formula (5).
[Formula (5) appears only as an image in the original. It is formula (4) adjusted so that, for users with higher association levels, the compression level never reaches deletion.]
For another example, the interactive data corresponding to the target user may be processed according to different compression levels depending on the interaction time; for example, interactive data older than 12 months is compressed at the heavy-compression level, and interactive data within 6 months at the moderate level. In that case, the first compression level may include a plurality of compression levels.
In this embodiment, the compression-level function may be a multivariate function of the interactive data type, the user association level, and the interaction time, as shown in formula (6):
y = f(s_i, p_i, t)   (6)
where s_i is the data-type label, with i taking the values 1 to 6, corresponding respectively to text data, picture data (including emoticons), audio data, video data, file data, and other data; a data-type tag may be attached to the interactive data in advance according to its classification.
t is the interaction time and p_i is the user association level. Interactive data generated in different time periods influences the final compression result differently. With t in units of months, the influence of the interaction time can be expressed as shown in formula (7).
[Formula (7) appears only as an image in the original. It gives the time factor as a function of t in months: data generated within the last month adds nothing to the compression result in the time dimension, while older data contributes increasingly to the degree of compression.]
As can be seen from formula (7), interactive data generated in the last month does not affect the compression result in the time dimension, while data generated earlier has a greater effect on the degree of compression as the interaction time grows. That is, the compression levels of interactive data with different interaction times may differ; in this case, the first compression level may include a plurality of compression levels for different interaction times.
Further, for a specific data type, the label s_i is uniquely determined. Taking picture data as an example, the compression-level function for picture data, y(s2), can be simplified as shown in formula (8).
[Formula (8) appears only as an image in the original. It is formula (6) with the data type fixed to picture data (s2), leaving y as a function of the user association level p_i and the interaction time t.]
After simplification, f(p_i) is as shown in fig. 3. For a user whose relationship to the first user is close, who may frequently look up the related chat records, the y value is relatively large, representing a small degree of compression of the interactive data. For a distant user, whose data the first user has little need to consult, the y value is smaller, representing a greater degree of compression.
For example, if the user association level of the target user is level 2, that is, the target user belongs to the first user's friendly group with label p2, its function value per formula (3) is p2 = 0.75. The value of the compression-level function y(s2) is then calculated as shown in formula (9).
[Formula (9) appears only as an image in the original. It evaluates y(s2) with p2 = 0.75 over the interaction-time buckets.]
Combining formula (5) with the mapping between compression levels and function values y, the compression level of the interactive data, taking the interaction time into account, may be as shown in formula (10).
[Formula (10) appears only as an image in the original. Per the description below, for a level-2 user it assigns compression level 1 (zero compression) to data within one month, level 2 (light) to data from one to three months, level 3 (moderate) to data from three to six months, and level 4 (heavy) to data older than six months.]
As shown in formula (10), the first compression level of the target user includes level 1 (zero compression), level 2 (light compression), level 3 (moderate compression), and level 4 (heavy compression).
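The time-bucketed result of formula (10) for a level-2 (friendly) user can be sketched as follows. The bucket edges at 1, 3, and 6 months follow the worked example in the text; treating the boundaries as inclusive on the left bucket is an assumption, since the original formula is only an image.

```python
# Sketch of formula (10): compression level of picture data for a level-2
# (friendly, p2 = 0.75) user, as a function of the data's age in months.

def compression_level_friendly(age_months):
    if age_months <= 1:
        return 1   # zero compression
    if age_months <= 3:
        return 2   # light compression
    if age_months <= 6:
        return 3   # moderate compression
    return 4       # heavy compression; never deletion, per formula (5)
```

So a two-year-old photo exchanged with a friendly contact is heavily compressed but kept, consistent with the no-deletion adjustment of formula (5).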
For other types of interactive data, the compression level is determined in a manner similar to the above process, and will not be described herein again.
For another example, suppose the user association level of the target user is level 4, that is, the target user is a stranger to the first user; for instance, after adding the target user as a friend, the first user contacted them once or twice and then never again. The label is p4, whose function value per formula (3) is p4 = 0.25. The value of the compression-level function for picture data, y(s2), is then calculated as shown in formula (11).
[Formula (11) appears only as an image in the original. Per the description below, it assigns compression level 4 (heavy compression) to data within one month and level 5 (deletion) to older data.]
As can be seen from formula (11), the first compression level includes the heavy-compression level, level 4, and the deletion level, level 5.
And 103, performing target processing on first interactive data according to the first compression level, wherein the first interactive data is interactive data corresponding to the target user.
In this step, the first interactive data may be one or more pieces of data: data from a single interaction between the first user and the target user; multiple pieces of data from interactions within a preset time period, for example within one month or from one to three months ago; or all data since the first user began interacting with the target user.
To optimize the storage space of the data processing apparatus to the maximum extent, the following embodiments take as an example the case where the first interactive data is all data since the first user began interacting with the target user.
The first interactive data may include text data, picture data, video data, audio data, file data, and other data.
The management operation for the first interactive data may be determined from the first compression level, and the first interactive data may then be processed accordingly. The management operations include a compression operation, a deletion operation, and no compression: when the management operation is compression, the first interactive data is compressed; when it is deletion, the data is deleted; and when it is no compression, the data is left uncompressed.
Specifically, if the user association level of the target user is level 2, it can be seen from formula (10) that: for interactive data with the target user within one month, the management operation is no compression; for data from one to three months, light compression; for data from three to six months, moderate compression; and for data older than six months, heavy compression.
If the user association level of the target user is level 4, it can be seen from the above equation (11) that interactive data with the target user from within one month is heavily compressed, and interactive data from one month ago or earlier is deleted.
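The level-dependent schedules just described can be sketched as follows. The entries for levels 2 and 4 follow the examples given for equations (10) and (11); entries for any other level, and the exact threshold handling, would be assumptions and are omitted here.

```python
from datetime import datetime

# Illustrative policy table: association level -> list of
# (maximum age in days, management operation), per equations (10) and (11)
# as described in the text. Other levels are not specified there.
POLICY = {
    2: [(30, "none"), (90, "light"), (180, "moderate"), (float("inf"), "heavy")],
    4: [(30, "heavy"), (float("inf"), "delete")],
}

def management_operation(association_level: int, interaction_time: datetime,
                         now: datetime) -> str:
    """Return the management operation for one piece of interactive data."""
    age_days = (now - interaction_time).days
    for max_age, op in POLICY[association_level]:
        if age_days < max_age:
            return op
    return "delete"
```

With this table, a level-2 contact's month-old chat data is left untouched, while the same data for a level-4 contact is already scheduled for deletion.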
Thereafter, in a case where it is determined that the compression processing or the deletion processing needs to be performed, corresponding processing may be performed on the interactive data according to the determined management operation.
Of course, if the management operation for certain interactive data is the no-compression operation, the data processing apparatus performs no compression on that data. For example, interactive data exchanged within the last month between the first user and users in the friendly group may be left untouched, while other interactive data is compressed. In this way, the storage space is optimized without preventing the first user from quickly viewing recent, valuable interactive data, which improves the user experience.
In this embodiment, the user association level of the target user is obtained; a first compression level of the target user is determined according to the user association level; and target processing is performed on the first interactive data according to the first compression level, the first interactive data being the interactive data corresponding to the target user. In this way, a compression level can be set for the target user according to that user's association level, the management operation for the corresponding interactive data can be determined from the compression level, and the first interactive data can then be processed accordingly. This enables flexible management of interactive data and helps optimize the storage space of the device.
Optionally, based on the first embodiment, the step 103 specifically includes:
determining interaction time corresponding to the first interaction data;
determining a target compression level corresponding to the first interactive data according to the first compression level and the interactive time;
and performing target processing on the first interactive data according to the target compression level.
In this embodiment, since the interaction time of interaction data affects how that data should be handled, the interaction time corresponding to the first interaction data may be determined in a specific implementation.
The interaction time may be the time at which the first user interacted with the target user, or the time at which the first user later browsed the first interaction data again; that is, if the first user views earlier chat data after the original interaction, the interaction time corresponding to the first interaction data may be updated to the viewing time.
Then, a target compression level corresponding to the first interaction data may be determined according to the first compression level and the interaction time. Specifically, as shown in equations (10) and (11) above, the management operation on the first interaction data differs depending on its interaction time and first compression level, and the way the data is processed differs correspondingly. For example, per equation (10), if the interaction time of the first interaction data falls between one and three months ago, the target compression level is the light compression level, whereas if it falls within the last month, the target compression level is the uncompressed level.
When the target compression level is the uncompressed level, the management operation is to perform no compression, and accordingly no compression processing is performed on the first interactive data.
When the target compression level is the light compression level, the management operation is to compress in a light compression mode, and accordingly the first interactive data is lightly compressed.
When the target compression level is the moderate compression level, the management operation is to compress in a moderate compression mode, and accordingly the first interactive data is moderately compressed.
When the target compression level is the heavy compression level, the management operation is to compress in a heavy compression mode, and accordingly the first interactive data is heavily compressed.
When the target compression level is the deletion level, the management operation is the deletion operation, and accordingly the first interactive data is deleted.
For different types of first interactive data, the light compression mode, the moderate compression mode, and the heavy compression mode differ.
For example, for picture data, light compression may extract feature values and reduce the pixel density, moderate compression may retain only a picture preview, and heavy compression may retain only a picture identifier.
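A minimal sketch of these three picture-compression modes, with `zlib` and a SHA-256 digest standing in for a real image codec and picture identifier (both stand-ins are assumptions; the patent does not name concrete codecs):

```python
import hashlib
import zlib

def compress_picture(picture: bytes, level: str) -> dict:
    """Sketch of the three picture-compression modes described in the text."""
    if level == "light":
        # Light: keep the picture itself, recompressed at reduced fidelity.
        return {"kind": "image", "payload": zlib.compress(picture, 6)}
    if level == "moderate":
        # Moderate: retain only a small preview (here crudely modeled by
        # truncating before compressing -- an illustration, not a real codec).
        return {"kind": "preview",
                "payload": zlib.compress(picture[: len(picture) // 8], 9)}
    if level == "heavy":
        # Heavy: retain only a picture identifier, no pixel data at all.
        return {"kind": "identifier",
                "payload": hashlib.sha256(picture).hexdigest().encode()}
    # Uncompressed: keep the original picture as-is.
    return {"kind": "image", "payload": picture}
```

The identifier kept under heavy compression is what allows the browsing interface to later show a placeholder where the picture used to be.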
For video data and audio data, the compression rules may be the same. Taking video data as an example, depending on how drastically the scenes change, some videos have low compressibility and others high compressibility. Therefore, whatever a video's compressibility, the video data is simply tagged as lightly, moderately, or heavily compressed and is compressed to the maximum extent achievable, thereby ensuring the spatial compression rate.
For text data, light, moderate, and heavy compression require different compression means, so that the space occupied by the compressed text data decreases in that order. When the user searches earlier chat records, the responsiveness to a search keyword decreases correspondingly: lightly compressed text data can respond to a search request fastest, while heavily compressed text data responds more slowly than lightly or moderately compressed text data.
In this embodiment, the interaction time corresponding to the first interaction data is determined; a target compression level corresponding to the first interaction data is determined according to the first compression level and the interaction time; the management operation on the first interaction data is determined according to the target compression level; and the data is then processed accordingly. By determining management operations for first interaction data of different interaction times, interactive data can be compressed differently along the time dimension. This further enables flexible management of interactive data: storage space is optimized without preventing the first user from quickly viewing recent interaction data, which improves the user experience.
Optionally, the performing target processing on the first interaction data according to the target compression level includes:
under the condition that the target compression level is lower than the historical compression level of the first interactive data and higher than the lowest compression level, performing compression processing on the first interactive data according to the target compression level;
and under the condition that the target compression level is the lowest compression level, deleting the first interactive data.
In this embodiment, the target compression level of the first interactive data is updated with the passage of time. For example, in May, the first interactive data may have an interaction time of one to three months ago, so its target compression level is the light compression level. By July, its interaction time is three to six months ago, so its target compression level becomes the moderate compression level, while its historical compression level is still the light compression level. The target compression level is then lower than the historical compression level and higher than the lowest compression level (namely, the deletion level), and in this case the first interactive data is compressed further, on top of the light compression, so that only the preview picture remains.
If the target compression level is the lowest compression level, the first interactive data is deleted directly.
In this embodiment, as time goes on, the first interactive data is gradually compressed further and eventually deleted. This avoids the problem of the occupied storage space growing ever larger over time, frees a large amount of storage space, and increases the operating speed of the data processing apparatus.
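The maintenance pass described above — compress further only when the newly computed target level is lower than the historical level, and delete at the lowest level — can be sketched as:

```python
# Compression levels ordered from highest (uncompressed) to lowest (deletion),
# matching the text's convention that a "lower" level means more compression.
LEVELS = ["none", "light", "moderate", "heavy", "delete"]

def update_record(record: dict, target_level: str) -> dict:
    """Re-evaluate one stored record against its newly computed target level.

    Sketch only: the record carries a "level" field, and escalation is
    modeled by relabeling rather than by actually recompressing payloads.
    """
    if target_level == "delete":
        return {}  # lowest level: delete the data outright
    if LEVELS.index(target_level) > LEVELS.index(record["level"]):
        # Target is lower than the historical level: compress further.
        record["level"] = target_level
    # Never decompress during a maintenance pass.
    return record
```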
Optionally, the step 101 specifically includes:
acquiring target information associated with the target user;
determining the user association level of the target user according to the target information;
wherein the target information comprises at least one of: interactive operation type, interactive frequency, degree of attention and degree of importance determined based on user relationship.
In this embodiment, the user association level of the target user may be determined based on interaction events between the first user and the target user and on information such as the group or remark label to which the target user belongs.
Specifically, at least one of the target user's interactive operation type, interaction frequency, degree of attention, and degree of importance determined based on the user relationship may be obtained. In an optional implementation, a function value of the target user's user association level may be computed from these quantities according to equation (2) above, and the user association level may then be determined from that function value according to equation (3) above.
For example, if the function value of the target user's user association level is 0.7, which lies between 0.6 and 0.8, the user association level of the target user is level 2, and the target user belongs to the first user's friendly group.
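A sketch of equations (2) and (3): a weighted function value is computed from the four quantities and then mapped to a level through threshold bands. Only the band (0.6, 0.8] → level 2 is stated in the text; the equal weights and the other bands are assumptions for illustration.

```python
# Assumed equal weights for the four quantities named in the text.
WEIGHTS = {"op_type": 0.25, "frequency": 0.25,
           "attention": 0.25, "importance": 0.25}

# (lower bound, level): only (0.6, 0.8] -> 2 comes from the text's example;
# the remaining bands are assumptions.
BANDS = [(0.8, 1), (0.6, 2), (0.4, 3), (0.0, 4)]

def association_level(features: dict) -> int:
    # Equation (2) sketch: weighted sum of the normalized quantities.
    score = sum(WEIGHTS[k] * features[k] for k in WEIGHTS)
    # Equation (3) sketch: map the function value to a level band.
    for lower, level in BANDS:
        if score > lower:
            return level
    return BANDS[-1][1]
```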
In this embodiment, the user association level of the target user is determined from at least one of the interactive operation type, the interaction frequency, the degree of attention, and the degree of importance determined based on the user relationship, which characterizes the relationship between the first user and the target user well. The management operation on the first interactive data can therefore be determined accurately from the user association level, and the data processed accordingly, which avoids deleting important data or hindering the user from quickly viewing it.
Optionally, after the step 103, the method further includes:
displaying target interaction data in a browsing interface under the condition that a first input of the first interaction data is received;
the target interactive data is at least part of the first interactive data, and the target interactive data comprises at least one of video data, picture data and text data.
In this embodiment, when the first user browses past chat records to view the first interactive data, the first input is a browsing input for the first interactive data; when the first user searches the first interactive data, the first input is a search input for the first interactive data.
In the case that a browsing input or a search input for the first interaction data is received, target interaction data may be displayed in the browsing interface, wherein the target interaction data may be at least part of the first interaction data.
Optionally, in a case that the target interaction data includes picture data, the displaying the target interaction data in the browsing interface includes:
displaying the picture data according to a target display state, wherein the target display state is associated with the current compression level of the picture data;
and when the picture data is in a compressed state, the target display state is a preview display state or a label display state.
The current compression level of the picture data may be the target compression level, or the compression level after a first-level decompression operation has been performed, where a first-level decompression operation refers to decompressing from heavy to moderate compression, from moderate to light compression, or from light compression to no compression.
The target display state is associated with the current compression level of the picture data. For example, when the picture data is uncompressed, the target display state may be the original picture with relatively high definition; when the picture data is lightly or moderately compressed, the target display state may be the preview display state; and when the picture data is heavily compressed, the target display state may be the label display state, that is, only a picture identifier is displayed to indicate that picture data exists at that position.
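The association between compression state and display state, together with the first-level decompression step defined above, can be sketched as two small lookup functions (state names are illustrative):

```python
# Compression states ordered from most to least compressed.
ORDER = ["heavy", "moderate", "light", "none"]

# Display state associated with each current compression state, per the text.
DISPLAY = {"none": "original", "light": "preview",
           "moderate": "preview", "heavy": "identifier"}

def decompress_one_level(level: str) -> str:
    """One first-level decompression step: heavy -> moderate -> light -> none;
    already-uncompressed data is left unchanged."""
    return ORDER[min(ORDER.index(level) + 1, len(ORDER) - 1)]
```

For instance, one decompression step on a heavily compressed picture yields a moderately compressed one, whose display state is a preview rather than a bare identifier.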
As shown in fig. 4, when the first user browses the chat records, pictures 401 and 402 enter the screen: picture 401 is in the upper-middle portion of the screen, while picture 402 has just slid in at the bottom. The loading policy is as follows: picture 401 is lightly compressed with a preview retained, so the user can see the general outline of the picture through the preview; picture 402 loads only a picture identifier and remains heavily compressed.
For text data, when the user searches a keyword or browses past chat records, lightly compressed text data is decompressed and restored to the browsing interface fastest, while heavily compressed data is decompressed slowest and restored to the browsing interface last.
For video data or audio data, when the user browses the chat records to view a video, roughly the next five minutes of video or audio data may be decompressed in advance according to the user's current position on the progress bar.
Optionally, after the picture data is displayed according to the target display state, the method further includes:
receiving a second input to the browsing interface;
and responding to the second input, increasing the compression level of the picture data, and updating the display state of the picture data.
When the first user performs a sliding input in the browsing interface to browse past chat records and view pictures, the data processing apparatus may decompress the compressed pictures in response to the sliding input. When the first user browses the chat records of a certain period, the data processing apparatus may decompress the picture data of that period in advance, increasing the compression level of the picture data and updating its display state to match the compression level of the decompressed picture data.
The specific decompression steps are as follows: a lightly compressed picture is fully decompressed, and the user can view the uncompressed complete picture by clicking its preview; a moderately compressed picture is first decompressed to light compression, so the user can see more detail; a heavily compressed picture is first decompressed to moderate compression so that the user can see a preview, and when the user clicks the preview, the picture is fully decompressed and presented complete to the user.
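These stepwise rules can be sketched as two small transition functions over the compression states (state names are illustrative):

```python
def on_scroll(level: str) -> str:
    """Pre-decompression applied when a picture scrolls into view:
    light -> fully decompressed; moderate -> light; heavy -> moderate."""
    return {"light": "none", "moderate": "light",
            "heavy": "moderate", "none": "none"}[level]

def on_click(level: str) -> str:
    """Clicking a preview always yields the complete, uncompressed picture,
    regardless of the current compression state."""
    return "none"
```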
As shown in fig. 5, when the first user performs a sliding input in the browsing interface, picture 401 displays the prompt "decompressing, please wait …", and picture 402 loads a moderately compressed preview for the user to view. After picture 401 finishes loading, the user clicks picture 401 and it is displayed in full, as shown in fig. 6.
In this embodiment, when a first input to the first interactive data is received, the picture data is displayed according to the target display state; when a second input to the browsing interface is received, the compression level of the picture data is increased in response and its display state is updated. The interactive data is thus compressed to save storage space without preventing the user from viewing the first interactive data, which improves the user experience.
It should be noted that the execution body of the data processing method provided in the embodiments of the present application may be a data processing apparatus, or a control module in the data processing apparatus for executing the data processing method. The data processing apparatus provided in the embodiments of the present application is described below, taking a data processing apparatus that executes the data processing method as an example.
Referring to fig. 7, fig. 7 is a structural diagram of a data processing apparatus according to an embodiment of the present application, and as shown in fig. 7, a data processing apparatus 700 includes:
an obtaining module 701, configured to obtain a user association level of a target user;
a determining module 702, configured to determine a first compression level of the target user according to the user association level;
a processing module 703 is configured to perform target processing on first interaction data according to the first compression level, where the first interaction data is interaction data corresponding to the target user.
Optionally, the processing module 703 includes:
the first determining unit is used for determining the interaction time corresponding to the first interaction data;
a second determining unit, configured to determine a target compression level corresponding to the first interactive data according to the first compression level and the interaction time;
and the processing unit is used for carrying out target processing on the first interactive data according to the target compression level.
Optionally, the processing unit is specifically configured to: compress the first interactive data according to the target compression level when the target compression level is lower than the historical compression level of the first interactive data and higher than the lowest compression level; and delete the first interactive data when the target compression level is the lowest compression level.
Optionally, the obtaining module 701 includes:
the acquisition unit is used for acquiring target information associated with the target user;
a third determining unit, configured to determine a user association level of the target user according to the target information;
wherein the target information comprises at least one of: interactive operation type, interactive frequency, degree of attention and degree of importance determined based on user relationship.
Optionally, the apparatus further comprises:
the display module is used for displaying target interaction data in a browsing interface under the condition that first input of the first interaction data is received;
the target interactive data is at least part of the first interactive data, and the target interactive data comprises at least one of video data, picture data and text data.
Optionally, the display module is specifically configured to display the picture data according to a target display state when the target interaction data includes the picture data, where the target display state is associated with a current compression level of the picture data; and when the picture data is in a compressed state, the target display state is a preview display state or a label display state.
Optionally, the apparatus further comprises:
the receiving module is used for receiving second input of the browsing interface;
an increase module for increasing a compression level of the picture data in response to the second input;
and the updating module is used for updating the display state of the picture data.
In this embodiment, the user association level of the target user is obtained through the obtaining module 701; determining, by a determining module 702, a first compression level of the target user according to the user association level; and performing target processing on first interactive data according to the first compression level through a processing module 703, where the first interactive data is interactive data corresponding to the target user. Therefore, the compression level corresponding to the target user can be set according to the user association level of the target user, and the target processing can be performed on the first interactive data according to the first compression level, so that the flexible management on the interactive data can be realized, and the optimization of the storage space of the equipment is facilitated.
The data processing apparatus in the embodiments of the present application may be a device, or a component, an integrated circuit, or a chip in a terminal. The apparatus may be a mobile electronic device or a non-mobile electronic device. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a Personal Digital Assistant (PDA); the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine, or a self-service machine; the embodiments of the present application are not particularly limited in this respect.
The data processing apparatus in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android (Android) operating system, an ios operating system, or other possible operating systems, and embodiments of the present application are not limited specifically.
The data processing apparatus provided in the embodiment of the present application can implement each process implemented in the method embodiment of fig. 1, and is not described here again to avoid repetition.
Optionally, referring to fig. 8, fig. 8 is a structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 8, an embodiment of the present application further provides an electronic device, including a processor 801, a memory 802, and a program or instruction stored in the memory 802 and executable on the processor 801. When the program or instruction is executed by the processor 801, each process of the data processing method embodiment is implemented with the same technical effect; details are not repeated here to avoid repetition.
It should be noted that the electronic devices in the embodiments of the present application include the mobile electronic devices and the non-mobile electronic devices described above.
Fig. 9 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 900 includes, but is not limited to: a radio frequency unit 901, a network module 902, an audio output unit 903, an input unit 904, a sensor 905, a display unit 906, a user input unit 907, an interface unit 908, a memory 909, and a processor 910.
Those skilled in the art will appreciate that the electronic device 900 may further include a power source (e.g., a battery) for supplying power to various components, and the power source may be logically connected to the processor 910 through a power management system, so as to manage charging, discharging, and power consumption management functions through the power management system. The electronic device structure shown in fig. 9 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than those shown, or combine some components, or arrange different components, and thus, the description is not repeated here.
The processor 910 is configured to obtain a user association level of a target user; determining a first compression level of the target user according to the user association level; and performing target processing on first interactive data according to the first compression level, wherein the first interactive data is interactive data corresponding to the target user.
In this embodiment, the processor 910 obtains the user association level of the target user; determining a first compression level of the target user according to the user association level; and performing target processing on first interactive data according to the first compression level, wherein the first interactive data is interactive data corresponding to the target user. Therefore, the compression level corresponding to the target user can be set according to the user association level of the target user, and the target processing can be performed on the first interactive data according to the first compression level, so that the flexible management on the interactive data can be realized, and the optimization of the storage space of the equipment is facilitated.
Optionally, the processor 910 is further configured to determine an interaction time corresponding to the first interaction data; determining a target compression level corresponding to the first interactive data according to the first compression level and the interactive time; and performing target processing on the first interactive data according to the target compression level.
Optionally, the processor 910 is further configured to, when the target compression level is lower than a historical compression level of the first interaction data and the target compression level is higher than a lowest compression level, perform compression processing on the first interaction data according to the target compression level; and under the condition that the target compression level is the lowest compression level, deleting the first interactive data.
Optionally, the processor 910 is further configured to obtain target information associated with the target user; determining the user association level of the target user according to the target information;
wherein the target information comprises at least one of: interactive operation type, interactive frequency, degree of attention and degree of importance determined based on user relationship.
Optionally, the display unit 906 is configured to display the target interaction data in the browsing interface when the first input of the first interaction data is received;
the target interactive data is at least part of the first interactive data, and the target interactive data comprises at least one of video data, picture data and text data.
Optionally, the display unit 906 is further configured to display the picture data according to a target display state when the target interaction data includes the picture data, where the target display state is associated with a current compression level of the picture data;
and when the picture data is in a compressed state, the target display state is a preview display state or a label display state.
Optionally, the processor 910 is further configured to receive a second input to the browsing interface; and responding to the second input, increasing the compression level of the picture data, and updating the display state of the picture data.
It should be understood that, in the embodiment of the present application, the input Unit 904 may include a Graphics Processing Unit (GPU) 9041 and a microphone 9042, and the Graphics Processing Unit 9041 processes image data of a still picture or a video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 906 may include a display panel 9061, and the display panel 9061 may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 907 includes a touch panel 9071 and other input devices 9072. The touch panel 9071, also referred to as a touch screen, may include two parts: a touch detection device and a touch controller. Other input devices 9072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail here. The memory 909 can be used to store software programs as well as various data, including but not limited to application programs and an operating system. The processor 910 may integrate an application processor, which primarily handles the operating system, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications. It is to be appreciated that the modem processor may not be integrated into the processor 910.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the data processing method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the data processing method embodiment, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-chip.
It should be noted that, in this document, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatuses of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed; the functions may also be performed in a substantially simultaneous manner or in a reverse order, depending on the functions involved. For example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and certainly can also be implemented by hardware; in many cases, however, the former is the better implementation. Based on such an understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive. Various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (11)

1. A data processing method, comprising:
acquiring a user association level of a target user;
determining a first compression level of the target user according to the user association level;
and performing target processing on first interactive data according to the first compression level, wherein the first interactive data is interactive data corresponding to the target user.
2. The method of claim 1, wherein the target processing of the first interaction data according to the first compression level comprises:
determining interaction time corresponding to the first interaction data;
determining a target compression level corresponding to the first interactive data according to the first compression level and the interactive time;
and performing target processing on the first interactive data according to the target compression level.
3. The method of claim 2, wherein the target processing of the first interaction data according to the target compression level comprises:
under the condition that the target compression level is lower than the historical compression level of the first interactive data and higher than the lowest compression level, performing compression processing on the first interactive data according to the target compression level;
and under the condition that the target compression level is the lowest compression level, deleting the first interactive data.
4. The method of claim 1, wherein the obtaining the user association level of the target user comprises:
acquiring target information associated with the target user;
determining the user association level of the target user according to the target information;
wherein the target information comprises at least one of: interactive operation type, interactive frequency, degree of attention and degree of importance determined based on user relationship.
5. The method of claim 1, wherein after the target processing of the first interaction data according to the first compression level, the method further comprises:
displaying target interaction data in a browsing interface under the condition that a first input of the first interaction data is received;
the target interactive data is at least part of the first interactive data, and the target interactive data comprises at least one of video data, picture data and text data.
6. The method of claim 5, wherein in the case that the target interaction data comprises picture data, the displaying the target interaction data in the browsing interface comprises:
displaying the picture data according to a target display state, wherein the target display state is associated with the current compression level of the picture data;
and when the picture data is in a compressed state, the target display state is a preview display state or a label display state.
7. The method of claim 6, wherein after displaying the picture data in the target display state, the method further comprises:
receiving a second input to the browsing interface;
and responding to the second input, increasing the compression level of the picture data, and updating the display state of the picture data.
8. A data processing apparatus, comprising:
the acquisition module is used for acquiring the user association level of the target user;
the determining module is used for determining a first compression level of the target user according to the user association level;
and the processing module is used for carrying out target processing on first interactive data according to the first compression level, wherein the first interactive data is interactive data corresponding to the target user.
9. The apparatus of claim 8, wherein the processing module comprises:
the first determining unit is used for determining the interaction time corresponding to the first interaction data;
a second determining unit, configured to determine a target compression level corresponding to the first interactive data according to the first compression level and the interaction time;
and the processing unit is used for carrying out target processing on the first interactive data according to the target compression level.
10. The apparatus according to claim 9, wherein the processing unit is specifically configured to, in a case that the target compression level is lower than a historical compression level of the first interaction data and the target compression level is higher than a lowest compression level, perform compression processing on the first interaction data according to the target compression level; and under the condition that the target compression level is the lowest compression level, deleting the first interactive data.
11. An electronic device comprising a processor, a memory, and a program or instructions stored on the memory and executable on the processor, wherein the program or instructions, when executed by the processor, implement the steps of the data processing method according to any one of claims 1 to 7.
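The method of claims 1 to 3 can be summarized in a minimal sketch: derive a first compression level from the user association level, age it by the interaction time, then compress or delete accordingly. This sketch is illustrative only and is not part of the claims; the association-level mapping, the 30-day aging step, and all names below are assumptions.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

# Assumed levels: higher value = less compressed; 0 is the lowest level.
LOWEST_LEVEL = 0
LEVEL_BY_ASSOCIATION = {"close": 3, "regular": 2, "distant": 1}  # assumed mapping

@dataclass
class InteractionData:
    payload: bytes
    interaction_time: datetime
    compression_level: int  # current (historical) compression level

def first_compression_level(user_association: str) -> int:
    """Claims 1-2: determine the first compression level from the user association level."""
    return LEVEL_BY_ASSOCIATION.get(user_association, 1)

def target_compression_level(first_level: int,
                             interaction_time: datetime,
                             now: datetime) -> int:
    """Claim 2: lower the level one step per 30 days of age (assumed aging policy)."""
    age_steps = (now - interaction_time) // timedelta(days=30)
    return max(LOWEST_LEVEL, first_level - age_steps)

def target_process(data: InteractionData, target_level: int) -> Optional[InteractionData]:
    """Claim 3: compress when the target level is below the historical level but
    above the lowest level; delete when the target level is the lowest level."""
    if target_level == LOWEST_LEVEL:
        return None  # delete the interaction data
    if target_level < data.compression_level:
        data.compression_level = target_level  # recompress (placeholder for real codec)
    return data
```

For example, an interaction from a "close" contact starts at level 3; two 30-day periods later its target level is 1, so it is recompressed rather than deleted, while data that ages down to level 0 is removed.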
CN202011319628.XA 2020-11-23 2020-11-23 Data processing method and device and electronic equipment Pending CN112433996A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011319628.XA CN112433996A (en) 2020-11-23 2020-11-23 Data processing method and device and electronic equipment


Publications (1)

Publication Number Publication Date
CN112433996A true CN112433996A (en) 2021-03-02

Family

ID=74693592

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011319628.XA Pending CN112433996A (en) 2020-11-23 2020-11-23 Data processing method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN112433996A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009077271A (en) * 2007-09-21 2009-04-09 Ricoh Co Ltd Image processor, image processing system, image processing method, program, and storage medium
US20120057802A1 (en) * 2010-03-18 2012-03-08 Yasuhiro Yuki Data processing device and data processing method
CN107589910A (en) * 2017-09-01 2018-01-16 厦门集微科技有限公司 The method and system of the high in the clouds data management of user's custom strategies
CN109032506A (en) * 2018-06-27 2018-12-18 郑州云海信息技术有限公司 A kind of memory system data compression method, system and equipment and storage medium
CN111277274A (en) * 2020-01-13 2020-06-12 平安国际智慧城市科技股份有限公司 Data compression method, device, equipment and storage medium
CN111327764A (en) * 2020-01-20 2020-06-23 深圳传音控股股份有限公司 Information sharing method, terminal and readable storage medium


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
D. DONG et al.: "Record-Aware Two-Level Compression for Big Textual Data Analysis Acceleration", 2015 IEEE 7th International Conference on Cloud Computing Technology and Science (CloudCom) *
WANG Haiyan; FU Caihang: "Compression strategy selection method based on HBase data classification", Journal on Communications, no. 04

Similar Documents

Publication Publication Date Title
CN113157906B (en) Recommendation information display method, device, equipment and storage medium
CN112486385A (en) File sharing method and device, electronic equipment and readable storage medium
CN112286887A (en) File sharing method and device and electronic equipment
CN112836086B (en) Video processing method and device and electronic equipment
CN112083854A (en) Application program running method and device
CN113325978B (en) Message display method and device and electronic equipment
CN112835859A (en) Information sharing method and device and electronic equipment
CN113114845A (en) Notification message display method and device
CN112416212A (en) Program access method, device, electronic equipment and readable storage medium
CN112035877A (en) Information hiding method and device, electronic equipment and readable storage medium
CN112269504A (en) Information display method and device and electronic equipment
US20060282788A1 (en) System and method for creating and utilizing context-sensitive popularity data
CN112433996A (en) Data processing method and device and electronic equipment
CN113364915B (en) Information display method and device and electronic equipment
CN112291412B (en) Application program control method and device and electronic equipment
CN113239212A (en) Information processing method and device and electronic equipment
CN113805997A (en) Information display method and device, electronic equipment and storage medium
CN112286615A (en) Information display method and device of application program
CN112882789A (en) Information display method and device, electronic equipment and storage medium
CN114928761B (en) Video sharing method and device and electronic equipment
CN112764553B (en) Chat expression collection method and device and electronic equipment
CN112596646B (en) Information display method and device and electronic equipment
CN112035032B (en) Expression adding method and device
CN111813285B (en) Floating window management method and device, electronic equipment and readable storage medium
CN116089474B (en) Data caching method, device, equipment and medium in custom editing mode

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination