CN109753572B - Method and device for processing multimedia data - Google Patents

Method and device for processing multimedia data

Info

Publication number
CN109753572B
CN109753572B (application CN201811603455.7A)
Authority
CN
China
Prior art keywords
multimedia data
target
attribute information
data group
packet
Prior art date
Legal status
Active
Application number
CN201811603455.7A
Other languages
Chinese (zh)
Other versions
CN109753572A (en)
Inventor
赵言
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201811603455.7A
Publication of CN109753572A
Application granted
Publication of CN109753572B
Legal status: Active

Abstract

Embodiments of the invention provide a method and a device for processing multimedia data, applied to a mobile terminal. The method comprises the following steps: detecting a plurality of multimedia data stored in the mobile terminal, and grouping the plurality of multimedia data to obtain one or more multimedia data groups; merging each multimedia data group separately to obtain one or more merged packets; and, when a first operation of a user on a target merged packet is detected, decomposing the target merged packet and displaying the multimedia data group obtained by decomposing the target merged packet. The embodiments of the invention give the user control over the multimedia data in the mobile terminal, reduce the storage space the multimedia data occupies, and allow the multimedia data the user is looking for to be located quickly.

Description

Method and device for processing multimedia data
Technical Field
The present invention relates to the field of mobile terminal technologies, and in particular, to a method and an apparatus for processing multimedia data.
Background
At present, users store more and more data in mobile terminals, especially multimedia data such as images, and because of factors such as rising resolution the storage space this multimedia data occupies keeps growing.
On one hand, a user can expand the storage space through hardware, but hardware expansion is expensive and difficult to carry out; on the other hand, the user can move data from the mobile terminal into a network space such as a cloud disk, but the network space raises storage security concerns, and the data can only be used on the mobile terminal after being downloaded back from the network space, which is inconvenient.
Disclosure of Invention
The embodiments of the invention provide a method and a device for processing multimedia data, so as to solve the problem that multimedia data occupies a large amount of storage space.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides a method for processing multimedia data, which is applied to a mobile terminal, and includes:
detecting a plurality of multimedia data stored in a mobile terminal, and grouping the plurality of multimedia data to obtain one or more multimedia data groups;
respectively merging each multimedia data group to obtain one or more merged packets; the size of the merged packet is smaller than that of the multimedia data group corresponding to the merged packet;
when a first operation of a user for a target merging packet is detected, the target merging packet is decomposed, and a multimedia data group after decomposition of the target merging packet is displayed.
In a second aspect, an embodiment of the present invention further provides a device for processing multimedia data, which is applied to a mobile terminal, and includes:
the grouping module is used for detecting a plurality of multimedia data stored in the mobile terminal and grouping the plurality of multimedia data to obtain one or more multimedia data groups;
the merging module is used for merging each multimedia data group respectively to obtain one or more merged packets; the size of the merged packet is smaller than that of the multimedia data group corresponding to the merged packet;
and the display module is used for decomposing the target merged packet and displaying the multimedia data group after the decomposition of the target merged packet when detecting the first operation of the user for the target merged packet.
The embodiment of the invention has the following advantages:
in the embodiments of the invention, a plurality of multimedia data stored in the mobile terminal are detected and grouped to obtain one or more multimedia data groups, each multimedia data group is then merged separately to obtain one or more merged packets, and, when a first operation of the user on a target merged packet is detected, the target merged packet is decomposed and the multimedia data group obtained by decomposing it is displayed. This gives the user control over the multimedia data in the mobile terminal, reduces the storage space the multimedia data occupies, and allows the multimedia data the user is looking for to be located quickly.
Drawings
FIG. 1 is a flow chart of a method of multimedia data processing according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a user interface of an embodiment of the present invention;
FIG. 3 is a schematic diagram of another user interface of an embodiment of the present invention;
FIG. 4 is a schematic diagram of another user interface of an embodiment of the present invention;
FIG. 5 is a schematic view of another user interface of an embodiment of the present invention;
fig. 6 is a block diagram of an apparatus for multimedia data processing according to an embodiment of the present invention;
fig. 7 is a schematic diagram of a hardware structure of a mobile terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, a flowchart of steps of a method for processing multimedia data according to an embodiment of the present invention is shown. The method is applied to a mobile terminal, where the mobile terminal may include a smartphone, a tablet computer, a notebook computer, a PDA (Personal Digital Assistant), and the like, and the operating system of the mobile terminal may include Android, iOS, Windows Phone, Windows, and the like, which is not limited by the present invention.
Specifically, the method can comprise the following steps:
step 101, detecting a plurality of multimedia data stored in a mobile terminal, and grouping the plurality of multimedia data to obtain one or more multimedia data groups;
as an example, the multimedia data may include images, videos, music, documents, and the like.
In practical applications, the mobile terminal may be scanned either in full or under a specified path to determine the stored multimedia data, such as a plurality of images, and the plurality of multimedia data may then be grouped to obtain one or more multimedia data groups.
In an embodiment of the present invention, step 101 may include the following sub-steps:
substep 11, determining attribute information corresponding to the plurality of multimedia data respectively;
as an example, the attribute information may include time information, location information, feature information.
The time information may be the time at which the multimedia data was generated, such as the time an image was captured; the position information may be the position at which the multimedia data was generated, such as the position where an image was captured; and the feature information may be features recorded in the multimedia data, such as facial features in an image.
After determining the plurality of multimedia data, attribute information corresponding to each multimedia data may be determined by analyzing a source file of the multimedia data.
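For illustration only, and not as part of the patented disclosure, a minimal Python sketch of substep 11 is given below; it uses the file modification time as a stand-in for the generation time and a hypothetical read_gps() helper for the position, whereas a real implementation would parse the source file (for example EXIF metadata) as described above.

import os
from datetime import datetime

def read_gps(path):
    # Hypothetical helper: a real implementation would parse GPS tags from the
    # source file (e.g. EXIF in an image); None means the position is unknown.
    return None

def get_attribute_info(path):
    # Substep 11: determine the attribute information of one piece of multimedia data.
    return {
        "path": path,
        "time": datetime.fromtimestamp(os.path.getmtime(path)),  # stand-in for capture time
        "location": read_gps(path),
    }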
And a substep 12 of determining similar attribute information among the plurality of attribute information and using the multimedia data corresponding to the similar attribute information as one multimedia data group.
When any two pieces of attribute information are detected to be similar, they can be determined to be similar attribute information, and the multimedia data carrying the similar attribute information can then be placed in the same multimedia data group; for example, images shot at the same time and in the same place can be placed in one group.
In an embodiment of the present invention, the substep 12 may comprise the substeps of:
when the attribute information includes time information, determining attribute information whose time difference corresponding to the time information is within a preset time range as similar attribute information; and/or, when the attribute information includes position information, determining attribute information whose position difference corresponding to the position information is within a preset position range as similar attribute information; and/or, when the attribute information includes feature information, determining attribute information whose feature information includes the same local feature as similar attribute information.
In an embodiment, the attribute information may include time information, and the time difference corresponding to the time information of any two pieces of multimedia data may be calculated; when the time difference is within a preset time range, such as 10 seconds or 1 minute, the attribute information corresponding to the two pieces of time information may be determined to be similar attribute information.
In another embodiment, the attribute information may include location information, and the location difference corresponding to the location information of any two pieces of multimedia data may be determined through networking; when the location difference is within a preset location range, for example 10 km, the attribute information corresponding to the two pieces of location information may be determined to be similar attribute information.
In another embodiment, the attribute information may include feature information, and feature analysis may be performed on any two pieces of multimedia data; when the feature information of the two pieces of multimedia data includes the same local feature, such as the same facial feature, or the same scene, animal, plant, or season, the attribute information corresponding to the two pieces of feature information may be determined to be similar attribute information.
Of course, the attribute information may also include other information, such as color information of an image, and then a color difference corresponding to the color information of any two multimedia data may be determined, and when the color difference is within a preset color difference range, then the attribute information corresponding to the two color information may be determined to be similar attribute information.
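As an illustrative sketch of substep 12 (the 60-second window is an assumed value, not one taken from the patent), multimedia data can be sorted by its time information and split into a new group whenever the gap to the previous item exceeds the preset time range; a position or feature check could be added at the same point.

def group_by_time(items, max_gap_seconds=60):
    # items: attribute dictionaries produced by get_attribute_info().
    # Returns a list of multimedia data groups (each group is a list of items).
    groups = []
    for item in sorted(items, key=lambda i: i["time"]):
        gap_ok = groups and (item["time"] - groups[-1][-1]["time"]).total_seconds() <= max_gap_seconds
        if gap_ok:
            groups[-1].append(item)   # similar attribute information: same group
        else:
            groups.append([item])     # start a new multimedia data group
    return groups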
Step 102, merging each multimedia data group separately to obtain one or more merged packets;
and the size of the merged packet is smaller than that of the multimedia data group corresponding to the merged packet.
After the multimedia data groups are determined, the multimedia data in each multimedia data group can be merged to obtain one or more merged packets.
In an example, the merging may include compression, such as zip compression. Of course, other manners of merging may also be adopted; for example, because the multimedia data in the same multimedia data group are similar, a difference operation may be performed: the portions that the multimedia data in the group have in common are extracted and stored once, while the differing portions of each piece of multimedia data are stored separately.
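A minimal sketch of step 102 using the zip compression named above (file layout and naming are assumptions): each multimedia data group is written into one compressed packet. Note that already-compressed formats such as JPEG shrink little under deflate, which is one reason the difference-based merging described above may be preferable for images.

import os
import zipfile

def merge_group(group, packet_path):
    # Step 102: merge one multimedia data group into a single merged packet.
    with zipfile.ZipFile(packet_path, "w", compression=zipfile.ZIP_DEFLATED) as zf:
        for item in group:
            zf.write(item["path"], arcname=os.path.basename(item["path"]))
    return packet_path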
In an embodiment of the present invention, before step 102, the user may preview the multimedia data groups to check whether the grouping is correct, may enter an editing mode by long-pressing a certain piece of multimedia data, and may edit the multimedia data group in the editing mode, which may include the following steps:
receiving a second operation for the target multimedia data in the target multimedia data group; setting the target multimedia data as a cover image of the target multimedia data group in response to the second operation in a case where the second operation is a first target operation; in the case that the second operation is a second target operation, deleting the target multimedia data from the target multimedia data group in response to the second operation; and under the condition that the second operation is a third target operation, moving the target multimedia data from the target multimedia data group to another multimedia data group in response to the second operation.
In one embodiment, each multimedia data group may generate a cover image: when the multimedia data is an image, the cover image may be any image in the multimedia data group, and when the multimedia data is a video, the cover image may be a preview image of any video in the multimedia data group.
Specifically, the second operation may be the first target operation; for example, the user may press the target multimedia data in the target multimedia data group and then move it to the cover area, such as area 201 in fig. 2, so that the target multimedia data replaces the original cover image and becomes the new cover image.
Of course, the user can also slide the target multimedia data in the target multimedia data group to the back/front to adjust the display position of the target multimedia data in the target multimedia data group.
In another embodiment, the second operation may be a second target operation; for example, the user may delete the target multimedia data from the target multimedia data group by long-pressing the target multimedia data in the target multimedia data group and then moving it to a deletion area, such as area 301 in fig. 3.
In another embodiment, as shown in fig. 4, the second operation may be a third target operation, such as the user pressing the target multimedia data in the target multimedia data group and then moving it out of the target multimedia data group and into another multimedia data group.
In an example, as shown in fig. 5, the user may tick (select) one or more multimedia data groups, and may then compress or delete the selected multimedia data groups.
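The three target operations above can be summarised in a small dispatcher; this is a hypothetical sketch, not the patent's interface, and it assumes a group is a plain list whose first element serves as the cover image.

def handle_second_operation(operation, item, group, other_group=None):
    if operation == "set_cover":          # first target operation: new cover image
        group.remove(item)
        group.insert(0, item)
    elif operation == "delete":           # second target operation: remove from the group
        group.remove(item)
    elif operation == "move" and other_group is not None:   # third target operation
        group.remove(item)
        other_group.append(item)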
Step 103, when a first operation of a user for a target merged packet is detected, decomposing the target merged packet, and displaying the decomposed multimedia data group of the target merged packet.
Wherein merging may include compression operations and decomposing may include decompression operations.
After merging, each multimedia data group is represented only by its cover image. When the multimedia data in a merged packet needs to be viewed, the user can select the target merged packet, the target merged packet is then decomposed, and the multimedia data group in the target merged packet is displayed.
In an example, to reduce the decompression duration, a number threshold and a size threshold may be set, such that the number of multimedia data in each multimedia data group does not exceed the number threshold and the size of each merged packet does not exceed the size threshold.
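A sketch of step 103 under the same assumptions: the count limit of 200 items per group is an assumed value for the number threshold mentioned above, and decomposition is plain zip extraction.

import zipfile

def enforce_count_threshold(group, max_items=200):
    # Split an over-large multimedia data group so that no merged packet takes long to decompress.
    return [group[i:i + max_items] for i in range(0, len(group), max_items)]

def decompose_packet(packet_path, show_dir):
    # Step 103: decompose (decompress) the target merged packet for display.
    with zipfile.ZipFile(packet_path) as zf:
        zf.extractall(show_dir)
        return zf.namelist()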
In an embodiment of the present invention, after the step of decomposing the target merged packet, the method may further include the following steps:
and reserving the target merged packet.
In a specific implementation, the target merged packet may be stored in a folder invisible to the user; after the target merged packet is decomposed, the original file of the target merged packet may be retained, and only the target multimedia data group decompressed from the target merged packet is displayed in a designated location, such as an album.
Correspondingly, after the step of presenting the decomposed multimedia data group of the target merged packet, the following steps may be included:
and deleting the multimedia data group decomposed by the target merging packet.
After the display is finished, the decomposed multimedia data group can be deleted from the designated position to clear the cache and save the storage space.
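Continuing the same hypothetical sketch, the merged packet can be kept in a folder that is not shown to the user while the decomposed copies are written to the album for viewing and removed afterwards; the directory names here are assumptions.

import os

def view_and_clean(packet_path, album_dir):
    # Decompose the retained merged packet into the album (the designated location) ...
    shown = decompose_packet(packet_path, album_dir)
    # ... and, once viewing is finished (simplified here to an immediate cleanup),
    # delete the decomposed multimedia data group; the merged packet itself is kept.
    for name in shown:
        extracted = os.path.join(album_dir, name)
        if os.path.isfile(extracted):
            os.remove(extracted)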
In the embodiments of the invention, a plurality of multimedia data stored in the mobile terminal are detected and grouped to obtain one or more multimedia data groups, each multimedia data group is then merged separately to obtain one or more merged packets, and, when a first operation of the user on a target merged packet is detected, the target merged packet is decomposed and the multimedia data group obtained by decomposing it is displayed. This gives the user control over the multimedia data in the mobile terminal, reduces the storage space the multimedia data occupies, and allows the multimedia data the user is looking for to be located quickly.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the illustrated order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Referring to fig. 6, a block diagram of a multimedia data processing apparatus according to an embodiment of the present invention is shown, and the apparatus applied to a mobile terminal may specifically include the following modules:
a grouping module 601, configured to detect multiple multimedia data stored in a mobile terminal, and group the multiple multimedia data to obtain one or more multimedia data groups;
a merging module 602, configured to merge each multimedia data group to obtain one or more merged packets; the size of the merged packet is smaller than that of the multimedia data group corresponding to the merged packet;
a displaying module 603, configured to decompose the target merged packet and display the decomposed multimedia data set of the target merged packet when a first operation of the user on the target merged packet is detected.
In an embodiment of the present invention, the grouping module 601 includes:
the attribute information determining submodule is used for respectively determining attribute information corresponding to the plurality of multimedia data;
and the multimedia data group forming submodule is used for determining similar attribute information among the attribute information and using the multimedia data corresponding to the similar attribute information as a multimedia data group.
In an embodiment of the present invention, the attribute information determining sub-module includes:
the first similar attribute information determining unit is used for, when the attribute information comprises time information, determining attribute information of which the time difference corresponding to the time information is within a preset time range as similar attribute information;
the second similar attribute information determining unit is used for, when the attribute information comprises position information, determining attribute information of which the position difference corresponding to the position information is within a preset position range as similar attribute information;
and the third similar attribute information determining unit is used for, when the attribute information comprises feature information, determining attribute information of which the feature information contains the same local feature as similar attribute information.
In an embodiment of the present invention, the apparatus further includes:
a second operation receiving module, configured to receive a second operation for the target multimedia data in the target multimedia data group;
a setting sub-module for setting the target multimedia data as a cover image of the target multimedia data group in response to the second operation in a case where the second operation is a first target operation;
a deletion submodule, configured to, in a case where the second operation is a second target operation, delete the target multimedia data from the target multimedia data group in response to the second operation;
and the moving submodule is used for responding to the second operation and moving the target multimedia data from the target multimedia data group to other multimedia data groups under the condition that the second operation is a third target operation.
In one embodiment of the present invention, the apparatus further includes:
the target merged packet retaining module is used for retaining the target merged packet;
and the multimedia data group deleting module is used for deleting the multimedia data group after the target merging packet is decomposed.
In an embodiment of the invention, the merging comprises a compression operation and the decomposing comprises a decompression operation.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
The apparatus/mobile terminal provided in the embodiment of the present invention can implement each process implemented by the mobile terminal in the method embodiment of fig. 1, and is not described herein again to avoid repetition.
Referring to fig. 7, a hardware structure diagram of a mobile terminal for implementing various embodiments of the present invention is shown.
The mobile terminal 700 includes, but is not limited to: a radio frequency unit 701, a network module 702, an audio output unit 703, an input unit 704, a sensor 705, a display unit 706, a user input unit 707, an interface unit 708, a memory 709, a processor 710, a power supply 711, and the like. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 7 is not intended to be limiting of mobile terminals, and that a mobile terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The processor 710 is configured to detect a plurality of multimedia data stored in the mobile terminal, and group the plurality of multimedia data to obtain one or more multimedia data groups; respectively merging each multimedia data group to obtain one or more merged packets; the size of the merged packet is smaller than that of the multimedia data group corresponding to the merged packet; when a first operation of a user for a target merging packet is detected, the target merging packet is decomposed, and a multimedia data group after decomposition of the target merging packet is displayed.
In the embodiments of the invention, a plurality of multimedia data stored in the mobile terminal are detected and grouped to obtain one or more multimedia data groups, each multimedia data group is then merged separately to obtain one or more merged packets, and, when a first operation of the user on a target merged packet is detected, the target merged packet is decomposed and the multimedia data group obtained by decomposing it is displayed. This gives the user control over the multimedia data in the mobile terminal, reduces the storage space the multimedia data occupies, and allows the multimedia data the user is looking for to be located quickly.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 701 may be used for receiving and sending signals during a message transmission and reception process or a call process; specifically, it receives downlink data from a base station and then forwards the downlink data to the processor 710 for processing, and it also transmits uplink data to the base station. In general, the radio frequency unit 701 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 701 may also communicate with a network and other devices through a wireless communication system.
The mobile terminal provides the user with wireless broadband internet access via the network module 702, such as helping the user send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 703 may convert audio data received by the radio frequency unit 701 or the network module 702 or stored in the memory 709 into an audio signal and output as sound. Also, the audio output unit 703 may also provide audio output related to a specific function performed by the mobile terminal 700 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 703 includes a speaker, a buzzer, a receiver, and the like.
The input unit 704 is used to receive audio or video signals. The input unit 704 may include a Graphics Processing Unit (GPU) 7041 and a microphone 7042, and the graphics processor 7041 processes image data of a still picture or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 706. The image frames processed by the graphics processor 7041 may be stored in the memory 709 (or other storage medium) or transmitted via the radio frequency unit 701 or the network module 702. The microphone 7042 may receive sounds and may be capable of processing such sounds into audio data. In the case of a phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 701.
The mobile terminal 700 also includes at least one sensor 705, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 7061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 7061 and/or a backlight when the mobile terminal 700 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the mobile terminal (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 705 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 706 is used to display information input by the user or information provided to the user. The Display unit 706 may include a Display panel 7061, and the Display panel 7061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 707 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 707 includes a touch panel 7071 and other input devices 7072. The touch panel 7071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 7071 (e.g., operations by a user on or near the touch panel 7071 using a finger, a stylus, or any other suitable object or attachment). The touch panel 7071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 710, receives a command from the processor 710, and executes the command. In addition, the touch panel 7071 can be implemented by various types such as resistive, capacitive, infrared, and surface acoustic wave. The user input unit 707 may include other input devices 7072 in addition to the touch panel 7071. In particular, the other input devices 7072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described herein again.
Further, the touch panel 7071 may be overlaid on the display panel 7061, and when the touch panel 7071 detects a touch operation on or near the touch panel 7071, the touch operation is transmitted to the processor 710 to determine the type of the touch event, and then the processor 710 provides a corresponding visual output on the display panel 7061 according to the type of the touch event. Although the touch panel 7071 and the display panel 7061 are shown in fig. 7 as two separate components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 7071 and the display panel 7061 may be integrated to implement the input and output functions of the mobile terminal, which is not limited herein.
The interface unit 708 is an interface through which an external device is connected to the mobile terminal 700. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 708 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 700 or may be used to transmit data between the mobile terminal 700 and external devices.
The memory 709 may be used to store software programs as well as various data. The memory 709 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 709 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 710 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 709 and calling data stored in the memory 709, thereby integrally monitoring the mobile terminal. Processor 710 may include one or more processing units; preferably, the processor 710 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 710.
The mobile terminal 700 may also include a power supply 711 (e.g., a battery) for powering the various components, and the power supply 711 may be logically coupled to the processor 710 via a power management system, so that functions such as managing charging, discharging, and power consumption are implemented through the power management system.
In addition, the mobile terminal 700 includes some functional modules that are not shown, and thus will not be described in detail herein.
Preferably, an embodiment of the present invention further provides a mobile terminal, including a processor 710, a memory 709, and a computer program stored in the memory 709 and capable of running on the processor 710, where the computer program, when executed by the processor 710, implements each process of the above-mentioned method for processing multimedia data, and can achieve the same technical effect, which is not described herein again to avoid repetition.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the method for processing multimedia data according to the embodiment of the present invention, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (10)

1. A method for processing multimedia data, which is applied to a mobile terminal, comprises:
detecting a plurality of multimedia data stored in a mobile terminal, and grouping the plurality of multimedia data to obtain one or more multimedia data groups;
respectively merging each multimedia data group to obtain one or more merged packets; the size of the merged packet is smaller than that of the multimedia data group corresponding to the merged packet;
when first operation of a user for a target merging packet is detected, decomposing the target merging packet, and displaying a multimedia data group of the decomposed target merging packet;
after the step of decomposing the target merged packet, the method comprises the following steps:
reserving the target merged packet;
after the step of presenting the decomposed multimedia data group of the target merged packet, the method comprises the following steps:
and deleting the multimedia data group decomposed by the target merging packet.
2. The method of claim 1, wherein the step of grouping the plurality of multimedia data into one or more multimedia data groups comprises:
respectively determining attribute information corresponding to the plurality of multimedia data;
and determining similar attribute information in the attribute information, and using the multimedia data corresponding to the similar attribute information as a multimedia data group.
3. The method of claim 2, wherein the step of determining similar attribute information of the plurality of attribute information comprises:
when the attribute information comprises time information, determining attribute information of which the time difference corresponding to the time information is within a preset time range as similar attribute information;
and/or when the attribute information comprises position information, determining the attribute information of which the position difference corresponding to the position information is within a preset position range as similar attribute information;
and/or when the attribute information comprises feature information, determining attribute information of which the feature information comprises the same local feature as similar attribute information.
4. The method according to claim 1 or 2, further comprising, before the step of combining each multimedia data group separately to obtain one or more combined packets:
receiving a second operation for the target multimedia data in the target multimedia data group;
setting the target multimedia data as a cover image of the target multimedia data set in response to the second operation in a case where the second operation is a first target operation;
in the case that the second operation is a second target operation, deleting the target multimedia data from the target multimedia data group in response to the second operation;
and under the condition that the second operation is a third target operation, moving the target multimedia data from the target multimedia data group to other multimedia data groups in response to the second operation.
5. The method of claim 1, wherein the merging comprises a compression operation and the decomposing comprises a decompression operation.
6. An apparatus for processing multimedia data, applied to a mobile terminal, comprising:
the grouping module is used for detecting a plurality of multimedia data stored in the mobile terminal and grouping the plurality of multimedia data to obtain one or more multimedia data groups;
the merging module is used for merging each multimedia data group respectively to obtain one or more merged packets; the size of the merged packet is smaller than that of the multimedia data group corresponding to the merged packet;
the display module is used for decomposing the target merged packet and displaying the decomposed multimedia data group of the target merged packet when detecting a first operation of a user for the target merged packet;
the target merged packet retaining module is used for retaining the target merged packet;
and the multimedia data group deleting module is used for deleting the multimedia data group after the target merging packet is decomposed.
7. The apparatus of claim 6, wherein the grouping module comprises:
the attribute information determining submodule is used for respectively determining attribute information corresponding to the plurality of multimedia data;
and the multimedia data group forming submodule is used for determining similar attribute information among the attribute information and using the multimedia data corresponding to the similar attribute information as a multimedia data group.
8. The apparatus of claim 7, wherein the attribute information determination sub-module comprises:
the first similar attribute information determining unit is used for, when the attribute information comprises time information, determining attribute information of which the time difference corresponding to the time information is within a preset time range as similar attribute information;
the second similar attribute information determining unit is used for, when the attribute information comprises position information, determining attribute information of which the position difference corresponding to the position information is within a preset position range as similar attribute information;
and the third similar attribute information determining unit is used for, when the attribute information comprises feature information, determining attribute information of which the feature information contains the same local feature as similar attribute information.
9. The apparatus of claim 6 or 7, further comprising:
a second operation receiving module, configured to receive a second operation for the target multimedia data in the target multimedia data group;
a setting sub-module for setting the target multimedia data as a cover image of the target multimedia data group in response to the second operation in a case where the second operation is a first target operation;
a deletion submodule, configured to, in a case where the second operation is a second target operation, delete the target multimedia data from the target multimedia data group in response to the second operation;
and the moving submodule is used for responding to the second operation and moving the target multimedia data from the target multimedia data group to other multimedia data groups under the condition that the second operation is a third target operation.
10. The apparatus of claim 6, wherein the merging comprises a compression operation and the decomposing comprises a decompression operation.
CN201811603455.7A 2018-12-26 2018-12-26 Method and device for processing multimedia data Active CN109753572B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811603455.7A CN109753572B (en) 2018-12-26 2018-12-26 Method and device for processing multimedia data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811603455.7A CN109753572B (en) 2018-12-26 2018-12-26 Method and device for processing multimedia data

Publications (2)

Publication Number Publication Date
CN109753572A CN109753572A (en) 2019-05-14
CN109753572B true CN109753572B (en) 2021-03-23

Family

ID=66403174

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811603455.7A Active CN109753572B (en) 2018-12-26 2018-12-26 Method and device for processing multimedia data

Country Status (1)

Country Link
CN (1) CN109753572B (en)

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5471903B2 (en) * 2010-07-01 2014-04-16 富士通株式会社 Information processing apparatus, image transmission program, and image display method
CN105684035B (en) * 2013-09-16 2019-08-20 英特尔公司 It is grouped and compresses similar photo
CN105120164B (en) * 2015-08-28 2016-10-19 努比亚技术有限公司 The processing means of continuous photo and method
CN105224671A (en) * 2015-10-14 2016-01-06 Tcl移动通信科技(宁波)有限公司 A kind of photo storage method and system based on mobile terminal
CN105787014A (en) * 2016-02-24 2016-07-20 吴江市创源电子有限公司 Method for expanding storage space of mobile terminal
CN107734336B (en) * 2016-08-10 2019-12-20 杭州海康威视数字技术股份有限公司 Compression method and device for video storage space
CN106843767A (en) * 2017-01-25 2017-06-13 上海摩软通讯技术有限公司 The memory space method for cleaning and mobile terminal of a kind of terminal
CN107087184B (en) * 2017-04-28 2020-05-22 华南理工大学 Multimedia data recompression method
CN108900696B (en) * 2018-05-29 2021-03-26 努比亚技术有限公司 Data processing method, terminal and computer readable storage medium

Also Published As

Publication number Publication date
CN109753572A (en) 2019-05-14


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant