CN115002537B - Video sharing method, electronic device, storage medium and program product - Google Patents
- Publication number: CN115002537B (application CN202111592731.6A)
- Authority: CN (China)
- Prior art keywords: video, storage path, target, transcoding, file
- Legal status: Active (assumed status; not a legal conclusion)
Classifications
- H04N21/432—Content retrieval operation from a local storage medium, e.g. hard-disk
- H04N21/4335—Housekeeping operations, e.g. prioritizing content for deletion because of storage space restrictions
- H04N21/440218—Reformatting operations of video signals for household redistribution, storage or real-time display by transcoding between formats or standards, e.g. from MPEG-2 to MPEG-4
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
Abstract
The embodiments of the present application provide a video sharing method, an electronic device, a storage medium, and a program product, relating to the field of terminal technologies. The method includes: receiving a sharing operation for a target video, where the target video is obtained by renaming an original video; in response to the sharing operation and based on a first correspondence recorded by the electronic device, looking up, according to the video storage path of the target video, a first storage path of a target transcoding file corresponding to the target video, where the first storage path is the storage path of the transcoding file corresponding to the original video; accessing the first storage path and determining whether the target transcoding file exists; and if it is determined that the target transcoding file exists, sending the target transcoding file to the application program so that the target transcoding file is shared. By applying the embodiments of the present application, the waste of computing resources of the electronic device caused by repeated video transcoding can be prevented.
Description
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a video sharing method, an electronic device, a storage medium, and a program product.
Background
In the prior art, when an application program needs to share a video, the Android system determines whether the application program supports opening videos in HEVC (High Efficiency Video Coding) format. If it determines that the application program does not support opening HEVC videos, the system transcodes the video into a transcoded file in AVC format, which most application programs can open, and then sends the transcoded file to the application program, so that the application program can open and share the AVC transcoded file.
If an HEVC video is renamed, the renamed video has a different name from the original video, so when an application program needs to share the renamed video, the Android system, after judging that the application program does not support HEVC videos, first transcodes the renamed video and then sends the transcoded file to the application program. In practice, however, the renamed video and the original video differ only in name; their content is identical, and so is the content of the transcoded files obtained from them. Yet to let the application program share the renamed video, the Android system has to transcode the video again. This causes repeated transcoding of the video, which wastes the computing resources of the electronic device.
Disclosure of Invention
In view of the foregoing, the present application provides a video sharing method, an electronic device, a storage medium, and a program product, so as to prevent the waste of the electronic device's computing resources caused by repeated video transcoding.
In a first aspect, an embodiment of the present application provides a video sharing method, which is applied to an electronic device, and the method includes:
receiving a sharing operation for a target video, where the sharing operation is used to indicate that the target video is to be shared through a designated application program, and the target video is obtained by renaming an original video;
in response to the sharing operation and based on a first correspondence recorded by the electronic device, looking up, according to the video storage path of the target video, a first storage path of a target transcoding file corresponding to the target video, where the first correspondence is the correspondence between the video storage path of a video and the storage path of the transcoding file corresponding to that video, and the first storage path is the storage path of the transcoding file corresponding to the original video;
accessing the first storage path, and determining whether the target transcoding file exists;
and upon determining that the target transcoding file exists, sending the target transcoding file to the application program so that the target transcoding file is shared.
In one embodiment of the present application, the method further comprises:
after renaming the first video, obtaining a storage path of a transcoding file corresponding to the first video;
and recording the corresponding relation between the video storage path of the renamed video and the obtained storage path as a first corresponding relation.
In one embodiment of the present application, the target video is:
a video obtained by directly renaming the original video;
or
a copy video obtained by copying the original video and renaming the copied video.
In an embodiment of the present application, in response to the sharing operation and based on the first correspondence recorded by the electronic device, looking up, according to the video storage path of the target video, the first storage path of the target transcoding file corresponding to the target video includes:
in response to the sharing operation, and in a case where it is determined that the target video needs to be transcoded, looking up the first storage path of the target transcoding file corresponding to the target video based on the first correspondence recorded by the electronic device.
In one embodiment of the present application, a second correspondence between the storage path of a transcoding file and the reference number of the transcoding file is recorded in the electronic device, where the reference number represents the number of pre-transcoding videos corresponding to the transcoding file, and the method further comprises:
receiving a processing operation for a second video;
if the processing operation indicates to copy the second video, after the second video is copied and renamed to obtain a copy video, obtaining a second storage path of the transcoding file corresponding to the second video, and adding 1 to the reference number that has a second correspondence with the second storage path;
and if the processing operation indicates to delete the second video, after the second video is deleted, obtaining the second storage path of the transcoding file corresponding to the second video, and subtracting 1 from the reference number that has a second correspondence with the second storage path.
In one embodiment of the present application, after obtaining the second storage path of the transcoding file corresponding to the second video and subtracting 1 from the reference number corresponding to the second storage path, the method further includes:
if the adjusted reference number is 0, deleting, based on the second correspondence, the transcoding file stored at the storage path corresponding to that reference number.
In an embodiment of the present application, the obtaining the second storage path of the transcoded file corresponding to the second video includes:
and searching a storage path of the transcoding file corresponding to the video storage path of the second video based on the recorded first corresponding relation, and taking the storage path as a second storage path.
In one embodiment of the present application, the second correspondence is stored in a vector array, where each item of data in the vector array is a format of a key-value pair, a key is a storage path of a transcoded file, and a value is a reference number of the transcoded file.
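As a non-authoritative illustration of the second correspondence and of the copy/delete handling described above, the sketch below uses a plain Java HashMap in place of the vector array of key-value pairs; the class name TranscodeReferenceTable and the method names are assumptions made for this example, not part of the claimed implementation.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of the second correspondence:
// storage path of a transcoding file -> number of pre-transcoding videos that reference it.
public class TranscodeReferenceTable {

    private final Map<Path, Integer> referenceNumbers = new HashMap<>();

    /** A video that maps to this transcoding file was copied and renamed: add 1. */
    public void onVideoCopied(Path transcodeStoragePath) {
        referenceNumbers.merge(transcodeStoragePath, 1, Integer::sum);
    }

    /** A video that maps to this transcoding file was deleted: subtract 1, clean up at 0. */
    public void onVideoDeleted(Path transcodeStoragePath) throws IOException {
        Integer count = referenceNumbers.get(transcodeStoragePath);
        if (count == null) {
            return; // nothing recorded for this transcoding file
        }
        if (count <= 1) {
            // Adjusted reference number is 0: the transcoding file is no longer needed.
            referenceNumbers.remove(transcodeStoragePath);
            Files.deleteIfExists(transcodeStoragePath);
        } else {
            referenceNumbers.put(transcodeStoragePath, count - 1);
        }
    }
}
```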
In a second aspect, embodiments of the present application provide an electronic device comprising a memory for storing computer program instructions and a processor for executing the program instructions, wherein the computer program instructions, when executed by the processor, trigger the electronic device to perform the steps of any of the first aspects.
In a third aspect, an embodiment of the present application provides a computer readable storage medium, where the computer readable storage medium includes a stored program, where when the program runs, the program controls a device in which the computer readable storage medium is located to execute the method of any one of the first aspects.
In a fourth aspect, embodiments of the present application provide a computer program product comprising executable instructions which, when executed on a computer, cause the computer to perform the method of any one of the first aspects.
By adopting the technical solution provided by the embodiments of the present application, after the sharing operation for the target video is received, if it is determined that the target video needs to be transcoded, the first storage path of the target transcoding file corresponding to the target video can be looked up from the first correspondence. The target transcoding file is the file obtained by transcoding the original video, and because the target video is obtained by renaming the original video, the target transcoding file can also serve as the transcoding file of the target video. Whether the target transcoding file exists is then determined based on the first storage path; if it exists, the target transcoding file is sent directly to the application program, so that the application program can share it. Therefore, when the application program needs to share a target video obtained by renaming, if the target transcoding file of the original video already exists on the electronic device, the application program can directly share the target transcoding file without the target video being transcoded again, which prevents the waste of the electronic device's computing resources caused by repeated video transcoding.
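Purely for illustration, the minimal sketch below models the flow summarized above in plain Java; the class name VideoShareSketch, the in-memory map standing in for the recorded first correspondence, and the helper names are assumptions for this example and are not part of the Android framework or of the claimed implementation.

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: reuse the original video's transcoding file for a renamed target video.
public class VideoShareSketch {

    // First correspondence: video storage path -> storage path of the corresponding transcoding file.
    private final Map<Path, Path> firstCorrespondence = new HashMap<>();

    /** Returns the file that is handed to the sharing application program. */
    public Path resolveFileToShare(Path targetVideoPath) {
        // Look up the first storage path recorded for the target video's storage path.
        Path firstStoragePath = firstCorrespondence.get(targetVideoPath);

        // Access the first storage path and determine whether the target transcoding file exists.
        if (firstStoragePath != null && Files.exists(firstStoragePath)) {
            // The transcoding file of the original video is reused; no repeated transcoding.
            return firstStoragePath;
        }
        // Otherwise fall back to transcoding the target video (placeholder, not shown here).
        return transcode(targetVideoPath);
    }

    private Path transcode(Path videoPath) {
        return videoPath; // stand-in for the real HEVC-to-AVC transcoding step
    }
}
```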
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic diagram of an electronic device according to an embodiment of the present application;
fig. 2 is a software structural block diagram of an electronic device according to an embodiment of the present application;
fig. 3A is a schematic diagram of a first video sharing application scenario provided in an embodiment of the present application;
fig. 3B is a schematic diagram of a second video sharing application scenario provided in an embodiment of the present application;
fig. 4 is a schematic diagram of a third video sharing application scenario provided in an embodiment of the present application;
fig. 5 is a flow chart of a video sharing method according to an embodiment of the present application;
fig. 6 is a flowchart of a first method for managing a transcoded file according to an embodiment of the present application;
fig. 7A is a schematic diagram of a replicated video application scenario provided in an embodiment of the present application;
fig. 7B is a schematic diagram of a deleted video application scenario provided in an embodiment of the present application
Fig. 8 is a flowchart of a second method for managing a transcoded file according to an embodiment of the present application;
fig. 9 is a call timing diagram of internal software of an electronic device according to an embodiment of the present application.
Detailed Description
For a better understanding of the technical solutions of the present application, embodiments of the present application are described in detail below with reference to the accompanying drawings.
In order to clearly describe the technical solutions of the embodiments of the present application, the words "first", "second", etc. are used in the embodiments of the present application to distinguish identical or similar items having substantially the same function and effect. For example, the first instruction and the second instruction are only used to distinguish different user instructions, and the order of the instructions is not limited. It will be appreciated by those skilled in the art that the words "first", "second", and the like do not limit the number or the order of execution, nor do they imply that the items so modified are necessarily different.
In this application, the terms "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
The embodiment of the application can be applied to electronic devices such as tablet computers, personal computers (personal computer, PCs), personal digital assistants (personal digital assistant, PDAs), smart watches, netbooks, wearable electronic devices, augmented reality (augmented reality, AR) devices, virtual Reality (VR) devices, vehicle-mounted devices, smart automobiles, robots, smart glasses, smart televisions and the like.
It should be noted that, in some possible implementations, the electronic device may also be referred to as a terminal device, a User Equipment (UE), or the like, which is not limited by the embodiments of the present application.
As shown in fig. 1, fig. 1 is a schematic diagram of an electronic device provided in an embodiment of the present application, where the electronic device shown in fig. 1 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (Universal Serial Bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (Subscriber Identity Module, SIM) card interface 195. Among them, the sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, and a bone conduction sensor 180M, etc.
It should be understood that the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the electronic device. In other embodiments of the present application, the electronic device may include more or less components than illustrated, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (Application Processor, AP), a modem processor (modem), a graphics processor (Graphics Processing Unit, GPU), an image signal processor (Image Signal Processor, ISP), a controller, a video codec, a digital signal processor (Digital Signal Processor, DSP), a baseband processor, and/or a neural-Network Processor (NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The processor 110 may generate operation control signals according to the instruction operation code and the timing signals to complete instruction fetching and instruction execution control.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (Inter-Integrated Circuit, I2C) interface, an integrated circuit built-in audio (Inter-Integrated Circuit Sound, I2S) interface, a pulse code modulation (Pulse Code Modulation, PCM) interface, a universal asynchronous receiver Transmitter (Universal Asynchronous Receiver/Transmitter, UART) interface, a mobile industry processor interface (Mobile Industry Processor Interface, MIPI), a General-Purpose Input/Output (GPIO) interface, and a subscriber identity module (Subscriber Identity Module, SIM) interface.
The I2C interface is a bi-directional synchronous serial bus, comprising a serial data line (Serial Data Line, SDA) and a serial clock line (Serial Clock Line, SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc., respectively, through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to implement the touch function of the electronic device.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through the bluetooth headset.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. The audio module 170 may transmit the acquired downstream audio stream data and upstream audio stream data to an electronic device wirelessly connected to the electronic device through the wireless communication module 160.
In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through a UART interface, so as to implement a function of obtaining a downstream audio stream through a bluetooth-connected electronic device.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as a display 194, a camera 193, and the like. The MIPI interface includes camera serial interface (Camera Serial Interface, CSI), display serial interface (Display Serial Interface, DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the photographing functions of electronic device 100. The processor 110 and the display screen 194 communicate via a DSI interface to implement the display functionality of the electronic device.
It should be understood that the connection relationship between the modules illustrated in the embodiments of the present application is only illustrative, and does not limit the structure of the electronic device. In other embodiments of the present application, the electronic device may also use different interfacing manners in the foregoing embodiments, or a combination of multiple interfacing manners.
The wireless communication function of the electronic device may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G or the like for use on the first electronic device. In some embodiments, the transmission of call data between two electronic devices may be accomplished through the mobile communication module 150, for example, as a called party device, downstream audio stream data from the calling party device may be obtained, and upstream audio stream data may be transmitted to the calling party device.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (Wireless Local Area Networks, WLAN) (e.g., wireless fidelity (Wireless Fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (Global Navigation Satellite System, GNSS), frequency modulation (Frequency Modulation, FM), near field wireless communication technology (Near Field Communication, NFC), and infrared technology (IR) for application on electronic devices.
In some embodiments, the antenna 1 and the mobile communication module 150 of the electronic device are coupled, and the antenna 2 and the wireless communication module 160 are coupled, so that the electronic device can communicate with the network and other devices through wireless communication technology. In one embodiment of the present application, the electronic device may implement a local area network connection with another electronic device through the wireless communication module 160. Wireless communication techniques may include global system for mobile communications (Global System for Mobile Communications, GSM), general packet radio service (General Packet Radio Service, GPRS), code Division multiple access (Code Division Multiple Access, CDMA), wideband code Division multiple access (Wideband Code Division Multiple Access, WCDMA), time Division-synchronous code Division multiple access (Time-Division-Synchronous Code Division Multiple Access, TD-SCDMA), long term evolution (Long Term Evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (Global Positioning System, GPS), a global navigation satellite system (Global Navigation Satellite System, GLONASS), a Beidou satellite navigation system (Beidou Navigation Satellite System, BDS), a Quasi zenith satellite system (Quasi-Zenith Satellite System, QZSS), and/or a satellite based augmentation system (Satellite Based Augmentation System, SBAS), among others.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (Liquid Crystal Display, LCD), an Organic Light-Emitting Diode (OLED), an Active-matrix or Active-matrix Organic Light-Emitting Diode (AMOLED), a flexible Light-Emitting Diode (Flex Light-Emitting Diode), a MiniLED, microLED, micro-OLED, a quantum dot Light-Emitting Diode (Quantum dot Light Emitting Diode, QLED), or the like. In some embodiments, the electronic device may include 1 or N display screens 194, N being a positive integer greater than 1.
The external memory interface 120 may be used to connect external memory cards, such as Micro secure digital (Secure Digital Memory, SD) cards, to enable expansion of the memory capabilities of the electronic device. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. Files such as music, video, audio files, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an operating system, and application programs (such as a sound playing function, an image playing function, and a recording function) required for at least one function, etc. The storage data area may store data created during use of the electronic device (e.g., upstream audio data, downstream audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (Universal Flash Storage, UFS), and the like. The processor 110 performs various functional applications of the electronic device and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor 110.
The electronic device may implement a call conflict handling function, etc. through the audio module 170, speaker 170A, receiver 170B, microphone 170C, earphone interface 170D, application processor, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
A receiver 170B, also referred to as an "earpiece", is used to convert an audio electrical signal into a sound signal. When the electronic device answers a call or plays a voice message, the voice transmitted by the caller device can be heard through the receiver 170B.
The pressure sensor 180A is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. In some embodiments, the manual answer call function may be implemented when the user clicks an answer key on the display screen 194, and the manual hang-up call function may be implemented when the user clicks a hang-up key on the display screen 194.
The touch sensor 180K, also referred to as a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is for detecting a touch operation acting thereon or thereabout. The touch sensor 180K may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device at a different location than the display 194.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also correspond to different vibration feedback effects by touching different areas of the display screen 194. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The SIM card interface 195 is used to connect a SIM card. A SIM card can be inserted into or removed from the SIM card interface 195 to make contact with or be separated from the electronic device. The electronic device may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support Nano SIM cards, Micro SIM cards, and the like. Multiple cards may be inserted into the same SIM card interface 195 simultaneously, and the types of the cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards, as well as with external memory cards. The electronic device interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device employs an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device and cannot be separated from it.
The software system of the electronic device can adopt a layered structure, and the embodiment of the application uses the layered structure of an Android (Android) system as an example to illustrate the software structure of the electronic device.
Fig. 2 is a software structural block diagram of an electronic device according to an embodiment of the present application, as shown in fig. 2.
The layered architecture divides the software into several layers, each with distinct roles and branches. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, namely an application layer, a framework layer, a hardware abstraction layer and a hardware layer from top to bottom.
The Application layer (App) may include a series of Application packages. For example, the application layer may include gallery, camera, mail, calendar, talk, bluetooth, music, video, weChat, and SMS applications.
The camera can be used for recording videos and storing the recorded videos in the gallery.
The gallery has the functions of sharing, collecting and the like.
The sharing function refers to sharing videos in the gallery to other electronic devices through communication application programs in the electronic devices, and the communication application programs refer to application programs capable of performing information interaction with other electronic devices, such as mail, weChat and the like.
The collection function is used for users to collect videos in a gallery, and the collected videos can be stored in a single folder, such as a personal collection folder in the gallery, so that the user can conveniently view or share the videos.
The Framework layer (FWK) provides an application programming interface (application programming interface, API) and programming Framework for the application layer's applications, including some predefined functions.
The framework layer includes a content parser (ContentResolver), a user space daemon (FuseDaemon), a media provider (MediaProvider), a transcoding helper (TranscodeHelper), a media transcoding manager (MediaTranscodeManager), a video track transcoder (VideoTrackTranscoder), and a media codec (MediaCodec).
Wherein the content parser is used to obtain video files from the media provider that are needed by the application.
The user-space file system (Filesystem in Userspace, FUSE) is a file system implemented in user mode. The user space daemon is used to receive and process the access operations of other modules on FUSE.
The media provider is used to find video files required by the application.
The transcoding assistant is used for being called by the media provider to judge whether the video needs to be transcoded.
The media transcoding manager is used for calling the video track transcoder to transcode the video if the transcoding assistant determines that the video needs to be transcoded.
The video track transcoder is used to invoke the media codec to transcode the video.
The media codec is a class provided by Android for encoding and decoding audio and video; it implements encoding and decoding by accessing the underlying codecs and is part of the Android media framework.
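The following simplified sketch only illustrates, under assumed names, how a request travels through the framework-layer modules listed above; the interfaces and method signatures are invented for this illustration and do not reproduce the real Android classes.

```java
// Hypothetical model of the framework-layer call chain; names and signatures are illustrative only.
interface TranscodeHelperModel {
    /** Decides whether the calling application needs an AVC transcode of the requested HEVC video. */
    boolean needsTranscoding(String videoPath, String callingPackage);
}

interface MediaTranscodeManagerModel {
    /** Drives the video track transcoder, which in turn drives the media codec. */
    String transcodeToAvc(String videoPath);
}

class MediaProviderModel {
    private final TranscodeHelperModel transcodeHelper;
    private final MediaTranscodeManagerModel transcodeManager;

    MediaProviderModel(TranscodeHelperModel helper, MediaTranscodeManagerModel manager) {
        this.transcodeHelper = helper;
        this.transcodeManager = manager;
    }

    /** Entry point reached via the content parser when an application opens a video for sharing. */
    String openVideoForApp(String videoPath, String callingPackage) {
        if (transcodeHelper.needsTranscoding(videoPath, callingPackage)) {
            return transcodeManager.transcodeToAvc(videoPath); // hand back the transcoding file
        }
        return videoPath; // the application can open the HEVC video directly
    }
}
```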
A Hardware Abstraction Layer (HAL) is an interface layer located between the operating system kernel and the hardware circuitry, and its purpose is to abstract the hardware. It hides the hardware interface details of a specific platform and provides a virtual hardware platform for the operating system, so that the operating system is hardware-independent and can be ported to various platforms.
The HardWare layer (HardWare, HW) is the HardWare that is located at the lowest level of the operating system. In fig. 2, HW includes a camera 1, a camera 2, a camera 3, and the like. The cameras 1, 2, 3 may correspond to a plurality of cameras on the electronic device.
The embodiment of the application can be applied to the APP sharing the video scene stored in the electronic equipment. Specifically, the video stored in the electronic device may be obtained by shooting through a camera of the electronic device, or may be a video obtained by the electronic device through other modes.
Fig. 3A is a schematic view of a first video sharing application scenario provided in an embodiment of the present application. As shown in fig. 3A, in a scenario in which two electronic devices share a video through a communication APP (taking mobile phones as an example of the electronic devices), the user may select the photo button in the chat interface of the APP installed on the mobile phone 1, and the mobile phone then displays a gallery interface. The user selects the video to be shared in the gallery interface and clicks the send button, so that the selected video is uploaded to a third-party server, and the third-party server sends the video to the APP installed on the mobile phone 2. That is, the APP of the mobile phone 2 downloads the video from the third-party server; the video is displayed in the chat interface of the APP of the mobile phone 2, and the APP of the mobile phone 2 can play the video.
Fig. 3B is a schematic view of a second video sharing application scenario provided in an embodiment of the present application. As shown in fig. 3B, when the user shares a video from the gallery of the mobile phone, a video may be selected in the gallery interface of the mobile phone 1; the video marked in its upper-right corner in fig. 3B is the video selected by the user.
The user then selects the share button and jumps to the sharing interface, selects an APP (application) on the sharing interface, such as WeChat, chooses to send to a friend, and selects a friend from the friend list. A prompt box is then displayed asking the user to confirm whether to send the video to the selected friend. If the user clicks the confirm button, the video selected by the user is uploaded to the third-party server, the third-party server sends the video to the APP of the mobile phone 2, and the APP of the mobile phone 2 can then play the video.
The third party server is a server for supporting APP communication in different mobile phones, and the embodiment of the present application is not limited to the third party server.
It should be noted that the above-mentioned video sharing process is only some possible implementations illustrated in the embodiments of the present application, and the implementation of the present application is not limited to the specific implementation of the video sharing process.
Specifically, the electronic device may capture video in a default format and store the captured video in the gallery. For example, an electronic device running Android 12 may capture video in HEVC format and store it in the gallery, where HEVC is a video coding format under the H.265 standard.
As shown in fig. 4, when the APP in the mobile phone 1 shares an HEVC video in the gallery, for example when the user clicks the send button in fig. 3A or the confirm button in fig. 3B, the APP is triggered to share the HEVC video.
After receiving the operation of the APP to share the video in HEVC format, the media provider may invoke a transcoding assistant to determine whether the APP supports opening the video in HEVC format.
If so, the HEVC video can be returned directly to the APP; if not, a transcoding flow can be started, the HEVC video is transcoded into AVC format by the media codec, and the AVC video is returned to the APP.
After the APP acquires the video, it uploads the acquired video to a third-party server, and the third-party server sends the video to the APP in the mobile phone 2; that is, the APP of the mobile phone 2 downloads the video from the third-party server and plays it.
If the HEVC video is renamed, then because the renamed video's name differs from that of the original video, the media provider performs the transcoding operation again when the APP does not support HEVC videos, that is, it transcodes the renamed HEVC video into AVC format and returns the AVC video to the APP.
In the above process, because transcoding is an operation with extremely high computational overhead, transcoding a 60-second HEVC video into AVC format takes about 20 seconds on a Google smartphone (Pixel). After the user triggers sharing of a video, the user therefore has to wait a long time, and only after the electronic device finishes transcoding can the video sharing process be completed. If the video is renamed, the user likewise has to wait a long time when sharing the renamed video, and the sharing process of the renamed video can only be completed after the electronic device has transcoded it.
In order to solve the above problems, an embodiment of the present application provides a video sharing method, which may be applied to the electronic device shown in fig. 1. Fig. 5 is a schematic flow chart of a video sharing method according to an embodiment of the present invention, where the method includes the following steps S501 to S504.
S501: a sharing operation for the target video is received.
The content parser in the framework layer of the electronic device may receive the sharing operation, where the sharing operation is used to indicate that the target video is to be shared through the designated application program.
For example, the sharing operation of the application program on the target video may be triggered when the user clicks the send button in fig. 3A or the confirm button in fig. 3B, which is not limited in the embodiments of the present application. The user may trigger the sharing operation of the application program on the target video through a touch screen, gesture control, voice control, or the like.
The application program may be a third-party APP (application) installed on the electronic device other than the gallery, such as WeChat or a mail client, or any other APP with a video sharing function.
In addition, the target video is a video obtained by renaming an original video. The renaming comprises directly renaming the original video to obtain a target video, or copying the original video and renaming the copied video to obtain the target video.
Whether the target video is obtained by directly renaming the original video or by copying the original video and renaming the copied video, the difference between the target video and the original video is only that the video names are different, and the video contents of the target video and the original video are the same.
S502: and responding to the sharing operation, and searching a first storage path of a target transcoding file corresponding to the target video according to the video storage path of the target video based on the first corresponding relation recorded by the electronic equipment.
After receiving the sharing operation, the content parser in the electronic device forwards the sharing operation to the media provider, and the media provider invokes the transcoding assistant to determine whether the target video needs to be transcoded.
The first correspondence relationship is as follows: correspondence between a video storage path of a video and a storage path of a transcoded file corresponding to the video.
Specifically, the electronic device may include an attribute information database, where each line in the attribute information database records the attribute information of one video, and the attribute information includes the following information: the video number, the file name of the video, the video storage path, the storage path of the transcoding file corresponding to the video, and so on. Here, the correspondence between the video storage path in each line and the storage path of the transcoding file corresponding to that video is referred to as a first correspondence; therefore, each video has a first correspondence of its own, and the attribute information database contains a plurality of first correspondences.
Different videos have different video numbers. The video number may be customized, or it may be a line number in the attribute information database; for example, the number of a video may be the line number of the line in the attribute information database where that video's attribute information is recorded, in which case different videos naturally have different numbers.
In one embodiment of the present application, when a target video needs to be shared, the video storage path of the target video is known information. To find the first storage path, the line in the attribute information database that contains the video storage path of the target video is located, and the storage path of the transcoding file recorded in that line is used as the first storage path; that is, the first storage path is obtained from the video storage path of the target video according to the first correspondence.
Furthermore, the first storage path is the storage path of the transcoding file corresponding to the original video.
The format of the transcoding file of the original video is a first format that the application program can open; for example, the first format is the AVC format, a video coding format under the H.264 standard.
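Purely as an illustration of the attribute information database and of the first-correspondence lookup in step S502, the sketch below models one line of the database as a Java record; the field names, the class names, and the lookup method are assumptions for this example.

```java
import java.util.List;
import java.util.Optional;

// Hypothetical model of one line of the attribute information database.
record VideoAttributes(long videoNumber,
                       String fileName,
                       String videoStoragePath,
                       String transcodeStoragePath) { }

// Looks up the first storage path from the video storage path of the target video.
class AttributeDatabase {
    private final List<VideoAttributes> lines;

    AttributeDatabase(List<VideoAttributes> lines) {
        this.lines = lines;
    }

    Optional<String> findFirstStoragePath(String targetVideoStoragePath) {
        return lines.stream()
                .filter(line -> line.videoStoragePath().equals(targetVideoStoragePath))
                .map(VideoAttributes::transcodeStoragePath)
                .findFirst();
    }
}
```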
S503: and accessing the first storage path to determine whether the target transcoding file exists.
The target transcoding file may or may not actually be stored at the first storage path of the electronic device. Only if the target transcoding file actually exists can the application program directly open and share it, so whether the target transcoding file actually exists needs to be determined before it is shared.
In one embodiment of the present application, an iscodefile function may be called to determine whether the target transcoding file exists, where a parameter of the iscodefile function is the first storage path.
If the target transcoding file exists, step S504 is performed. If the target transcoding file does not exist, the media codec needs to be called to transcode the target video to obtain a transcoding file, and that transcoding file is sent to the application program so that the application program can share it; the procedure then exits and the process ends.
S504: and determining that the target transcoding file exists, sending the target transcoding file to the application program, and sharing the target transcoding file.
If the target transcoding file exists, it is indicated that the transcoding file corresponding to the target video is already stored in the system memory, and the media provider may obtain the target transcoding file from the system cache and send the target transcoding file to the application program, without transcoding the target video.
And sharing the target transcoding file after the application program receives the target transcoding file. For example, referring to fig. 4, after receiving the target transcoding file, the APP installed in the mobile phone 1 shares the target transcoding file with the APP installed in the mobile phone 2.
By adopting the technical solution provided by the embodiments of the present application, after the sharing operation for the target video is received, if it is determined that the target video needs to be transcoded, the first storage path of the target transcoding file corresponding to the video storage path of the target video can be looked up from the first correspondence; the target video is obtained by renaming the original video, and the first storage path is the storage path of the original video's transcoding file. The electronic device may then send the target transcoding file stored at the first storage path, i.e. the transcoding file of the original video, to the application program. Therefore, when the application program needs to share a target video obtained by renaming, it can share the target transcoding file without the target video being transcoded repeatedly, which prevents the waste of the electronic device's computing resources caused by repeated video transcoding.
In order to implement the embodiment shown in fig. 5, the first correspondence stored in the electronic device needs to be managed. A procedure for managing the first correspondence is described below through steps A and B.
Step A: after renaming the first video, obtaining the storage path of the transcoding file corresponding to the first video.
Specifically, the storage path of the transcoding file corresponding to the video storage path of the first video may be looked up from the first correspondences recorded in the electronic device, and the found storage path is used as the storage path of the transcoding file corresponding to the first video.
The process of searching the storage path of the transcoded file is similar to the process of searching the first storage path in step S502, and will not be described herein.
In addition, if the storage path of the transcoded file corresponding to the video storage path of the first video is not found from the recorded first correspondence in the electronic device, the storage path of the transcoded file corresponding to the first video may be generated.
In the embodiments of the present application, the transcoding files obtained by transcoding different videos are stored in the same folder. The storage path of a transcoding file consists of the storage path of the folder where the transcoding file is located and the file name of the transcoding file, so the storage paths of different transcoding files differ only in the file-name field.
Specifically, the file names of the transcoded files can be the video numbers of the videos before transcoding, and the problem of duplicate name conflict among the file names of the transcoded files obtained by transcoding can be avoided because the video numbers of different videos are different. The storage path of the transcoded file for the first video may be generated by combining the storage path of the folder storing the transcoded file with the video number for the first video.
Step B: recording the correspondence between the video storage path of the renamed video and the obtained storage path as a first correspondence.
Each time a video stored in the electronic device is renamed, the renamed video can be treated as a first video and steps A and B are executed, which ensures that the first correspondence for the video storage path of every renamed video is recorded in the electronic device.
In addition, if a pre-transcoding video stored in the electronic device is deleted, that video will no longer be shared, and the recorded first correspondence that contains the video storage path of the pre-transcoding video may be deleted from the electronic device.
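The sketch below is one possible, simplified way to keep the first correspondence up to date on rename and delete, as described in steps A and B; the in-memory map, the folder layout, and the class and method names are assumptions made for this illustration.

```java
import java.nio.file.Path;
import java.util.HashMap;
import java.util.Map;

// Hypothetical bookkeeping for the first correspondence (video storage path -> transcoding file path).
class FirstCorrespondenceBookkeeping {

    private final Map<Path, Path> firstCorrespondence = new HashMap<>();
    private final Path transcodeFolder; // single folder that holds all transcoding files

    FirstCorrespondenceBookkeeping(Path transcodeFolder) {
        this.transcodeFolder = transcodeFolder;
    }

    /** Steps A and B: executed each time a stored video is renamed. */
    void onVideoRenamed(Path oldVideoPath, Path newVideoPath, long videoNumber) {
        // Step A: reuse the recorded transcoding file path if there is one,
        // otherwise generate it from the transcode folder and the unique video number.
        Path transcodePath = firstCorrespondence.get(oldVideoPath);
        if (transcodePath == null) {
            transcodePath = transcodeFolder.resolve(Long.toString(videoNumber));
        }
        // Step B: record the correspondence for the renamed video's storage path.
        firstCorrespondence.put(newVideoPath, transcodePath);
    }

    /** A pre-transcoding video was deleted: its first correspondence is no longer needed. */
    void onVideoDeleted(Path videoPath) {
        firstCorrespondence.remove(videoPath);
    }
}
```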
In yet another embodiment of the present invention, the above step S502 may be implemented by the following step C.
Step C: in response to the sharing operation, when it is determined that the target video needs to be transcoded, search for the first storage path of the target transcoded file corresponding to the target video based on the first correspondence recorded by the electronic device.
Compared with the foregoing step S502, while responding to the sharing operation the media provider may first determine whether the target video needs to be transcoded before searching for the first storage path. If transcoding is needed, the first storage path is searched and the target transcoded file is returned to the application program; if not, the target video is returned to the application program directly, the subsequent steps are skipped, and the procedure ends.
Specifically, when determining whether the target video needs to be transcoded, it may be determined whether the application program supports opening a video in a second format, where the second format is the format of the target video; for example, the second format may be the HEVC format.
If not, it is determined that the application program cannot directly open and share the target video, and the target video needs to be transcoded. If so, the application program can directly open and share the target video, and transcoding is not needed.
In addition to determining whether the application program supports opening files in the second format, it may also be determined whether the transcoding switch of the media codec is turned on. If the switch is on, the media codec may transcode the target video when transcoding is needed; if the switch is off, the media codec does not transcode the target video.
It may also be determined whether the target video is stored in a preset folder, since the media codec only transcodes videos stored in the preset folder. For example, the preset folder may be the default folder in the electronic device for storing videos shot by the device itself. If the target video is stored in the preset folder, the media codec may transcode it when transcoding is needed; if the target video is not stored in the preset folder, the media codec does not transcode it.
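The three checks just described (format support, transcoding switch, preset folder) could be combined as in the following sketch; the class name, parameter names, and the way the supported formats are represented are assumptions made for illustration only.

```java
import java.util.Set;

// Illustrative sketch of the "does the target video need transcoding" decision.
public class TranscodeDecision {

    public static boolean needsTranscoding(String videoFormat,        // e.g. "HEVC"
                                           Set<String> appFormats,    // formats the app can open
                                           boolean transcodeSwitchOn, // media codec switch state
                                           String videoPath,
                                           String presetFolder) {     // e.g. "DCIM/Camera/"
        // The media codec only acts when its transcoding switch is enabled
        // and the video lives in the preset folder.
        if (!transcodeSwitchOn || !videoPath.startsWith(presetFolder)) {
            return false;
        }
        // If the app cannot open the video's format (the "second format"),
        // the video has to be transcoded before sharing.
        return !appFormats.contains(videoFormat);
    }
}
```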
When video sharing is performed using the embodiment shown in fig. 5, multiple different videos may correspond to the same transcoded file. To manage the transcoded files, a second correspondence between the storage path of a transcoded file and the reference number of that transcoded file is recorded in the electronic device, where the reference number indicates the number of pre-transcoding videos corresponding to the transcoded file. The number of pre-transcoding videos corresponding to each transcoded file may thus be determined based on the second correspondences.
Specifically, the second correspondences are stored in a vector array. Each item of data in the vector array is in key-value format: the key is the storage path of a transcoded file, and the value is the reference number of that transcoded file. The correspondence recorded by each item of data between a storage path and a reference number is referred to as a second correspondence; therefore, each transcoded file has its own second correspondence, and the vector array holds multiple second correspondences.
The vector array may be searched using the storage path of a transcoded file as the search condition to obtain the reference number of that transcoded file.
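A minimal sketch of the vector array of key-value entries described above is shown below; the class and field names are illustrative assumptions rather than the actual data structures of the embodiment.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch: each entry is one "second correspondence"
// (storage path of a transcoded file -> its reference number).
public class SecondCorrespondenceTable {

    static class Entry {
        final String transcodePath; // key: storage path of the transcoded file
        int referenceNumber;        // value: number of pre-transcoding videos using it

        Entry(String transcodePath, int referenceNumber) {
            this.transcodePath = transcodePath;
            this.referenceNumber = referenceNumber;
        }
    }

    private final List<Entry> entries = new ArrayList<>();

    // Search the vector array with the storage path as the search condition.
    public Entry find(String transcodePath) {
        for (Entry e : entries) {
            if (e.transcodePath.equals(transcodePath)) {
                return e;
            }
        }
        return null;
    }

    public void add(String transcodePath, int referenceNumber) {
        entries.add(new Entry(transcodePath, referenceNumber));
    }
}
```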
While the electronic device is in use, the user may copy, delete, or rename the pre-transcoding videos stored in it, which may cause the reference number of a transcoded file to change. To manage the reference numbers contained in the second correspondences, refer to fig. 6, which is a flowchart of a first method for managing transcoded files according to an embodiment of the present invention; the method includes the following steps S601-S603.
S601: a processing operation for a second video is received.
Specifically, the user may trigger the processing operation of the application program on the second video by means of a touch screen, gesture control, voice control, or the like.
S602: if the processing operation indicates copying the second video, then after the second video is copied and the copy is renamed to obtain a copy video, the second storage path of the transcoded file corresponding to the second video is obtained, and the reference number having a second correspondence with the second storage path is increased by 1.
Fig. 7A is a schematic diagram of an application scenario of copying a video provided in an embodiment of the present application.
As shown in the figure, taking a mobile phone as an example of the electronic device, when a user copies a video in the phone's gallery, a video can be selected in the gallery interface; the video marked in the upper right corner of fig. 7A is the video selected by the user. The user then selects the copy button, and the selected video is copied.
It should be noted that the above process of copying a video is only one possible implementation illustrated in the embodiments of the present application; the present application does not specifically limit how a video is copied.
Specifically, copying the second video and renaming the copy to obtain a new copy video is equivalent to newly generating one video that corresponds to the transcoded file of the second video, so the reference number of the transcoded file corresponding to the second video is increased by 1.
In one embodiment of the present application, the second storage path of the transcoded file corresponding to the second video may be obtained first, the second correspondence stored in the electronic device that contains the second storage path may be found, and the reference number contained in that second correspondence may then be increased by 1.
Specifically, the manner of obtaining the second storage path of the transcoded file corresponding to the second video may be referred to in step D below, which is not described in detail herein.
S603: if the processing operation indicates deleting the second video, then after the second video is deleted, the second storage path of the transcoded file corresponding to the second video is obtained, and the reference number having a second correspondence with the second storage path is decreased by 1.
Fig. 7B is a schematic diagram of an application scenario of deleting a video provided in an embodiment of the present application.
As shown in the figure, taking a mobile phone as an example of the electronic device, when a user deletes a video in the phone's gallery, a video can be selected in the gallery interface; the video marked in the upper right corner of fig. 7B is the video selected by the user. The user then selects the delete button, and the selected video is deleted.
It should be noted that the above process of deleting a video is only one possible implementation illustrated in the embodiments of the present application; the present application does not specifically limit how a video is deleted.
In one embodiment of the present application, the second storage path of the transcoded file corresponding to the second video may be obtained first, the second correspondence stored in the electronic device that contains the second storage path may be found, and the reference number contained in that second correspondence may then be decreased by 1.
Specifically, the manner of obtaining the second storage path of the transcoded file corresponding to the second video may be referred to in step D below, which is not described in detail herein.
In addition, if the second video is directly renamed, the pre-rename second video is removed and the renamed video is newly added; since both correspond to the same transcoded file, the reference number associated with the storage path of that transcoded file would be decreased by 1 and then increased by 1, leaving it unchanged. Therefore, if the second video is simply renamed, the reference number does not need to be adjusted.
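Continuing the sketch above (and assuming it lives in the same package as SecondCorrespondenceTable), the handling of S602 and S603 might look as follows; note that a plain rename leaves the reference number untouched, exactly as described. All names are illustrative assumptions.

```java
// Illustrative continuation of SecondCorrespondenceTable: adjust the
// reference number when a pre-transcoding video is copied or deleted.
public class ReferenceCountManager {
    private final SecondCorrespondenceTable table;

    public ReferenceCountManager(SecondCorrespondenceTable table) {
        this.table = table;
    }

    // S602: the second video was copied (and the copy renamed).
    public void onVideoCopied(String secondStoragePath) {
        SecondCorrespondenceTable.Entry e = table.find(secondStoragePath);
        if (e != null) {
            e.referenceNumber += 1;
        }
    }

    // S603: the second video was deleted.
    public void onVideoDeleted(String secondStoragePath) {
        SecondCorrespondenceTable.Entry e = table.find(secondStoragePath);
        if (e != null) {
            e.referenceNumber -= 1;
        }
    }

    // A direct rename removes one reference and adds one, so nothing changes.
    public void onVideoRenamed(String secondStoragePath) {
        // intentionally empty
    }
}
```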
In an embodiment of the present application, the process of obtaining the second storage path of the transcoded file corresponding to the second video may be implemented in the following step D.
Step D: search for the storage path of the transcoded file corresponding to the video storage path of the second video based on the recorded first correspondence, and use that storage path as the second storage path.
Specifically, the storage path of the transcoded file corresponding to the video storage path of the second video may be searched from the first correspondence recorded by the electronic device, and that storage path is used as the second storage path.
In another embodiment of the present application, if no first correspondence containing the video storage path of the second video is found, the second storage path may be generated by combining the storage path of the folder storing transcoded files with the video number of the second video.
Specifically, the process of obtaining the second storage path is similar to the process shown in the foregoing steps A to B, and will not be described herein again.
By adopting the technical solution provided by this embodiment of the application, the reference number of the transcoded file corresponding to a video is adjusted whenever the video is copied or deleted, so that the reference number recorded in each second correspondence stored in the electronic device remains consistent with the actual number of references to the transcoded file.
In addition, referring to fig. 8, which is a flowchart of a second method for managing transcoded files according to an embodiment of the present invention, compared with the embodiment shown in fig. 6 the method further includes the following step S604 after step S603.
S604: if the adjusted reference number is 0, delete the transcoded file stored at the storage path corresponding to that reference number based on the second correspondence.
If the adjusted reference number of a transcoded file is 0, it indicates that no video from which this transcoded file could be obtained by transcoding still exists in the electronic device; in theory the transcoded file will no longer be opened or shared, so the transcoded file stored at the storage path corresponding to the reference number can be deleted.
By adopting the technical solution provided by this embodiment of the application, the transcoded file stored at the storage path corresponding to a reference number that has dropped to 0 can be deleted, thereby releasing the storage space of the electronic device that the transcoded file occupied.
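A short sketch of step S604, extending the reference-count sketch above: when the adjusted reference number reaches 0, the transcoded file at the recorded path is deleted to free storage. The file deletion uses standard java.io; everything else remains an illustrative assumption.

```java
import java.io.File;

// Illustrative sketch of S604: release the transcoded file once no
// pre-transcoding video references it any more (same package as the
// SecondCorrespondenceTable sketch above).
public class TranscodeCleanup {

    public static void onReferenceDropped(SecondCorrespondenceTable table,
                                          String secondStoragePath) {
        SecondCorrespondenceTable.Entry e = table.find(secondStoragePath);
        if (e != null && e.referenceNumber == 0) {
            // Delete the transcoded file stored at the path recorded in the
            // second correspondence; the result is ignored in this sketch.
            new File(e.transcodePath).delete();
        }
    }
}
```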
As shown in fig. 9, fig. 9 is a timing diagram of the calls among the software modules inside the electronic device when the APP needs to share a video.
The APP in fig. 9 is located at the application layer of fig. 2, and the other modules in fig. 9 are located at the framework layer of fig. 2.
Referring to fig. 9, the method specifically includes the steps of:
S901: the APP sends a video acquisition request to the content parser.
When the APP receives the user's sharing operation on a video in the gallery, the APP first needs to obtain the video, so the APP may send the video acquisition request to the content parser through the openTypedAssetFile function.
S902: the content parser forwards the video acquisition request to the media provider.
The content parser may forward the video acquisition request to the media provider through the openTypedAssetFile function.
S903: the media provider looks up the video file.
S904: the media provider invokes a transcoding helper to determine whether the video needs to be transcoded.
The media provider may trigger a transcoding helper via a shouldTranscode function to determine if the video needs to be transcoded.
The transcoding helper may determine whether transcoding of the video is required through S905-S907.
S905: the transcoding helper determines whether the handset allows transcoding.
Specifically, the following two judgment conditions are used:
- sdklevel, i.e. the SDK version of the mobile phone meets the requirement for transcoding;
- "persistent sys. Fuse. Code_enable" and "code_enable" are set, i.e. the transcoding switch in the mobile phone is enabled.
If both conditions are met, it is determined that the mobile phone allows transcoding, and the following steps S906-S907 continue to judge whether the video needs to be transcoded; if either condition is not met, it is determined that the mobile phone does not allow transcoding, and it is further determined that the video does not need to be transcoded.
S906: the transcoding helper determines whether the video is in a storage path that supports transcoding.
The storage path that supports transcoding is "DCIM/camera/". If the video is stored under this path, it is determined that transcoding of the video is supported, and the subsequent step S907 continues to judge whether the video needs to be transcoded; if not, it is determined that transcoding of the video is not supported, and it is further determined that the video does not need to be transcoded.
S907: the transcoding helper determines whether the video needs to be transcoded.
The transcoding helper may obtain the application media capabilities (ApplicationMediaCapabilities) of the APP, from which the video formats supported by the APP are obtained. The transcoding helper may also obtain the encoding format of the video requested by the APP.
If the encoding format of the video is a format supported by the APP, the video does not need to be transcoded; if the encoding format of the video is not supported by the APP, the video needs to be transcoded.
Illustratively, if the video format supported by the APP and the coding format of the video are both HEVC or AVC, the video need not be transcoded.
If the video format supported by the APP is AVC and the video is encoded in HEVC, the video needs to be transcoded.
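As an illustration of S907, the comparison between the APP's supported formats and the video's encoding format might look like the sketch below. The MIME strings are the standard HEVC/AVC types; the class, method, and parameter names are assumptions for illustration.

```java
import java.util.Set;

// Illustrative sketch of S907: compare the APP's supported video MIME types
// with the encoding format of the requested video.
public class FormatCheck {
    static final String MIME_HEVC = "video/hevc";
    static final String MIME_AVC  = "video/avc";

    // Returns true when the video must be transcoded for this APP.
    static boolean shouldTranscode(Set<String> appSupportedMimeTypes, String videoMimeType) {
        return !appSupportedMimeTypes.contains(videoMimeType);
    }

    public static void main(String[] args) {
        // APP supports AVC only, video is encoded in HEVC: transcoding is needed.
        System.out.println(shouldTranscode(Set.of(MIME_AVC), MIME_HEVC));  // true
        // APP supports HEVC, video is encoded in HEVC: no transcoding needed.
        System.out.println(shouldTranscode(Set.of(MIME_HEVC), MIME_HEVC)); // false
    }
}
```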
S908: after the media provider determines that transcoding of the video is required, the video is opened from the user space file system.
The media provider may open the video from the user space file system via the openWithFuse function.
S909: the user-space daemon triggers the media provider to transcode the video.
The user-space daemon triggers the media provider to transcode the video in the user-space file system through the transformForFuse function.
S910: the media provider determines whether a transcoded file for the video exists.
In this embodiment, whether the transcoded file exists may be determined through steps S502 to S503 shown in the foregoing, which is not described herein.
If a transcoded file for the video exists, the media provider may send the transcoded file to the content parser, which sends the transcoded file to the APP, which may then share the video.
If the transcoded file of the video does not exist, S911 is performed.
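Before moving on to S911, here is a sketch of the existence check just described for S910, following the earlier S502-S503 description: look up the recorded path for the video and test whether a file is actually present there. The map layout and names are illustrative assumptions.

```java
import java.io.File;
import java.util.Map;

// Illustrative sketch of S910: decide between reusing an existing
// transcoded file and triggering a new transcode.
public class TranscodeLookup {

    // firstCorrespondence maps video storage path -> transcoded-file path.
    public static File findExistingTranscode(Map<String, String> firstCorrespondence,
                                             String videoStoragePath) {
        String transcodePath = firstCorrespondence.get(videoStoragePath);
        if (transcodePath == null) {
            return null; // no correspondence recorded, so S911 will transcode
        }
        File transcoded = new File(transcodePath);
        return transcoded.exists() ? transcoded : null;
    }
}
```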
S911: the media provider invokes the transcoding helper to transcode the video.
S912: the transcoding helper sends a queue request to the media transcoding manager to transcode the video.
S913: the media transcoding manager initiates (start) a video track transcoder to transcode the video.
In a specific implementation, this application further provides a computer storage medium. The computer storage medium may store a program, and when the program runs, it controls the device where the computer-readable storage medium is located to execute some or all of the steps in the foregoing embodiments. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.
In a specific implementation, the embodiment of the application further provides a computer program product, where the computer program product contains executable instructions, and when the executable instructions are executed on a computer, the executable instructions cause the computer to perform some or all of the steps in the above method embodiments.
Embodiments of the mechanisms disclosed herein may be implemented in hardware, software, firmware, or a combination of these implementations. Embodiments of the present application may be implemented as a computer program or program code that is executed on a programmable system including at least one processor, a storage system (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device.
Program code may be applied to input instructions to perform the functions described herein and generate output information. The output information may be applied to one or more output devices in a known manner. For purposes of this application, a processing system includes any system having a processor such as, for example, a digital signal processor (Digital Signal Processor, DSP), microcontroller, application specific integrated circuit (Application Specific Integrated Circuit, ASIC), or microprocessor.
The program code may be implemented in a high level procedural or object oriented programming language to communicate with a processing system. Program code may also be implemented in assembly or machine language, if desired. Indeed, the mechanisms described in the present application are not limited in scope to any particular programming language. In either case, the language may be a compiled or interpreted language.
In some cases, the disclosed embodiments may be implemented in hardware, firmware, software, or any combination thereof. The disclosed embodiments may also be implemented as instructions carried by or stored on one or more transitory or non-transitory machine-readable (e.g., computer-readable) storage media, which may be read and executed by one or more processors. For example, the instructions may be distributed over a network or through other computer-readable media. Thus, a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including, but not limited to, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), erasable programmable read-only memories (EPROMs), electrically erasable programmable read-only memories (EEPROMs), magnetic or optical cards, flash memory, or a tangible machine-readable memory used to transmit information over the Internet in an electrical, optical, acoustical, or other form of propagated signal (e.g., carrier waves, infrared signals, digital signals, etc.). Thus, a machine-readable medium includes any type of machine-readable medium suitable for storing or transmitting electronic instructions or information in a form readable by a machine (e.g., a computer).
In the drawings, some structural or methodological features may be shown in a particular arrangement and/or order. However, it should be understood that such a particular arrangement and/or ordering may not be required. Rather, in some embodiments, these features may be arranged in a different manner and/or order than shown in the drawings of the specification. Additionally, the inclusion of structural or methodological features in a particular figure is not meant to imply that such features are required in all embodiments, and in some embodiments, may not be included or may be combined with other features.
It should be noted that, in the embodiments of the present application, each unit/module is a logic unit/module, and in physical aspect, one logic unit/module may be one physical unit/module, or may be a part of one physical unit/module, or may be implemented by a combination of multiple physical units/modules, where the physical implementation manner of the logic unit/module itself is not the most important, and the combination of functions implemented by the logic unit/module is the key to solve the technical problem posed by the present application. Furthermore, to highlight the innovative part of the present application, the above-described device embodiments of the present application do not introduce units/modules that are less closely related to solving the technical problems presented by the present application, which does not indicate that the above-described device embodiments do not have other units/modules.
It should be noted that in the examples and descriptions of this patent, relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
While the present application has been shown and described with reference to certain preferred embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present application.
Claims (8)
1. A video sharing method, applied to an electronic device, comprising:
receiving a sharing operation for a target video, wherein the sharing operation is used for: indicating to share the target video through a designated application program, wherein the target video is obtained by renaming an original video, and the target video is: directly renaming the original video to obtain a video, or copying the original video, and renaming the copied video to obtain a copy video;
in response to the sharing operation, searching, based on a first correspondence recorded by the electronic device, for a first storage path of a target transcoding file corresponding to the target video according to a video storage path of the target video, wherein the first correspondence is: a correspondence between a video storage path of a video and a storage path of a transcoding file corresponding to the video, and the first storage path is: a storage path of the transcoding file corresponding to the original video;
accessing the first storage path, and determining whether the target transcoding file exists;
if it is determined that the target transcoding file exists, sending the target transcoding file to the application program, and sharing the target transcoding file;
receiving a processing operation for a second video;
if the processing operation indicates to copy the second video, after the second video is copied and renamed to obtain a copy video, obtaining a second storage path of a transcoding file corresponding to the second video, and adding 1 on the basis of a reference number having a second correspondence with the second storage path, where a second correspondence between the storage path of the transcoding file and the reference number of the transcoding file is recorded in the electronic device, and the reference number represents: the number of videos before transcoding corresponding to the transcoding file;
and if the processing operation indicates to delete the second video, after the second video is deleted, acquiring a second storage path of the transcoding file corresponding to the second video, and subtracting 1 on the basis of the reference number with a second corresponding relation with the second storage path.
2. The method according to claim 1, wherein the method further comprises:
after renaming the first video, obtaining a storage path of a transcoding file corresponding to the first video;
and recording the corresponding relation between the video storage path of the renamed video and the obtained storage path as a first corresponding relation.
3. The method of claim 1, wherein the responding to the sharing operation and searching, based on the first correspondence recorded by the electronic device, for the first storage path of the target transcoding file corresponding to the target video according to the video storage path of the target video comprises:
in response to the sharing operation, when it is determined that the target video needs to be transcoded, searching for the first storage path of the target transcoding file corresponding to the target video based on the first correspondence recorded by the electronic device.
4. The method of claim 1, wherein after the obtaining the second storage path of the transcoding file corresponding to the second video and subtracting 1 on the basis of the reference number having a second correspondence with the second storage path, the method further comprises:
and if the adjusted reference number is 0, deleting the transcoding file stored at the storage path corresponding to the reference number based on the second corresponding relation.
5. The method of claim 1, wherein the obtaining the second storage path of the transcoded file corresponding to the second video includes:
searching for a storage path of the transcoding file corresponding to the video storage path of the second video based on the recorded first correspondence, and taking the storage path as the second storage path.
6. The method of any of claims 1-5, wherein the second correspondence is stored in a vector array, each item of data in the vector array is in key-value format, the key is a storage path of a transcoding file, and the value is a reference number of the transcoding file.
7. An electronic device comprising a memory for storing computer program instructions and a processor for executing the program instructions, wherein the computer program instructions, when executed by the processor, trigger the electronic device to perform the steps of any of claims 1-6.
8. A computer readable storage medium, characterized in that the computer readable storage medium comprises a stored program, wherein the program when run controls a device in which the computer readable storage medium is located to perform the method according to any one of claims 1-6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111592731.6A CN115002537B (en) | 2021-12-23 | 2021-12-23 | Video sharing method, electronic device, storage medium and program product |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115002537A CN115002537A (en) | 2022-09-02 |
CN115002537B (en) | 2023-06-16
Family
ID=83018419
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111592731.6A Active CN115002537B (en) | 2021-12-23 | 2021-12-23 | Video sharing method, electronic device, storage medium and program product |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115002537B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101236525A (en) * | 2008-01-24 | 2008-08-06 | 创新科存储技术(深圳)有限公司 | File memory, reading, deleting and copying method and its relevant system |
CN103186535A (en) * | 2011-12-27 | 2013-07-03 | 腾讯科技(深圳)有限公司 | Mobile terminal picture management method and equipment |
CN103430511A (en) * | 2011-03-21 | 2013-12-04 | 汤姆逊许可公司 | Replicating data |
CN104641369A (en) * | 2012-12-17 | 2015-05-20 | 株式会社日立制作所 | File server, information system, and control method thereof |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1999023560A1 (en) * | 1997-11-04 | 1999-05-14 | Collaboration Properties, Inc. | Scalable networked multimedia system and applications |
CN101977218B (en) * | 2010-10-20 | 2013-11-27 | 深圳市融创天下科技股份有限公司 | Internet playing file transcoding method and system |
CN103209355A (en) * | 2012-01-16 | 2013-07-17 | 腾讯科技(深圳)有限公司 | Multimedia transcoding method and system |
US9858288B2 (en) * | 2012-08-03 | 2018-01-02 | Egnyte, Inc. | System and method for event-based synchronization of remote and local file systems |
US9686597B2 (en) * | 2012-10-12 | 2017-06-20 | Theplatform, Llc | Adaptive publishing for content distribution |
US8527645B1 (en) * | 2012-10-15 | 2013-09-03 | Limelight Networks, Inc. | Distributing transcoding tasks across a dynamic set of resources using a queue responsive to restriction-inclusive queries |
CN104885432A (en) * | 2013-08-29 | 2015-09-02 | 宇龙计算机通信科技(深圳)有限公司 | Server and file sharing method |
CN106170105A (en) * | 2016-03-31 | 2016-11-30 | 中山大学 | A kind of Multi net voting marketing video service device and method |
CN106557555A (en) * | 2016-10-31 | 2017-04-05 | 努比亚技术有限公司 | media playing method and device |
CN107766505A (en) * | 2017-10-20 | 2018-03-06 | 维沃移动通信有限公司 | A kind of file management method and terminal |
BR112020026449A2 (en) * | 2018-07-12 | 2021-03-23 | Huawei Technologies Co., Ltd. | DATA RESTORATION METHOD, TERMINAL, COMPUTER READable STORAGE MEDIA AND COMPUTER PROGRAM PRODUCT |
CN110113663A (en) * | 2019-05-10 | 2019-08-09 | 深圳市网心科技有限公司 | A kind of audio-video code-transferring method, system, storage medium and distributed apparatus |
CN112988663A (en) * | 2021-03-11 | 2021-06-18 | 维沃移动通信有限公司 | File storage method and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |