CN113259771B - Video playing method, device, system, electronic equipment and storage medium


Info

Publication number
CN113259771B
CN113259771B (application CN202010089365.1A)
Authority
CN
China
Prior art keywords
video
server
address
transcoded
browser
Prior art date
Legal status
Active
Application number
CN202010089365.1A
Other languages
Chinese (zh)
Other versions
CN113259771A
Inventor
辛柏成
曾凡平
Current Assignee
Beijing Dajia Internet Information Technology Co Ltd
Original Assignee
Beijing Dajia Internet Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Dajia Internet Information Technology Co Ltd filed Critical Beijing Dajia Internet Information Technology Co Ltd
Priority to CN202010089365.1A priority Critical patent/CN113259771B/en
Publication of CN113259771A publication Critical patent/CN113259771A/en
Application granted granted Critical
Publication of CN113259771B publication Critical patent/CN113259771B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H04N 21/4782 — Selective content distribution, e.g. interactive television or video on demand [VOD]; Client devices (H04N 21/40); End-user applications (H04N 21/47); Supplemental services (H04N 21/478); Web browsing, e.g. WebTV
    • H04N 21/234309 — Selective content distribution [VOD]; Servers for the distribution of content (H04N 21/20); Processing of video elementary streams (H04N 21/234); Reformatting for distribution or compliance with end-user device requirements (H04N 21/2343); Transcoding between formats or standards, e.g. from MPEG-2 to MPEG-4 or from Quicktime to Realvideo
    • H04N 21/437 — Selective content distribution [VOD]; Client devices (H04N 21/40); Processing of content or additional data (H04N 21/43); Interfacing the upstream path of the transmission network, e.g. for transmitting client requests to a VOD server

Landscapes

  • Engineering & Computer Science
  • Multimedia
  • Signal Processing
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like

Abstract

The disclosure relates to a video playing method and device, electronic equipment and a storage medium, and belongs to the technical field of computers. The method comprises the following steps: detecting that the browser does not support a first video type of the first video; sending a transcoding instruction to a server; and receiving and playing the transcoded first video sent by the server. With the method and device, when a browser is used to play videos from the network, the electronic equipment can play videos in more formats, system power consumption can be reduced, and the user's video watching experience can be improved.

Description

Video playing method, device, system, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a video playing method, apparatus, system, electronic device, and storage medium.
Background
In the process of using the mobile terminal, a user often plays videos in a network in a browser, wherein the videos can be live videos or recorded videos. For power saving, when the mobile terminal plays video in the browser, the video is generally decoded and decapsulated by the hardware of the mobile terminal.
In the course of implementing the present disclosure, the inventors found that the prior art has at least the following problems:
videos browsed in the browser come in a wide variety of types, and some of these video types may not be supported by the hardware decoding function of the mobile terminal; such video types are therefore not supported by the browser, and the corresponding videos cannot be played normally in the browser.
Disclosure of Invention
The present disclosure provides a video playing method, device, system, electronic device and storage medium, so as to at least solve the problem in the related art that a video cannot be played normally because it cannot be decoded or decapsulated. The technical scheme of the disclosure is as follows:
according to a first aspect of the embodiments of the present disclosure, there is provided a video playing method applied to a terminal, including:
detecting that a first video type of a first video is not supported by a browser, wherein the unsupported condition comprises at least one of inability to decode and inability to decapsulate;
sending a transcoding instruction to a server, wherein the transcoding instruction carries an address of the first video and a video type supported by the browser, so that the server acquires the first video based on the address of the first video, transcodes the first video from the first video type to a second video type, and sends the transcoded first video to the terminal, wherein the video type supported by the browser comprises the second video type;
and receiving and playing the transcoded first video sent by the server.
Optionally, the detecting that the browser does not support the first video type of the first video includes:
acquiring a first video type of a first video;
and determining that the video types supported by the browser do not comprise the first video type according to the video types supported by the browser.
Optionally, the detecting that the browser does not support the first video type of the first video includes:
detecting that the browser fails to play the first video of the first video type.
Optionally, after the transcoding instruction is sent to the server, the method further includes:
receiving a corresponding relation between a playing time point and a data position of a key frame in the first video sent by the server;
after receiving and playing the transcoded first video sent by the server, the method further includes:
when a playing time adjusting instruction is received, acquiring a target playing time point to be adjusted, and determining a target data position corresponding to the target playing time point according to the corresponding relation;
sending the target data position to the server, so that the server acquires a second video consisting of the portion of the first video located after the target data position, transcodes the second video from the first video type to the second video type, and sends the transcoded second video to the terminal;
and receiving and playing the transcoded second video sent by the server.
In a second aspect, a video playing method is provided, which is applied to a server, and includes:
receiving a transcoding instruction sent by a terminal, wherein the transcoding instruction is sent by the terminal when detecting that a browser does not support a first video type of a first video, and the unsupported condition comprises at least one of decoding failure and decapsulation failure, and the transcoding instruction carries an address of the first video and a video type supported by the browser;
acquiring the first video based on the address of the first video;
transcoding the first video from the first video type to a second video type, wherein the video types supported by the browser include the second video type;
and sending the transcoded first video to the terminal.
Optionally, after receiving the transcoding instruction sent by the terminal, the method further includes:
acquiring a corresponding relation between a playing time point and a data position of a key frame in the first video based on the address of the first video, and sending the corresponding relation to the terminal;
when a target data position sent by the terminal is received, acquiring a second video consisting of the portion of the first video located after the target data position;
transcoding the second video from the first video type to the second video type;
and sending the transcoded second video to the terminal.
In a third aspect, a video playing apparatus is provided, including:
a detecting unit, configured to detect that a browser does not support a first video type of a first video, where the unsupported case includes at least one of inability to decode and inability to decapsulate;
a sending unit, configured to send a transcoding instruction to a server, where the transcoding instruction carries an address of the first video and a video type supported by the browser, so that the server obtains the first video based on the address of the first video, transcodes the first video from the first video type to a second video type, and sends the transcoded first video to the terminal, where the video type supported by the browser includes the second video type;
and the receiving unit is used for receiving and playing the transcoded first video sent by the server.
Optionally, the detection unit is configured to:
acquiring a first video type of a first video;
and determining that the video types supported by the browser do not comprise the first video type according to the video types supported by the browser.
Optionally, the detection unit is configured to:
detect that the browser fails to play the first video of the first video type.
Optionally, the sending unit is further configured to:
receiving a corresponding relation between a playing time point and a data position of a key frame in the first video sent by the server;
the receiving unit is further configured to:
when a playing time adjusting instruction is received, acquiring a target playing time point to be adjusted, and determining a target data position corresponding to the target playing time point according to the corresponding relation;
sending the target data position to the server, so that the server acquires a second video consisting of the portion of the first video located after the target data position, transcodes the second video from the first video type to the second video type, and sends the transcoded second video to the terminal;
and receiving and playing the transcoded second video sent by the server.
In a fourth aspect, a video playing apparatus is provided, which includes:
a receiving unit, configured to receive a transcoding instruction sent by a terminal, wherein the transcoding instruction is sent by the terminal when the terminal detects that a browser does not support a first video type of a first video, the unsupported condition comprises at least one of decoding failure and decapsulation failure, and the transcoding instruction carries an address of the first video and the video type supported by the browser;
an acquisition unit configured to acquire the first video based on an address of the first video;
the transcoding unit is used for transcoding the first video from the first video type to a second video type, wherein the video types supported by the browser comprise the second video type;
and the sending unit is used for sending the transcoded first video to the terminal.
Optionally, the receiving unit is further configured to:
acquiring a corresponding relation between a playing time point and a data position of a key frame in the first video based on the address of the first video, and sending the corresponding relation to the terminal;
when a target data position sent by the terminal is received, acquiring a second video consisting of the portion of the first video located after the target data position;
transcoding the second video from the first video type to the second video type;
and sending the transcoded second video to the terminal.
In a fifth aspect, a system for playing a video is provided, where the system includes a terminal and a server, where:
the terminal detects that a browser does not support a first video type of a first video, wherein the unsupported condition comprises at least one of decoding failure and decapsulation failure; sending a transcoding instruction to the server, wherein the transcoding instruction carries the address of the first video and the video type supported by the browser; and receiving and playing the transcoded first video sent by the server.
The server receives a transcoding instruction sent by the terminal; acquiring the first video based on the address of the first video; transcoding the first video from the first video type to a second video type, wherein the video types supported by the browser include the second video type; and sending the transcoded first video to the terminal.
In a sixth aspect, an electronic device is provided, comprising:
the electronic device comprises a processor and a memory, wherein at least one instruction is stored in the memory, and the instruction is loaded and executed by the processor to realize the operation executed by the video playing method.
In a seventh aspect, a storage medium is provided, comprising:
the storage medium has at least one instruction stored therein, and the instruction is loaded and executed by the processor to implement the operations performed by the video playing method as described above.
In an eighth aspect, there is provided a computer program product comprising:
the computer program product stores at least one instruction that is loaded and executed by a processor to implement the operations performed by the video playback method as described above.
The technical scheme provided by the embodiment of the disclosure at least brings the following beneficial effects:
the video type which is not supported by the browser is transcoded into the video type which is supported by the browser, so that when the browser is used for playing the video in a network, more types of videos can be played on the electronic equipment on the basis of ensuring power saving.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure and are not to be construed as limiting the disclosure.
FIG. 1 is a schematic diagram of an implementation environment of a video playback method according to an exemplary embodiment;
FIG. 2 is a flow diagram illustrating a method of video playback in accordance with an exemplary embodiment;
FIG. 3 is a flow diagram illustrating a method of video playback in accordance with an exemplary embodiment;
FIG. 4 is an interaction flow diagram illustrating a method of video playback in accordance with an exemplary embodiment;
FIG. 5 is an interaction flow diagram illustrating a method of video playback in accordance with an exemplary embodiment;
FIG. 6 is an interaction flow diagram illustrating a method of video playback in accordance with an exemplary embodiment;
FIG. 7 is an interaction flow diagram illustrating a method of video playback in accordance with an exemplary embodiment;
FIG. 8 is a schematic diagram illustrating a structure of a video playback device in accordance with an exemplary embodiment;
FIG. 9 is a schematic diagram of a video playback device according to an exemplary embodiment;
FIG. 10 is a block diagram illustrating a terminal in accordance with an exemplary embodiment;
fig. 11 is a schematic diagram illustrating a configuration of a server according to an example embodiment.
Detailed Description
To make the objects, technical solutions and advantages of the present disclosure more apparent, embodiments of the present disclosure will be described in detail with reference to the accompanying drawings.
FIG. 1 is a schematic diagram of an implementation environment provided by embodiments of the present disclosure. Referring to fig. 1, the video playing method provided by the present disclosure may be implemented jointly by a terminal and a server. The terminal can run an application program with a video playing function, such as a browser or a video playing application, can be provided with components such as a screen and a loudspeaker, has a communication function and can access the Internet, and may be a mobile phone, a tablet computer, an intelligent wearable device, and the like. The server may be a transit server, which may be a backend server of the application program, and the transit server may establish communication with the terminal. The transit server may be a single server or a server group. If it is a single server, that server may be responsible for all processing required of the server in the following scheme; if it is a server group, different servers in the group may be respectively responsible for different processing in the following scheme, and the specific allocation of processing may be set by technical personnel according to actual requirements, which is not described herein again.
The video playing method provided by the embodiments of the present disclosure can transcode a video type that originally could not be played into a video type that can be played. In the embodiments of the present disclosure, the scheme is described in detail by taking video playing in a browser as an example, so the corresponding application program is the browser; other situations are similar and are not described again. The browser application program has a video playing function. The browser application program is provided with an interface for hardware decoding and decapsulation of video data; the interface acquires the transcoded video data by accessing the corresponding address, and decodes and decapsulates the transcoded video data to obtain video data that the browser can play.
When using the browser application program, a user first enters a website for watching videos, clicks a link to a certain video, and enters the playing interface of the video. The browser accesses the address where the video is stored, loads the video and plays it on the playing interface. Controls such as pause, play, fast forward, fast backward, volume and full screen are displayed on the playing interface, and the user can control the playing of the video by clicking these controls. When the pause control is triggered, the browser stops playing the video; when the play control is triggered, the browser starts playing the video; when the fast forward or fast backward control is triggered, the browser fast-forwards or rewinds the video accordingly; when the volume control is triggered, the browser adjusts the volume according to the user's operation; and when the full-screen control is triggered, the browser plays the video in full screen.
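For illustration only, a minimal browser-side sketch of wiring such controls to a standard HTML5 video element is given below; the control element names and the 10-second seek step are assumptions, not values taken from the embodiments.

```typescript
// Minimal sketch: binding the playing-interface controls to an HTMLVideoElement.
// Control names and the 10-second step are illustrative assumptions.
function bindPlaybackControls(video: HTMLVideoElement, ui: {
  pauseBtn: HTMLElement; playBtn: HTMLElement; forwardBtn: HTMLElement;
  backwardBtn: HTMLElement; volumeSlider: HTMLInputElement; fullscreenBtn: HTMLElement;
}): void {
  ui.pauseBtn.addEventListener("click", () => video.pause());                          // pause control
  ui.playBtn.addEventListener("click", () => { void video.play(); });                  // play control
  ui.forwardBtn.addEventListener("click", () => { video.currentTime += 10; });         // fast forward
  ui.backwardBtn.addEventListener("click", () => { video.currentTime -= 10; });        // fast backward
  ui.volumeSlider.addEventListener("input", () => {
    video.volume = Number(ui.volumeSlider.value);                                      // volume control
  });
  ui.fullscreenBtn.addEventListener("click", () => { void video.requestFullscreen(); }); // full screen
}
```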
Fig. 2 is a flowchart of a terminal side in a method for playing a video according to an embodiment of the present disclosure. Referring to fig. 2, the process includes:
step 201, it is detected that the browser does not support a first video type of the first video, wherein the unsupported condition includes at least one of inability to decode and inability to decapsulate.
Step 202, a transcoding instruction is sent to a server, wherein the transcoding instruction carries an address of a first video and a video type supported by a browser, so that the server obtains the first video based on the address of the first video, transcodes the first video from the first video type to a second video type, and sends the transcoded first video to a terminal, and the video type supported by the browser comprises the second video type.
And step 203, receiving and playing the transcoded first video sent by the server.
Fig. 3 is a flowchart of a server side in a method for playing a video according to an embodiment of the present disclosure. Referring to fig. 3, the process includes:
step 301, receiving a transcoding instruction sent by a terminal, where the transcoding instruction is sent by the terminal when detecting that the browser does not support the first video type of the first video, the unsupported case includes at least one of inability to decode and inability to decapsulate, and the transcoding instruction carries an address of the first video and a video type supported by the browser.
Step 302, acquiring a first video based on the address of the first video.
Step 303, transcoding the first video from the first video type to a second video type, wherein the video types supported by the browser include the second video type.
And step 304, sending the transcoded first video to the terminal.
Fig. 4 is a flowchart illustrating interaction between a server and a terminal in a video playing method according to an embodiment of the present disclosure.
Referring to fig. 4, the process includes:
step 401, the terminal detects that the browser does not support the first video type of the first video. Two possible solutions are given below:
the method comprises the steps of obtaining a first video type of a first video, and determining that the video type supported by the browser does not comprise the first video type according to the video type supported by the browser.
In implementation, the user clicks a jump link of the first video to enter the video playing interface. The terminal loads the first video and obtains its metadata, and the metadata is parsed to obtain the video format of the first video; the video format of the first video can also be obtained by accessing the resource server where the first video is stored. The video types that the hardware decoding function can decode are stored in the terminal and can be looked up there. After the terminal obtains the first video type, it compares the first video type with the video types that the hardware decoding function can decode, and the comparison shows that the video types the hardware decoding function can decode do not include the first video type.
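As an illustrative sketch of the first mode, assuming the first video's container and codec string (for example an HEVC-in-MP4 stream) have already been read from the metadata or from the resource server, the browser-side check could look as follows; the function name and the example codec strings are assumptions:

```typescript
// Sketch of detection mode one: compare the first video type against what the
// browser (and its hardware decoder) reports as playable. Names are illustrative.
function browserSupportsType(mimeWithCodecs: string): boolean {
  // Media Source Extensions path, used when the stream is fed through a SourceBuffer.
  if (typeof MediaSource !== "undefined" && MediaSource.isTypeSupported(mimeWithCodecs)) {
    return true;
  }
  // Fallback: the <video> element's own capability probe ("", "maybe" or "probably").
  return document.createElement("video").canPlayType(mimeWithCodecs) !== "";
}

// Example: HEVC in MP4 is often unsupported, in which case transcoding is requested.
const firstVideoType = 'video/mp4; codecs="hvc1.1.6.L93.B0"';
const browserDoesNotSupportFirstVideo = !browserSupportsType(firstVideoType);
```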
In the second mode, a failure of the browser to play the first video of the first video type is detected.
In implementation, the user clicks a jump link of the first video to enter the video playing interface. The terminal accesses the resource server where the first video is stored and loads the first video. After the first video is loaded, it is decapsulated by the hardware decapsulation function and decoded by the hardware decoding function. While decapsulating or decoding the first video, the processing unit of the hardware decoding and decapsulation functions detects that the video types it can decode and decapsulate do not include the first video type, and the hardware decoding function therefore fails to decode the first video.
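As an illustrative sketch of the second mode, assuming the first video has already been attached to a video element, the playback failure could be observed through the element's error event; the handler name is an assumption:

```typescript
// Sketch of detection mode two: treat a decode/format error raised by the <video>
// element as "the browser does not support the first video type".
function watchForUnsupportedSource(video: HTMLVideoElement, onUnsupported: () => void): void {
  video.addEventListener("error", () => {
    const err = video.error;
    // MEDIA_ERR_SRC_NOT_SUPPORTED roughly maps to the "cannot decapsulate" case,
    // MEDIA_ERR_DECODE to the "cannot decode" case described above.
    if (err && (err.code === MediaError.MEDIA_ERR_SRC_NOT_SUPPORTED ||
                err.code === MediaError.MEDIA_ERR_DECODE)) {
      onUnsupported();
    }
  });
}
```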
Step 402, the terminal sends a transcoding instruction to the server.
In order to reduce the load on the server, the transit server may be a server group consisting of a first server and a second server, where the first server handles interaction with other devices and some basic processing, the second server performs the transcoding of the video, and a data connection is established between the two servers.
In implementation, when detecting that the hardware decoding function and/or the decapsulation function of the terminal cannot decode and/or decapsulate the first video of the first video type, the terminal adds the address of the first video (the address of the first video may be a URL) and the video type supported by the local hardware decoding function and/or the decapsulation function to the transcoding instruction, and sends the transcoding instruction to the transit server.
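For illustration, a terminal-side sketch of such a transcoding instruction is shown below, assuming the transit server exposes an HTTP endpoint; the endpoint path, the field names and the returned playback address are assumptions, not details taken from the embodiments:

```typescript
// Sketch of step 402: send the transcoding instruction carrying the first video's
// address (a URL) and the video types the local hardware decoder supports.
interface TranscodeInstruction {
  videoUrl: string;         // address of the first video
  supportedTypes: string[]; // video types supported by the browser / hardware decoder
}

async function sendTranscodeInstruction(instruction: TranscodeInstruction): Promise<string> {
  const resp = await fetch("https://transit-server.example.com/transcode", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(instruction),
  });
  // The transit server is expected to answer with the resource-server address
  // (carrying the identifier) from which the transcoded first video can be loaded.
  const { playbackUrl } = (await resp.json()) as { playbackUrl: string };
  return playbackUrl;
}
```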
Step 403, after receiving the transcoding instruction sent by the terminal, the server acquires the first video based on the address of the first video.
Wherein the first video is stored in the resource server and the address of the first video is the address in the resource server.
In implementation, the transit server receives the transcoding instruction sent by the terminal: the first server receives the transcoding instruction and sends the address of the first video to the second server, and the second server accesses the address of the first video and loads the first video of the first video type.
In step 404, the server transcodes the first video from the first video type to the second video type and sends the transcoded first video to the terminal.
The video types supported by the browser comprise the second video type, the second video type is a video type supported by a hardware decoding function and/or a decapsulation function of the terminal, and the loading of the first video and the transcoding of the first video can be performed simultaneously.
In implementation, the first server performs transcoding configuration for the first video to be downloaded, that is, the transit server sorts the video types supported by the hardware decoding function and/or decapsulation function of the terminal according to stored video type priorities and selects the video type with the highest priority as the second video type. The first server sends the address and the transcoding configuration to the second server. The second server obtains the video stream of the first video based on the address and, while receiving the first video, transcodes it from the first video type to the second video type. The second server allocates an address on a resource server for the transcoded first video, adds an identifier of the transcoded first video to the address, and sends the resource-server address carrying the identifier to the terminal. The address may be a URL (Uniform Resource Locator).
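A server-side sketch (TypeScript on Node.js) of this step is shown below, assuming ffmpeg is available on the second server, that the priority list is a simple ordered array, and that the identifier is a generated UUID; all of these are assumptions for illustration:

```typescript
// Sketch of steps 403-404: pick the highest-priority supported type as the second
// video type, then transcode the first video into a streamable format.
import { spawn } from "node:child_process";
import { randomUUID } from "node:crypto";

// Stored video type priorities on the transit server (illustrative values).
const TYPE_PRIORITY = [
  'video/mp4; codecs="avc1.64001f,mp4a.40.2"',
  'video/webm; codecs="vp8,vorbis"',
];

function pickSecondVideoType(browserSupportedTypes: string[]): string | undefined {
  return TYPE_PRIORITY.find((t) => browserSupportedTypes.includes(t));
}

// Transcode while the source is still being received: ffmpeg reads the first video's
// address directly and emits fragmented MP4 on stdout.
function startTranscode(firstVideoUrl: string) {
  const identifier = randomUUID(); // later embedded in the resource-server address
  const ffmpeg = spawn("ffmpeg", [
    "-i", firstVideoUrl,
    "-c:v", "libx264",
    "-c:a", "aac",
    "-movflags", "frag_keyframe+empty_moov", // streamable fragmented MP4
    "-f", "mp4",
    "pipe:1",
  ]);
  return { identifier, output: ffmpeg.stdout };
}
```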
And 405, the terminal receives and plays the transcoded first video sent by the server.
The type of the transcoded first video is a second video type, the second video type is a video type supported by a local hardware decoding function and/or a decapsulation function, and the resource server may be a content distribution network.
In implementation, after receiving the address sent by the first server, as shown in fig. 5, the terminal sends an HTTP (HyperText Transfer Protocol) request to the content distribution network based on the address, that is, the terminal accesses the address. After receiving the HTTP request, the content distribution network extracts the identifier from the address, where the identifier corresponds to the output address of the transcoded first video in the second server, and sends a data loading request carrying the identifier to the first server. The first server forwards the data loading request carrying the identifier to the second server through its data connection with the second server, and the second server accesses the output address corresponding to the identifier to obtain the transcoded first video. After the acquisition is completed, the second server sends the transcoded first video to the address on the content distribution network. The terminal remains in the state of accessing the address, so when the address receives the transcoded first video, the terminal also loads it, then decodes and plays it.
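A terminal-side sketch of this loading-and-playing step is given below, assuming the transcoded first video arrives as a fragmented MP4 stream over a long-lived HTTP response; the codec string is an assumption matching the transcoding sketch above:

```typescript
// Sketch of step 405: keep accessing the address and feed the arriving transcoded
// data into the browser's decoder through Media Source Extensions.
async function playTranscodedStream(video: HTMLVideoElement, playbackUrl: string): Promise<void> {
  const mediaSource = new MediaSource();
  video.src = URL.createObjectURL(mediaSource);
  await new Promise((resolve) => mediaSource.addEventListener("sourceopen", resolve, { once: true }));

  const sourceBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.64001f,mp4a.40.2"');
  const resp = await fetch(playbackUrl);      // the terminal stays in the state of accessing the address
  const reader = resp.body!.getReader();

  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    sourceBuffer.appendBuffer(value);         // hand each received chunk to the decoder
    await new Promise((resolve) => sourceBuffer.addEventListener("updateend", resolve, { once: true }));
  }
}
```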
After receiving the fast forward and fast backward commands, an interaction flow between the transit server and the terminal may be as shown in fig. 6, where the flow includes:
step 601, the server obtains the corresponding relation between the playing time point and the data position of the key frame in the first video based on the address of the first video, and sends the corresponding relation to the terminal.
The playing time point in the first video is the playing time point of each key frame in the first video, the data position in the first video is the position of each key frame in the first video in the video data, and the corresponding relation is the corresponding relation between the playing time point of each key frame and the position of each key frame in the video data.
In implementation, the first server accesses the address of the first video and may obtain the metadata of the first video and parse it to obtain the corresponding relation between the playing time points and the data positions of the key frames in the first video, or may directly obtain this corresponding relation from the address. After the corresponding relation is obtained, it is sent to the terminal.
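One way the first server could build this corresponding relation is sketched below, assuming ffprobe is used to list the key frames of the first video; the use of ffprobe and the data shape are assumptions for illustration:

```typescript
// Sketch of step 601: index the key frames' playing time points and data positions.
import { execFile } from "node:child_process";
import { promisify } from "node:util";

const execFileAsync = promisify(execFile);

interface KeyFrameEntry {
  playTime: number;     // playing time point of the key frame, in seconds
  dataPosition: number; // byte position of the key frame in the video data
}

async function buildKeyFrameIndex(firstVideoUrl: string): Promise<KeyFrameEntry[]> {
  const { stdout } = await execFileAsync("ffprobe", [
    "-v", "quiet",
    "-select_streams", "v:0",
    "-skip_frame", "nokey",                     // only report key frames
    "-show_entries", "frame=pts_time,pkt_pos",  // play time + byte offset
    "-of", "json",
    firstVideoUrl,
  ]);
  const frames = JSON.parse(stdout).frames as Array<{ pts_time: string; pkt_pos: string }>;
  return frames.map((f) => ({ playTime: Number(f.pts_time), dataPosition: Number(f.pkt_pos) }));
}
```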
Step 602, the terminal receives the corresponding relation between the playing time points and the data positions of the key frames in the first video sent by the server; when the terminal receives a playing time adjustment instruction, it acquires the target playing time point to be adjusted, determines the target data position corresponding to the target playing time point according to the corresponding relation, and sends the target data position to the transit server.
The playing time adjusting instruction can be a fast forward instruction and a fast backward instruction.
In implementation, the terminal receives and stores the corresponding relation sent by the first server. When the user triggers the fast forward or fast backward mechanism, the terminal receives the corresponding fast forward or fast backward instruction and acquires the playing time point indicated by the instruction, finds the playing time point of the key frame closest to that playing time point, takes the playing time point of that closest key frame as the playing time point of the instruction, determines the data position corresponding to the playing time point of that key frame according to the corresponding relation, namely the target data position, and sends the obtained target data position to the first server.
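A terminal-side sketch of this mapping is shown below, assuming the corresponding relation has been stored as a list of key-frame entries like the one in the previous sketch; the structure and names are assumptions:

```typescript
// Sketch of step 602: map the requested playing time point to the target data position
// of the nearest key frame.
interface KeyFrameEntry {
  playTime: number;
  dataPosition: number;
}

function findTargetDataPosition(index: KeyFrameEntry[], targetPlayTime: number): number {
  let nearest = index[0];
  for (const entry of index) {
    if (Math.abs(entry.playTime - targetPlayTime) < Math.abs(nearest.playTime - targetPlayTime)) {
      nearest = entry; // key frame whose playing time point is closest to the seek target
    }
  }
  return nearest.dataPosition;
}
```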
Step 603, when the server receives the target data position sent by the terminal, it acquires a second video consisting of the portion of the first video located after the target data position, transcodes the second video from the first video type to the second video type, and sends the transcoded second video to the terminal.
In implementation, as shown in fig. 7, the terminal remains in the state of accessing the address on the content distribution network. After the first server receives the target data position sent by the terminal, it sends the target data position to the second server over the data connection between the two servers. The second server acquires, according to the target data position, the second video located after that position and transcodes it, obtaining the transcoded video data in the form of a live stream. The second server then sends the second video to the content distribution network address it has stored. Once the address receives the second video, because the terminal is still accessing the address, the transcoded second video is also transmitted to the terminal; after receiving it, the terminal decapsulates, decodes and plays the second video.
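A second-server sketch of this step is given below, assuming the resource server honours HTTP Range requests so that loading can start at the target data position, and that the first video's container can be demuxed from a key-frame boundary; these assumptions, the helper name and the ffmpeg options are for illustration only:

```typescript
// Sketch of step 603: fetch the second video (the portion after the target data
// position) and transcode it into a live-stream-style fragmented MP4.
import { spawn } from "node:child_process";

async function transcodeFromPosition(firstVideoUrl: string, targetDataPosition: number) {
  // Request only the bytes from the target data position onward.
  const resp = await fetch(firstVideoUrl, {
    headers: { Range: `bytes=${targetDataPosition}-` },
  });

  const ffmpeg = spawn("ffmpeg", [
    "-i", "pipe:0",                           // read the ranged stream from stdin
    "-c:v", "libx264",
    "-c:a", "aac",
    "-movflags", "frag_keyframe+empty_moov",
    "-f", "mp4",
    "pipe:1",                                 // transcoded second video on stdout
  ]);

  // Pump the ranged HTTP body into ffmpeg in the background
  // (backpressure handling omitted for brevity).
  void (async () => {
    const reader = resp.body!.getReader();
    for (;;) {
      const { done, value } = await reader.read();
      if (done) { ffmpeg.stdin.end(); break; }
      ffmpeg.stdin.write(value);
    }
  })();

  return ffmpeg.stdout; // to be pushed on to the content distribution network address
}
```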
The video type which is not supported by the browser is transcoded into the video type which is supported by the browser, so that when the browser is used for playing the video in a network, more types of videos can be played on the electronic equipment on the basis of ensuring power saving.
All the above optional technical solutions may be combined arbitrarily to form the optional embodiments of the present disclosure, and are not described herein again.
The disclosed embodiment provides a video playing device, which may be a terminal, as shown in fig. 8, and includes:
a detecting unit 810, configured to detect that a browser does not support a first video type of a first video, where the unsupported case includes at least one of inability to decode and inability to decapsulate;
a sending unit 820, configured to send a transcoding instruction to a server, where the transcoding instruction carries an address of the first video and a video type supported by the browser, so that the server obtains the first video based on the address of the first video, transcodes the first video from the first video type to a second video type, and sends the transcoded first video to the terminal, where the video type supported by the browser includes the second video type;
a receiving unit 830, configured to receive and play the transcoded first video sent by the server.
Optionally, the detecting unit 810 is configured to:
acquiring a first video type of a first video;
and determining that the video types supported by the browser do not comprise the first video type according to the video types supported by the browser.
Optionally, the detecting unit 810 is configured to:
detect that the browser fails to play the first video of the first video type.
Optionally, the sending unit 820 is further configured to:
receiving a corresponding relation between a playing time point and a data position of a key frame in the first video sent by the server;
the receiving unit 830 is further configured to:
when a playing time adjusting instruction is received, acquiring a target playing time point to be adjusted, and determining a target data position corresponding to the target playing time point according to the corresponding relation;
sending the target data position to the server, so that the server acquires a second video consisting of the portion of the first video located after the target data position, transcodes the second video from the first video type to the second video type, and sends the transcoded second video to the terminal;
and receiving and playing the transcoded second video sent by the server.
An embodiment of the present disclosure provides a video playing apparatus, which may be the above server, as shown in fig. 9, including:
a receiving unit 910, configured to receive a transcoding instruction sent by a terminal, where the transcoding instruction is sent by the terminal when detecting that a browser does not support a first video type of a first video, where an unsupported condition includes at least one of inability to decode and inability to decapsulate, and the transcoding instruction carries an address of the first video and a video type supported by the browser;
an obtaining unit 920, configured to obtain the first video based on an address of the first video;
a transcoding unit 930, configured to transcode the first video from the first video type to a second video type, where the video types supported by the browser include the second video type;
a sending unit 940, configured to send the transcoded first video to the terminal.
Optionally, the receiving unit 910 is further configured to:
acquiring a corresponding relation between a playing time point and a data position of a key frame in the first video based on the address of the first video, and sending the corresponding relation to the terminal;
when a target data position sent by the terminal is received, acquiring a second video consisting of the portion of the first video located after the target data position;
transcoding the second video from the first video type to the second video type;
and sending the transcoded second video to the terminal.
The embodiment of the present disclosure provides a video playing system, which includes a terminal and a server, wherein:
the terminal detects that the browser does not support a first video type of a first video, wherein the unsupported condition comprises at least one of decoding failure and decapsulation failure; sending a transcoding instruction to the server, wherein the transcoding instruction carries the address of the first video and the video type supported by the browser; and receiving and playing the transcoded first video sent by the server.
The server receives a transcoding instruction sent by the terminal; acquiring the first video based on the address of the first video; transcoding the first video from the first video type to a second video type, wherein the video types supported by the browser include the second video type; and sending the transcoded first video to the terminal.
The video type which is not supported by the browser is transcoded into the video type which is supported by the browser, so that when the browser is used for playing the video in a network, more types of videos can be played on the electronic equipment on the basis of ensuring power saving.
Fig. 10 shows a block diagram of a terminal 1000 according to an exemplary embodiment of the disclosure. The terminal 1000 can be: a smart phone, a tablet computer, an MP3 player (Moving Picture Experts Group Audio Layer III), an MP4 player (Moving Picture Experts Group Audio Layer IV), a notebook computer or a desktop computer. Terminal 1000 can also be referred to as user equipment, portable terminal, laptop terminal, desktop terminal, or the like.
In general, terminal 1000 can include: a processor 1001 and a memory 1002.
Processor 1001 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so forth. The processor 1001 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1001 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also referred to as a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1001 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content required to be displayed on the display screen. In some embodiments, the processor 1001 may further include an AI (Artificial Intelligence) processor for processing a computing operation related to machine learning.
Memory 1002 may include one or more computer-readable storage media, which may be non-transitory. The memory 1002 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in the memory 1002 is used to store at least one instruction for execution by the processor 1001 to implement the video playback method provided by the method embodiments of the present disclosure.
In some embodiments, terminal 1000 can also optionally include: a peripheral interface 1003 and at least one peripheral. The processor 1001, memory 1002 and peripheral interface 1003 may be connected by a bus or signal line. Various peripheral devices may be connected to peripheral interface 1003 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1004, touch screen display 1005, camera 1006, audio circuitry 1007, positioning components 1008, and power supply 1009.
The peripheral interface 1003 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 1001 and the memory 1002. In some embodiments, processor 1001, memory 1002, and peripheral interface 1003 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1001, the memory 1002, and the peripheral interface 1003 may be implemented on separate chips or circuit boards, which are not limited by this embodiment.
The Radio Frequency circuit 1004 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 1004 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1004 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1004 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1004 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: metropolitan area networks, various generation mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1004 may also include NFC (Near Field Communication) related circuits, which are not limited by this disclosure.
The display screen 1005 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1005 is a touch display screen, the display screen 1005 also has the ability to capture touch signals on or over the surface of the display screen 1005. The touch signal may be input to the processor 1001 as a control signal for processing. At this point, the display screen 1005 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, display screen 1005 can be one, providing a front panel of terminal 1000; in other embodiments, display 1005 can be at least two, respectively disposed on different surfaces of terminal 1000 or in a folded design; in still other embodiments, display 1005 can be a flexible display disposed on a curved surface or on a folded surface of terminal 1000. Even more, the display screen 1005 may be arranged in a non-rectangular irregular figure, i.e., a shaped screen. The Display screen 1005 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or other materials.
The camera assembly 1006 is used to capture images or video. Optionally, the camera assembly 1006 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, the main camera and the wide-angle camera are fused to realize panoramic shooting and a VR (Virtual Reality) shooting function or other fusion shooting functions. In some embodiments, camera assembly 1006 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuit 1007 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals into the processor 1001 for processing or inputting the electric signals into the radio frequency circuit 1004 for realizing voice communication. For stereo sound collection or noise reduction purposes, multiple microphones can be provided, each at a different location of terminal 1000. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 1001 or the radio frequency circuit 1004 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuit 1007 may also include a headphone jack.
A positioning component 1008 is employed to locate the current geographic location of terminal 1000 for navigation or LBS (Location Based Service). The positioning component 1008 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
Power supply 1009 is used to supply power to various components in terminal 1000. The power source 1009 may be alternating current, direct current, disposable batteries, or rechargeable batteries. When the power source 1009 includes a rechargeable battery, the rechargeable battery may support wired charging or wireless charging. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, terminal 1000 can also include one or more sensors 1010. The one or more sensors 1010 include, but are not limited to: acceleration sensor 1011, gyro sensor 1012, pressure sensor 1013, fingerprint sensor 1014, optical sensor 1015, and proximity sensor 1016.
Acceleration sensor 1011 can detect acceleration magnitudes on three coordinate axes of a coordinate system established with terminal 1000. For example, the acceleration sensor 1011 can be used to detect the components of the gravitational acceleration on three coordinate axes. The processor 1001 may control the touch display screen 1005 to display a user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1011. The acceleration sensor 1011 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 1012 may detect a body direction and a rotation angle of the terminal 1000, and the gyro sensor 1012 and the acceleration sensor 1011 may cooperate to acquire a 3D motion of the user on the terminal 1000. From the data collected by the gyro sensor 1012, the processor 1001 may implement the following functions: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensor 1013 may be disposed on a side frame of terminal 1000 and/or on a lower layer of touch display 1005. When the pressure sensor 1013 is disposed on a side frame of the terminal 1000, a user's grip signal of the terminal 1000 can be detected, and left-right hand recognition or shortcut operation can be performed by the processor 1001 according to the grip signal collected by the pressure sensor 1013. When the pressure sensor 1013 is disposed at a lower layer of the touch display screen 1005, the processor 1001 controls the operability control on the UI interface according to the pressure operation of the user on the touch display screen 1005. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 1014 is used to collect a fingerprint of the user, and the processor 1001 identifies the user according to the fingerprint collected by the fingerprint sensor 1014, or the fingerprint sensor 1014 identifies the user according to the collected fingerprint. Upon identifying that the user's identity is a trusted identity, the processor 1001 authorizes the user to perform relevant sensitive operations including unlocking a screen, viewing encrypted information, downloading software, paying, and changing settings, etc. Fingerprint sensor 1014 can be disposed on the front, back, or side of terminal 1000. When a physical key or vendor Logo is provided on terminal 1000, fingerprint sensor 1014 can be integrated with the physical key or vendor Logo.
The optical sensor 1015 is used to collect the ambient light intensity. In one embodiment, the processor 1001 may control the display brightness of the touch display screen 1005 according to the intensity of the ambient light collected by the optical sensor 1015. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 1005 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 1005 is turned down. In another embodiment, the processor 1001 may also dynamically adjust the shooting parameters of the camera assembly 1006 according to the intensity of the ambient light collected by the optical sensor 1015.
Proximity sensor 1016, also known as a distance sensor, is typically disposed on a front panel of terminal 1000. Proximity sensor 1016 is used to gather the distance between the user and the front face of terminal 1000. In one embodiment, when proximity sensor 1016 detects that the distance between the user and the front surface of terminal 1000 gradually decreases, processor 1001 controls touch display 1005 to switch from a bright screen state to a dark screen state; when proximity sensor 1016 detects that the distance between the user and the front surface of terminal 1000 is gradually increased, touch display screen 1005 is controlled by processor 1001 to switch from a breath screen state to a bright screen state.
Those skilled in the art will appreciate that the configuration shown in fig. 10 is not limiting of terminal 1000 and may include more or fewer components than shown, or some components may be combined, or a different arrangement of components may be employed.
Fig. 11 is a schematic structural diagram of a server according to an embodiment of the present disclosure. The server may be the server in the foregoing embodiments, i.e. the transit server, and the transit server may be a server group including a first server and a second server. The server may differ considerably depending on its configuration or performance, and may include one or more processors (CPUs) 1101 and one or more memories 1102, where the memory 1102 stores at least one instruction that is loaded and executed by the processors 1101 to implement the methods provided by the foregoing method embodiments. Of course, the server may also have components such as a wired or wireless network interface, a keyboard, and an input/output interface for input and output, and may also include other components for implementing the functions of the device, which are not described herein again.
In an exemplary embodiment, a computer-readable storage medium, such as a memory, including instructions executable by a processor in a terminal to perform the video playing method in the above embodiments is also provided. For example, the computer-readable storage medium may be a Read-only Memory (ROM), a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is meant to be illustrative of the preferred embodiments of the present disclosure and not to be taken as limiting the disclosure, and any modifications, equivalents, improvements and the like that are within the spirit and scope of the present disclosure are intended to be included therein.

Claims (12)

1. A video playing method is applied to a terminal, and the method comprises the following steps:
detecting that a first video type of a first video is not supported by a browser, wherein the unsupported condition comprises at least one of inability to decode and inability to decapsulate;
sending a transcoding instruction to a server, wherein the transcoding instruction carries the address of the first video and the video type supported by the browser, the server comprises a first server, a second server and a resource server, so that the first server determines a second video type with the highest priority among the video types supported by the browser and transmits the address and the second video type to the second server, so that the second server acquires the first video based on the address of the first video, transcodes the first video from the first video type to the second video type, allocates an address containing an identifier to the transcoded first video, and sends the address containing the identifier to the terminal, wherein the address is the address of the resource server, and the identifier corresponds to an output address of the transcoded first video in the second server;
receiving the address containing the identifier, sending a request to the address containing the identifier so that the resource server receives the request, resolving the identifier from the address containing the identifier, sending a loading request carrying the identifier to the second server through the first server so that the second server receives the loading request, accessing an output address corresponding to the identifier, acquiring the transcoded first video, sending the transcoded first video to the resource server, and sending the transcoded first video to the terminal by the resource server;
and receiving and playing the transcoded first video sent by the resource server.
2. The method of claim 1, wherein detecting that the browser does not support the first video type of the first video comprises:
acquiring a first video type of a first video;
and determining that the video types supported by the browser do not comprise the first video type according to the video types supported by the browser.
3. The method of claim 1, wherein detecting that the browser does not support the first video type of the first video comprises:
detecting that the browser fails to play the first video of the first video type.
4. The method of claim 1, wherein after sending the transcoding instruction to the server, the method further comprises:
receiving a corresponding relation between a playing time point and a data position of a key frame in the first video sent by the server;
after receiving and playing the transcoded first video sent by the resource server, the method further includes:
when a playing time adjusting instruction is received, acquiring a target playing time point to be adjusted, and determining a target data position corresponding to the target playing time point according to the correspondence;
sending the target data position to the server, so that the server acquires a second video, the second video being a portion of the first video located after the target data position, transcodes the second video from the first video type into the second video type, and sends the transcoded second video to the terminal;
and receiving and playing the transcoded second video sent by the server.
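The seek path in claim 4 maps a target playing time to the data position of a preceding key frame and asks the server to transcode from there. A terminal-side sketch follows; the entry format and the `/api/seek` endpoint are assumptions introduced for illustration.

```typescript
// One entry of the correspondence sent by the server: a key frame's playing
// time (seconds) and its data position (byte offset) in the first video.
interface KeyFrameEntry {
  timeSec: number;
  bytePosition: number;
}

// Determine the data position of the last key frame at or before the target
// playing time point; entries are assumed to be sorted by time.
function targetDataPosition(correspondence: KeyFrameEntry[], targetTimeSec: number): number {
  let position = correspondence.length > 0 ? correspondence[0].bytePosition : 0;
  for (const entry of correspondence) {
    if (entry.timeSec > targetTimeSec) break;
    position = entry.bytePosition;
  }
  return position;
}

// Send the target data position so the server can transcode the remaining part
// of the first video (the "second video") and return a playable address.
async function adjustPlayingTime(
  video: HTMLVideoElement,
  correspondence: KeyFrameEntry[],
  targetTimeSec: number,
): Promise<void> {
  const position = targetDataPosition(correspondence, targetTimeSec);
  const response = await fetch('/api/seek', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ targetDataPosition: position }),
  });
  const { playbackUrl } = await response.json();
  video.src = playbackUrl;
  await video.play();
}
```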
5. A video playing method, the method comprising:
the method comprises the steps that a first server receives a transcoding instruction sent by a terminal, wherein the transcoding instruction is sent by the terminal when the terminal detects that a browser does not support a first video type of a first video, the unsupported condition comprises at least one of decoding failure and decapsulation failure, and the transcoding instruction carries an address of the first video and the video type supported by the browser;
the first server determines a second video type with the highest priority in the video types supported by the browser, and sends the address and the second video type to a second server;
the second server acquires the first video based on the address of the first video;
the second server transcodes the first video from the first video type into the second video type, allocates an address containing an identifier to the transcoded first video, and sends the address containing the identifier to the terminal, wherein the address is an address of a resource server, and the identifier corresponds to an output address of the transcoded first video in the second server, so that the terminal sends a request to the address containing the identifier;
the resource server receives the request, resolves the identifier from the address containing the identifier, and sends a loading request carrying the identifier to the second server through the first server;
the second server receives the loading request, accesses the output address corresponding to the identifier, acquires the transcoded first video, and sends the transcoded first video to the resource server;
and the resource server receives the transcoded first video and sends the transcoded first video to the terminal.
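To make the three-server choreography of claim 5 concrete, the sketch below models the first server's type selection and the second server's identifier bookkeeping in plain TypeScript; the host name, the priority list, the helper names and the in-memory map are invented for this example and are not drawn from the patent.

```typescript
import { randomUUID } from 'node:crypto';

// Maps each issued identifier to the transcoded video's output address on the
// second (transcoding) server, so later loading requests can be resolved.
const outputAddresses = new Map<string, string>();

const RESOURCE_SERVER = 'https://resource.example.com'; // hypothetical host

// Priority order maintained by the first server, highest priority first.
const TYPE_PRIORITY = ['video/mp4', 'video/webm', 'video/ogg'];

// First server: pick the highest-priority type among those the browser supports.
function pickSecondVideoType(browserTypes: string[]): string {
  const chosen = TYPE_PRIORITY.find((type) => browserTypes.includes(type));
  if (!chosen) throw new Error('no common video type between server and browser');
  return chosen;
}

// Second server: transcode the first video and return an address of the
// resource server that contains a freshly allocated identifier.
async function handleTranscodingInstruction(
  firstVideoAddress: string,
  browserTypes: string[],
  transcode: (source: string, targetType: string) => Promise<string>, // returns output address
): Promise<string> {
  const secondVideoType = pickSecondVideoType(browserTypes);
  const outputAddress = await transcode(firstVideoAddress, secondVideoType);

  const identifier = randomUUID();
  outputAddresses.set(identifier, outputAddress);
  return `${RESOURCE_SERVER}/videos/${identifier}`;
}

// Second server: resolve a loading request that carries the identifier.
function resolveOutputAddress(identifier: string): string | undefined {
  return outputAddresses.get(identifier);
}
```

Keeping the identifier-to-output-address mapping on the second server lets the resource server stay simple: it only parses the identifier out of the requested address and forwards a loading request through the first server.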
6. The method of claim 5, wherein after the first server receives the transcoding instruction sent by the terminal, the method further comprises:
the first server acquires, based on the address of the first video, a correspondence between playing time points and data positions of key frames in the first video, and sends the correspondence to the terminal;
when receiving a target data position sent by the terminal, the first server sends the target data position to the second server;
the second server acquires a second video, the second video being a portion of the first video located after the target data position, transcodes the second video from the first video type into the second video type, and sends the transcoded second video to the resource server;
and the resource server sends the transcoded second video to the terminal.
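For the server-side seek in claim 6, one plausible realization (an assumption, not something the claim specifies) is to read only the bytes of the first video after the target data position with an HTTP Range request before re-transcoding them:

```typescript
// Fetch the portion of the first video that starts at the target data position
// (the "second video") and hand it to a transcoder; `transcode` is a stand-in
// for the second server's real transcoding pipeline.
async function transcodeFromPosition(
  firstVideoAddress: string,
  targetDataPosition: number,
  secondVideoType: string,
  transcode: (data: ArrayBuffer, targetType: string) => Promise<ArrayBuffer>,
): Promise<ArrayBuffer> {
  const response = await fetch(firstVideoAddress, {
    headers: { Range: `bytes=${targetDataPosition}-` }, // requires a range-capable origin
  });
  if (!response.ok) {
    throw new Error(`range request failed with status ${response.status}`);
  }
  const secondVideo = await response.arrayBuffer();
  return transcode(secondVideo, secondVideoType);
}
```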
7. A video playing apparatus, the apparatus comprising:
a detection unit, configured to detect that a browser does not support a first video type of a first video, wherein the unsupported condition comprises at least one of decoding failure and decapsulation failure;
a sending unit, configured to send a transcoding instruction to a server, wherein the transcoding instruction carries an address of the first video and video types supported by the browser, and the server comprises a first server, a second server and a resource server, so that the first server determines a second video type with the highest priority among the video types supported by the browser and transmits the address and the second video type to the second server, so that the second server acquires the first video based on the address of the first video, transcodes the first video from the first video type into the second video type, allocates an address containing an identifier to the transcoded first video, and sends the address containing the identifier to the apparatus, wherein the address containing the identifier is an address of the resource server, and the identifier corresponds to an output address of the transcoded first video in the second server;
a receiving unit, configured to receive the address containing the identifier and send a request to the address containing the identifier, so that the resource server receives the request, resolves the identifier from the address containing the identifier, and sends a loading request carrying the identifier to the second server through the first server, so that the second server receives the loading request, accesses the output address corresponding to the identifier, acquires the transcoded first video, and sends the transcoded first video to the resource server, and the resource server sends the transcoded first video to the apparatus;
the receiving unit is further configured to receive and play the transcoded first video sent by the resource server.
8. The apparatus of claim 7, wherein the detection unit is configured to:
acquire the first video type of the first video;
and determine that the video types supported by the browser do not comprise the first video type.
9. The apparatus of claim 7, wherein the detection unit is configured to:
detect that the browser fails to play the first video of the first video type.
10. The apparatus of claim 7, wherein the sending unit is further configured to:
receive a correspondence, sent by the server, between playing time points and data positions of key frames in the first video;
the receiving unit is further configured to:
when a playing time adjusting instruction is received, acquire a target playing time point to be adjusted, and determine a target data position corresponding to the target playing time point according to the correspondence;
send the target data position to the server, so that the server acquires a second video, the second video being a portion of the first video located after the target data position, transcodes the second video from the first video type into the second video type, and sends the transcoded second video to the apparatus;
and receive and play the transcoded second video sent by the server.
11. An electronic device, comprising:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the video playing method of any one of claims 1 to 4.
12. A storage medium in which instructions, when executed by a processor of an electronic device, enable the electronic device to perform the video playing method of any one of claims 1 to 4.
CN202010089365.1A 2020-02-12 2020-02-12 Video playing method, device, system, electronic equipment and storage medium Active CN113259771B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010089365.1A CN113259771B (en) 2020-02-12 2020-02-12 Video playing method, device, system, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010089365.1A CN113259771B (en) 2020-02-12 2020-02-12 Video playing method, device, system, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113259771A CN113259771A (en) 2021-08-13
CN113259771B (en) 2022-08-26

Family

ID=77219756

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010089365.1A Active CN113259771B (en) 2020-02-12 2020-02-12 Video playing method, device, system, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113259771B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102882829A (en) * 2011-07-11 2013-01-16 腾讯科技(深圳)有限公司 Transcoding method and system
CN103036888A (en) * 2012-12-19 2013-04-10 南京视海网络科技有限公司 Self-adapting stream-media play method and self-adapting play unit
CN103379381A (en) * 2012-04-17 2013-10-30 中兴通讯股份有限公司 Video playing method based on WAP gateway, WAP gateway, and system
CN105100824A (en) * 2015-09-10 2015-11-25 东方网力科技股份有限公司 Video processing equipment, system and method
CN105657442A (en) * 2015-12-30 2016-06-08 北京奇艺世纪科技有限公司 Video file generation method and system
CN110430451A (en) * 2019-08-20 2019-11-08 北京豆萌信息技术有限公司 Video broadcasting method, player, server and system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190069006A1 (en) * 2017-08-29 2019-02-28 Western Digital Technologies, Inc. Seeking in live-transcoded videos

Also Published As

Publication number Publication date
CN113259771A (en) 2021-08-13

Similar Documents

Publication Publication Date Title
CN110674022B (en) Behavior data acquisition method and device and storage medium
CN108833963B (en) Method, computer device, readable storage medium and system for displaying interface picture
CN109246123B (en) Media stream acquisition method and device
CN109327608B (en) Song sharing method, terminal, server and system
CN110248236B (en) Video playing method, device, terminal and storage medium
CN109144346B (en) Song sharing method and device and storage medium
CN111327694B (en) File uploading method and device, storage medium and electronic equipment
CN111741366A (en) Audio playing method, device, terminal and storage medium
CN111880888B (en) Preview cover generation method and device, electronic equipment and storage medium
CN110196673B (en) Picture interaction method, device, terminal and storage medium
CN109982129B (en) Short video playing control method and device and storage medium
CN111092991B (en) Lyric display method and device and computer storage medium
CN111010588B (en) Live broadcast processing method and device, storage medium and equipment
CN111818358A (en) Audio file playing method and device, terminal and storage medium
CN111083554A (en) Method and device for displaying live gift
CN111294551B (en) Method, device and equipment for audio and video transmission and storage medium
CN111008083B (en) Page communication method and device, electronic equipment and storage medium
CN109714628B (en) Method, device, equipment, storage medium and system for playing audio and video
CN112616082A (en) Video preview method, device, terminal and storage medium
CN111241451A (en) Webpage processing method and device, computer equipment and storage medium
CN112992127A (en) Voice recognition method and device
CN111464829B (en) Method, device and equipment for switching media data and storage medium
CN112492331B (en) Live broadcast method, device, system and storage medium
CN113259771B (en) Video playing method, device, system, electronic equipment and storage medium
CN110996115B (en) Live video playing method, device, equipment, storage medium and program product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant