CN110909204A - Video publishing method and device and electronic equipment

Video publishing method and device and electronic equipment

Info

Publication number
CN110909204A
CN110909204A (application number CN201811088449.2A)
Authority
CN
China
Prior art keywords
video
target
frame
feature
target video
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811088449.2A
Other languages
Chinese (zh)
Inventor
邱健龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba China Co Ltd
Original Assignee
Ucweb Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ucweb Inc filed Critical Ucweb Inc
Priority to CN201811088449.2A
Publication of CN110909204A
Pending legal-status Critical Current

Landscapes

  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The invention discloses a video publishing method, a video publishing device and electronic equipment. The method comprises the following steps: acquiring video characteristic information of a target video to be published; generating a preview segment of the target video according to the video characteristic information; and displaying the preview segment when the target video is published. With the method and the device, the preview segment displayed during video publishing does not need to be extracted manually, which greatly reduces the labor cost and time cost of video publishing.

Description

Video publishing method and device and electronic equipment
Technical Field
The present invention relates to the field of video technologies, and in particular, to a video publishing method, a video publishing device, and an electronic device.
Background
At present, when an internet media platform or a social platform publishes videos uploaded by users, a static cover image of each published video is usually displayed in the video publishing interface, so that users who want to watch videos can learn the content of the corresponding video from the static cover image and select a video that meets their needs to play and watch.
However, a static cover image alone can hardly reflect the real content of a video, so it is difficult for viewers to quickly and accurately select a video that meets their own needs, which degrades the video watching experience.
Disclosure of Invention
It is an object of the present invention to provide a new technical solution for distributing video.
According to a first aspect of the present invention, there is provided a video distribution method, including:
acquiring video characteristic information of a target video to be published;
wherein the video feature information comprises at least a feature frame indication for indicating a feature frame of the target video;
generating a preview segment of the target video according to the video characteristic information;
and displaying the preview segment when the target video is released.
Optionally, the step of obtaining the video feature information of the target video includes:
providing a video feature configuration interface for receiving configuration operation on feature frames of the target video;
and acquiring a feature frame indication corresponding to the configuration operation according to the configuration operation received by the video feature configuration interface.
Optionally, the step of obtaining the video feature information of the target video includes:
acquiring a historical play record of the target video;
according to the historical playing record, obtaining the playing heat index of each video frame included in the target video;
wherein the playing popularity index is an index for representing the playing popularity of the video frame;
and selecting the video frame with the playing heat index meeting a preset heat condition as the characteristic frame, and determining the corresponding characteristic frame indication.
Optionally, the step of obtaining the video feature information of the target video includes:
acquiring user characteristics of a target user;
wherein the target user is a publishing object of the target video, and the user characteristics comprise at least video preference indications of the user;
respectively acquiring content characteristics of each video frame included in the target video, so as to acquire the characteristic correlation degree of each video frame according to the content characteristics and the user characteristics;
and selecting the video frame with the characteristic correlation degree meeting a preset correlation degree condition as the characteristic frame, and determining the corresponding characteristic frame indication.
Optionally, the step of obtaining the user characteristics of the target user includes:
acquiring video preference information of the target user according to the video playing record, the video searching record and the video purchasing record of the target user;
generating the video preference indication according to the video preference information of the target user;
and/or,
the video preference indication at least comprises one of a video type, actor information and a scene type;
the content characteristics at least comprise one of content types, actor information and scene types;
the relevancy condition is that the feature relevancy of the video frame is highest or higher than a preset relevancy threshold.
Optionally, the step of generating a preview segment of the target video according to the video feature information includes:
determining a target playing position of the characteristic frame in the target video according to the characteristic frame indication;
taking the target playing position as a central position, intercepting a video frame which accords with a preset video duration in the target video, and generating a corresponding preview segment;
and/or,
the preview segment is a video segment or an image file conforming to a preset image format.
Optionally, the step of displaying the preview segment when the target video is published includes:
providing a video publishing interface for publishing the target video;
and directly displaying the preview segment through the video publishing interface so as to publish the target video.
Optionally, the method further comprises:
and when the playing operation of the target video is received through the video publishing interface, ending the display of the preview segment and switching to play the target video.
According to a second aspect of the present invention, there is provided a video publishing apparatus, comprising:
the device comprises a characteristic acquisition unit, a processing unit and a processing unit, wherein the characteristic acquisition unit is used for acquiring video characteristic information of a target video to be issued;
wherein the video feature information comprises at least a feature frame indication for indicating a feature frame of the target video;
the preview generating unit is used for generating a preview segment of the target video according to the video characteristic information;
and the preview display unit is used for displaying the preview segment when the target video is issued.
According to a third aspect of the present invention, there is provided an electronic apparatus, comprising:
the display device is used for displaying a human-computer interaction interface;
a memory for storing executable instructions;
and a processor for running the electronic device under control of the executable instructions so as to execute the video publishing method provided by the first aspect of the invention.
According to one embodiment of the disclosure, video characteristic information of a target video to be published is acquired, a preview segment of the target video is generated according to the video characteristic information, and the preview segment is displayed when the target video is published. In this way, the preview segment displayed when publishing any video can be acquired automatically rather than being extracted manually, which greatly reduces the labor cost and time cost of video publishing. Moreover, because publishing is performed by displaying a preview segment that embodies the video characteristic information of the target video, a user can quickly and intuitively learn the real content of the published target video, quickly and accurately select a video that meets his or her own needs, and enjoy an improved user experience.
Other features of the present invention and advantages thereof will become apparent from the following detailed description of exemplary embodiments thereof, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description, serve to explain the principles of the invention.
Fig. 1 is a block diagram showing an example of a hardware configuration of an electronic apparatus 1000 that can be used to implement an embodiment of the present invention.
Fig. 2 shows a flow chart of a video distribution method of an embodiment of the present invention.
Fig. 3 shows a flowchart of the steps of acquiring video characteristic information of a target video according to an embodiment of the present invention.
Fig. 4 shows a block diagram of the video distribution apparatus 3000 of the embodiment of the present invention.
Fig. 5 shows a block diagram of an electronic device 4000 of an embodiment of the invention.
Detailed Description
Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, the numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present invention unless specifically stated otherwise.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the invention, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
< hardware configuration >
Fig. 1 is a block diagram showing a hardware configuration of an electronic apparatus 1000 that can implement an embodiment of the present invention.
The electronic device 1000 may be a laptop, desktop, cell phone, tablet, etc. As shown in fig. 1, the electronic device 1000 may include a processor 1100, a memory 1200, an interface device 1300, a communication device 1400, a display device 1500, an input device 1600, a speaker 1700, a microphone 1800, and the like. The processor 1100 may be a central processing unit CPU, a microprocessor MCU, or the like. The memory 1200 includes, for example, a ROM (read only memory), a RAM (random access memory), a nonvolatile memory such as a hard disk, and the like. The interface device 1300 includes, for example, a USB interface, a headphone interface, and the like. The communication device 1400 is capable of wired or wireless communication, for example, and may specifically include Wifi communication, bluetooth communication, 2G/3G/4G/5G communication, and the like. The display device 1500 is, for example, a liquid crystal display panel, a touch panel, or the like. The input device 1600 may include, for example, a touch screen, a keyboard, a somatosensory input, and the like. A user can input/output voice information through the speaker 1700 and the microphone 1800.
The electronic device shown in fig. 1 is merely illustrative and is in no way meant to limit the invention, its application, or uses. In an embodiment of the present invention, the memory 1200 of the electronic device 1000 is configured to store instructions for controlling the processor 1100 to execute any one of the video publishing methods provided by the embodiment of the present invention. It will be appreciated by those skilled in the art that although a plurality of devices are shown for the electronic device 1000 in fig. 1, the present invention may involve only some of them, e.g. only the processor 1100 and the memory 1200. The skilled person can design the instructions according to the disclosed solution. How the instructions control the operation of the processor is well known in the art and will not be described in detail herein.
< example >
The general concept of the embodiment of the invention is to provide a new technical scheme for publishing videos: video characteristic information of a target video to be published is acquired, a preview segment of the target video is generated according to the video characteristic information, and the preview segment is displayed when the target video is published. In this way, the preview segment displayed when publishing any video can be acquired automatically without manual extraction, which greatly reduces the labor cost and time cost of video publishing; and because publishing is performed by displaying a preview segment that reflects the video characteristic information of the target video, a user can quickly and intuitively learn the real content of the published target video, quickly and accurately select a video that meets his or her own needs, and enjoy an improved user experience.
< method >
In the present embodiment, a video distribution method is provided, as shown in fig. 2, including steps S2100-S2300.
Step S2100, obtaining video characteristic information of a target video to be released.
The video characteristic information is information representing characteristics of the video content; its specific content can be set according to the actual application scenario or application requirements.
In this embodiment, the video feature information includes at least a feature frame indication indicating a feature frame of the target video. The characteristic frame indication may be a unique frame identifier of the characteristic frame, or a position indication of the characteristic frame in the target video, which may be a playing time of the characteristic frame, or the like. The video feature information may also include video type, video playing heat, and the like.
In one example, the step of acquiring video feature information of a target video to be published may include: steps S2111-S2112.
Step S2111, providing a video feature configuration interface for receiving a configuration operation on a feature frame of the target video.
The video feature configuration interface is a human-computer interaction interface for providing corresponding video feature configuration services, and can be used for receiving configuration operations of feature frames of a target video, which are implemented by a user.
In this embodiment, the configuration operation may be an operation performed by the user, such as clicking, checking or inputting. For example, the configuration operation may be a selection operation of clicking or checking a candidate feature frame displayed on the video feature configuration interface, where a candidate feature frame is a video frame extracted from the target video that meets a preset candidate condition; the candidate condition can be set according to the actual application scenario or application requirements, for example, that the unique frame identifier of the video frame belongs to a recommended frame set provided by the video provider. Alternatively, the configuration operation may be an input operation in which the user directly enters the unique frame identifier of a feature frame specified by the video provider or the video producer.
Step S2112, according to the configuration operation received by the video feature configuration interface, obtaining the feature frame indication corresponding to the configuration operation.
After receiving the configuration operation implemented by the user, the video feature configuration interface can determine the video feature frame of the target video configured by the user and acquire the feature frame indication corresponding to the configuration operation.
In this example, by providing a video feature configuration interface, the corresponding feature frame indication can be obtained according to the received configuration operation, and, in combination with the subsequent steps, a corresponding preview segment embodying the features of the target video is generated automatically, which meets personalized video feature extraction requirements.
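For purposes of illustration only, the following Python sketch shows one possible way of resolving a received configuration operation into a feature frame indication; the data structure and function names used here (CandidateFrame, resolve_feature_frame) are hypothetical assumptions and are not part of the disclosed embodiment.

# Illustrative sketch only: resolving a configuration operation received by the
# video feature configuration interface into a feature frame indication.
# All names are hypothetical and not part of the disclosed implementation.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class CandidateFrame:
    frame_id: str        # unique frame identifier within the target video
    play_time_s: float   # playing time of the frame, in seconds

def resolve_feature_frame(candidates: List[CandidateFrame],
                          clicked_frame_id: Optional[str] = None,
                          typed_frame_id: Optional[str] = None) -> Optional[str]:
    """Return the feature frame indication (here the unique frame identifier)
    chosen either by clicking a displayed candidate or by direct input."""
    if typed_frame_id:                       # direct input of a frame identifier
        return typed_frame_id
    for frame in candidates:                 # selection of a displayed candidate
        if frame.frame_id == clicked_frame_id:
            return frame.frame_id
    return None

# Example with made-up values:
candidates = [CandidateFrame("f_0120", 4.8), CandidateFrame("f_0860", 34.4)]
print(resolve_feature_frame(candidates, clicked_frame_id="f_0860"))  # f_0860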
In another example, the step of obtaining video characteristic information of the target video may include: steps S2121-S2123.
Step S2121, obtaining a history playing record of the target video.
In this example, a historical play record of the target video may be obtained through the platform that publishes and plays the target video. The historical play record may include the total number of times the video has been played and a video play record of each playback; each video play record includes at least the play duration, and may further include the start time of playback, the positions at which playback was paused or fast-forwarded, and the like. A video play record may be collected each time the target video is played, and the historical play record is updated accordingly.
Step S2122, acquiring the playing heat index of each video frame included in the target video according to the historical playing record.
The play heat index is an index representing the play heat of a video frame; its specific form may be set according to the specific application scenario or application requirements. For example, the play heat index may be the play count of the video frame, or a heat value calculated from the play count of the video frame and a weighting factor set according to the content type of the frame.
Step S2123, selecting the video frames whose play heat indexes meet the preset heat condition as characteristic frames, and determining the corresponding characteristic frame indications.
In this example, the heat condition is a condition for judging, from the play heat index of a video frame, whether the frame qualifies as a feature frame, and it may be set according to the specific application scenario or application requirements. For example, the heat condition may be that the play heat index of the video frame is greater than a preset heat threshold, or that the play heat index is the maximum among the frames, where the heat threshold may likewise be set according to the specific application scenario or application requirements.
In this example, the play heat index of each video frame of the target video is obtained from the historical play record of the target video, and the video frames whose play heat indexes meet the heat condition are extracted as feature frames. In combination with the subsequent steps, a preview segment with higher play heat is generated automatically, so the preview segment embodies the characteristics of the target video more accurately, a user can learn the content of the target video more intuitively through the preview segment, and the user experience is improved. In addition, the feature frames can be re-extracted in real time as the play heat indexes of the video frames of the target video change, so that, in combination with the subsequent steps, the preview segment shown each time the target video is published reflects the characteristics of the target video more accurately, which improves the video publishing efficiency.
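As a non-limiting illustration of the heat-based selection described above, the following Python sketch derives a per-frame play heat index from assumed (start, end) viewing intervals and selects the frames that satisfy the heat condition; the record format, frame rate and function names are assumptions made only for this example.

# Illustrative sketch only: computing a per-frame play heat index (here the play
# count) from historical play records and selecting feature frames.
def play_heat_per_frame(play_records, total_frames, fps=25):
    """Each record is assumed to be a (start_s, end_s) interval actually watched."""
    heat = [0] * total_frames
    for start_s, end_s in play_records:
        first = int(start_s * fps)
        last = min(int(end_s * fps), total_frames - 1)
        for idx in range(first, last + 1):
            heat[idx] += 1
    return heat

def select_feature_frames(heat, heat_threshold=None):
    """Heat condition: index above a preset threshold, or the maximum index when
    no threshold is given (both variants are mentioned in the description)."""
    if heat_threshold is None:
        peak = max(heat)
        return [i for i, h in enumerate(heat) if h == peak]
    return [i for i, h in enumerate(heat) if h > heat_threshold]

records = [(0.0, 6.0), (2.0, 10.0), (3.0, 5.0)]   # made-up viewing intervals
heat = play_heat_per_frame(records, total_frames=300)
print(select_feature_frames(heat)[:5])             # frame indexes with peak heat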
In another example, the step of obtaining video feature information of the target video, as shown in fig. 3, includes: steps S2131-S2133.
Step S2131, acquiring user characteristics of the target user.
In this example, the target user is a publishing object of the target video, for example, the target video is published through a client of a certain browser, and the target user is a user who registers the client or a user who uses the client. In this example, the target user may be a single user or a group of users.
The user characteristics are characteristic parameters reflecting how the target user watches videos. In this example, the user characteristics include at least a video preference indication of the user. The video preference indication is information indicating the video preference of the user; for example, it includes at least one of a video type, actor information and a scene type. The video type is the type of the video and can be divided according to a preset classification rule; for example, videos may be classified by content, such as people, period, history, variety shows and geography, or by viewing effect, such as dramatic, funny, sad and relaxing, which are not exhaustively listed here. The actor information is information about the actors appearing in the video and may include actor type, actor name, and the like. The scene type is the type of scene embodied by the video content, such as landscape or people.
In this example, the step of obtaining the user characteristics of the target user may include: steps S21311-S21312.
Step S21311, obtaining video preference information of the target user according to the video playing record, the video searching record, and the video purchasing record of the target user.
The video playing record, the video searching record and the video purchasing record of the user can be obtained through a platform used by the target user and used for releasing and playing the video. The video playing record comprises the type of the video played by the user, the time length of the video played, the frequency of the video played, the content parameter of the video played and the like. The video search record comprises search keywords of a user search video, a search result conversion rate (the ratio of the number of clicked items in the search result to the number of overall items in the search result), and the like. The video purchase record comprises the types of videos purchased by the user, the playing time length of each purchased video, the playing times, the content parameters of the videos and the like.
According to the video play record, the video search record and the video purchase record of the target user, common statistical clustering methods such as K-Means can be applied to cluster out the preferred videos that match the target user's preferences; by analyzing the video content of these preferred videos, the video preference information of the target user, such as the preferred video type, actor information and scene type, can be obtained.
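As one possible illustration of the statistical clustering just described, the following Python sketch clusters a user's watched videos, encoded as simple numeric feature vectors, with K-Means and takes the centre of the largest cluster as a rough preference profile; the feature encoding and all values are assumptions for this example only, and scikit-learn is used merely as one convenient K-Means implementation rather than as a requirement of the embodiment.

# Illustrative sketch only: deriving rough video preference information by
# clustering the feature vectors of videos the user has played, searched or
# purchased. The encoding below is hypothetical.
import numpy as np
from collections import Counter
from sklearn.cluster import KMeans

# Hypothetical encoding: [comedy, drama, documentary, landscape_scene, person_scene]
watched = np.array([
    [1, 0, 0, 0, 1],
    [1, 0, 0, 1, 0],
    [0, 1, 0, 0, 1],
    [1, 0, 0, 0, 1],
    [0, 0, 1, 1, 0],
], dtype=float)

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(watched)
dominant = Counter(kmeans.labels_).most_common(1)[0][0]
preference_vector = kmeans.cluster_centers_[dominant]
print(preference_vector)   # centre of the largest cluster ~ the user's preference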
Step S21312, generating the video preference indication according to the video preference information of the target user.
After the user characteristics are obtained, the flow proceeds to:
step S2132, respectively obtaining content characteristics of each video frame included in the target video, so as to obtain a characteristic correlation degree of each video frame according to the content characteristics and the user characteristics.
The content features are parameters reflecting the content characteristics of a video frame. In this example, the content features include at least one of a content type, actor information and a scene type. The content type is the type of video content displayed by the video frame; it may be divided according to the emotional tone embodied, for example funny, relaxed, sad or neutral, or according to the picture content, for example bright or gloomy. The actor information is information about the actors appearing in the video frame and may include actor type, actor name, and the like. The scene type may be, for example, landscape or people.
In this example, the content features of the video frames may be obtained by applying image recognition and analysis to the picture displayed by each video frame of the target video; alternatively, the provider of the target video may annotate the content features of the video frames and upload them through a provided interface, from which they are then obtained.
The feature correlation degree is a parameter representing the degree of correlation between the content features of a video frame and the user features. In this example, the vector distance between the feature vector formed by the content features of a video frame and the feature vector formed by the user features may be calculated, for example using the Manhattan distance or the Chebyshev distance, and used as the feature correlation degree.
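By way of a minimal sketch, assuming one-hot or weighted feature vectors for the frame content features and the user features, the feature correlation degree can be obtained by mapping a Manhattan or Chebyshev distance to a score, for example as follows; the function names and vectors are illustrative assumptions only.

# Illustrative sketch only: turning the vector distance between a frame's
# content-feature vector and the user-feature vector into a correlation score
# (smaller distance -> higher correlation). The vector encoding is assumed.
def manhattan(a, b):
    return sum(abs(x - y) for x, y in zip(a, b))

def chebyshev(a, b):
    return max(abs(x - y) for x, y in zip(a, b))

def feature_correlation(content_vec, user_vec, metric=manhattan):
    # Map a distance to (0, 1]: identical vectors give correlation 1.0
    return 1.0 / (1.0 + metric(content_vec, user_vec))

user_vec = [1.0, 0.0, 0.5]      # e.g. comedy, landscape scene, a preferred actor
frame_a  = [1.0, 0.0, 1.0]
frame_b  = [0.0, 1.0, 0.0]
print(feature_correlation(frame_a, user_vec))   # higher -> better match
print(feature_correlation(frame_b, user_vec))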
Step S2133, selecting a video frame with a feature correlation degree meeting a preset correlation degree condition as a feature frame, and determining a corresponding feature frame indicator.
The correlation degree condition is a condition for judging, from the feature correlation degree, whether a video frame qualifies as a feature frame; it can be set according to the actual application scenario or application requirements. For example, the correlation condition may be that the feature correlation degree of the video frame is the highest, or that it is higher than a preset correlation threshold.
In this example, the user characteristics of the target user, i.e. the publishing object of the target video, are acquired, and the video frames whose feature correlation degree between content characteristics and user characteristics meets the preset correlation degree condition are extracted from the target video as feature frames. In combination with the subsequent steps, a preview segment matching the video preferences of the publishing object can thus be generated automatically and displayed when the video is published, so that the target video is published in a personalized manner for the target user. This effectively attracts the target user, improves the conversion rate of the target video from publishing to playing, and improves the publishing efficiency of the target video.
After step S2100, the flow proceeds to:
step S2200 is that the preview segment of the target video is generated according to the video characteristic information.
In this embodiment, video frames in the target video, which can embody the characteristics of the target video, may be captured according to the video characteristic information, and a corresponding preview segment is generated.
For example, the target video feature information includes a feature frame indication of a feature frame of the target video, and the step of generating the preview segment of the target video according to the video feature information may include: steps S2210-S2220.
Step S2210, determining a target playing position of the characteristic frame in the target video according to the characteristic frame indication.
In this example, the feature frame indication may be based on a unique frame identifier of the feature frame or a position indication of the feature frame in the target video, where the position indication may be a playing time of the feature frame or the like. According to the feature frame indication, a target playing position of the feature frame in the target video can be determined, and the target playing position can be a playing sequence or a playing time of the feature frame in the target video.
Step S2220, with the target playing position as the center position, capturing the video frame which accords with the preset video duration from the target video, and generating the corresponding preview segment.
The preset video duration may be set according to the actual application scenario or application requirements, for example to 5 seconds. Assuming a preset video duration of 5 seconds, the video frames within 2.5 seconds before and after the target playing position in the target video are captured to generate the corresponding preview segment.
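A minimal sketch of this capture step, assuming the target playing position and video length are known in seconds and clamping the segment to the bounds of the video, might look as follows; the function name and the values are illustrative only.

# Illustrative sketch only: computing the boundaries of a preview segment centred
# on the feature frame's target playing position, clamped to the video length.
def preview_bounds(target_pos_s, video_duration_s, clip_duration_s=5.0):
    half = clip_duration_s / 2.0
    start = max(0.0, target_pos_s - half)
    end = min(video_duration_s, start + clip_duration_s)
    start = max(0.0, end - clip_duration_s)   # re-anchor if clipped at the end
    return start, end

print(preview_bounds(target_pos_s=3.0, video_duration_s=120.0))    # (0.5, 5.5)
print(preview_bounds(target_pos_s=119.0, video_duration_s=120.0))  # (115.0, 120.0)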
In this example, the electronic device implementing this embodiment may generate the preview segment locally and store it locally, so that the corresponding preview segment can be read and displayed when the target video is published. Alternatively, the feature frame indication may be sent to a background server connected to the electronic device, which is triggered to generate the preview segment according to the feature frame indication and then returns it to the electronic device for local storage, or stores it in a cloud storage connected to the electronic device, so that the corresponding preview segment can be read and displayed when the target video is published.
In this embodiment, the preview segment may be a video segment or an image file conforming to a preset image format. The preset image format can be a GIF format, and the dynamic image effect can be displayed through the image file in the GIF format. The preview segment is displayed by the video segment or the image file in the preset image format, so that the user can more intuitively and quickly know the content of the target video through the preview segment.
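As a non-authoritative sketch of producing an image-file preview in GIF format, the following example reads a few frames around the captured interval with OpenCV and encodes them with imageio; the file names, sampling step and the use of these particular libraries are assumptions for illustration only, not requirements of the embodiment.

# Illustrative sketch only: writing frames sampled from the captured interval out
# as a GIF preview. OpenCV reads the frames; imageio encodes the GIF.
import cv2
import imageio

def write_gif_preview(video_path, start_s, end_s, out_path="preview.gif", step_s=0.5):
    cap = cv2.VideoCapture(video_path)
    frames = []
    t = start_s
    while t < end_s:
        cap.set(cv2.CAP_PROP_POS_MSEC, t * 1000.0)
        ok, frame = cap.read()
        if not ok:
            break
        frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))  # BGR -> RGB for GIF
        t += step_s
    cap.release()
    if frames:
        imageio.mimsave(out_path, frames)
    return out_path

# write_gif_preview("target_video.mp4", 115.0, 120.0)   # hypothetical input file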
And step S2300, displaying the preview segment when the target video is released.
When the target video is released, the preview segment which embodies the characteristics of the target video is directly displayed, so that a user can quickly and intuitively know the content of the target video, and the video which accords with the user can be more conveniently selected to be played and watched.
In this embodiment, step S2300 may include: steps S2310-S2320.
Step S2310, a video publishing interface is provided for publishing the target video.
The video publishing interface is a human-computer interaction interface providing video publishing services, and it can be provided by an information publishing platform that offers services such as video publishing and playing. For example, the video publishing interface may be an information publishing page provided by a browser: when the user clicks to refresh the information publishing page and opens the official account of a certain short-video channel, the browser interface presents the newly published videos of that official account.
Step S2320, directly displaying the preview segment through the video publishing interface to publish the target video.
For example, when the video publishing interface is an information publishing page and the user clicks to open the official account of a certain short-video channel, once the account page is refreshed and opened, the preview segments of the videos published on that account are played automatically on the page without any further user operation.
It should be understood that the manner in which the preview segment is presented may vary depending on the particular form of the preview segment. For example, when the preview segment is a video segment, the preview segment may be displayed by calling a video player to play the video segment, and when the preview segment is an image file, the preview segment may be displayed by calling a corresponding image file player.
In this example, the video distribution method provided in this embodiment further includes:
and when the playing operation of the target video is received through the video publishing interface, ending the display of the preview segment and switching to play the target video.
The playing operation may be performed by the user on the video publishing interface by clicking, sliding, and the like. When the playing operation is received, the display of the preview segment ends and playback switches directly to the target video, providing the user with a "select and play immediately" video watching experience.
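The switch from preview to full playback can be pictured with a minimal, framework-free sketch such as the following; the class and method names are hypothetical and only indicate the state change triggered by the playing operation.

# Illustrative sketch only: minimal state handling for a publishing interface
# entry that shows the preview automatically and switches to the full target
# video when a playing operation is received.
class PublishEntry:
    def __init__(self, preview_path, video_path):
        self.preview_path = preview_path
        self.video_path = video_path
        self.state = "showing_preview"   # the preview plays as soon as the entry is shown

    def current_media(self):
        return self.video_path if self.state == "playing_video" else self.preview_path

    def on_play_operation(self):
        """End the preview display and switch to playing the full target video."""
        self.state = "playing_video"
        return self.current_media()

entry = PublishEntry("preview.gif", "target_video.mp4")
print(entry.current_media())       # preview.gif (shown automatically)
print(entry.on_play_operation())   # target_video.mp4 (after the playing operation)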
< video distribution apparatus >
In this embodiment, a video distribution apparatus 3000 is also provided, as shown in fig. 4, including a feature obtaining unit 3100, a preview generating unit 3200 and a preview presenting unit 3300, which are configured to implement the video publishing method provided in this embodiment and are not described again here.
The video distribution apparatus 3000 includes:
a feature obtaining unit 3100, configured to obtain video feature information of a target video to be published;
wherein the video feature information comprises at least a feature frame indication for indicating a feature frame of the target video;
a preview generating unit 3200, configured to generate a preview segment of the target video according to the video feature information;
and the preview display unit 3300 is configured to display the preview segment when the target video is published.
In one example, the feature acquisition unit 3100 includes:
means for providing a video feature configuration interface for receiving a configuration operation on a feature frame of the target video;
and the device is used for acquiring a feature frame instruction corresponding to the configuration operation according to the configuration operation received by the video feature configuration interface.
In another example, the feature acquisition unit 3100 includes:
means for obtaining a historical play record of the target video;
the device is used for acquiring the playing heat index of each video frame included in the target video according to the historical playing record;
wherein the playing popularity index is an index for representing the playing popularity of the video frame;
and the device is used for selecting the video frame with the playing heat index meeting the preset heat condition as the characteristic frame and determining the corresponding characteristic frame indication.
In another example, the feature acquisition unit 3100 includes:
means for obtaining a user characteristic of a target user;
wherein the target user is a publishing object of the target video, and the user characteristics include at least a video preference indication of the user;
the system comprises a video acquisition module, a video display module and a video display module, wherein the video acquisition module is used for respectively acquiring content characteristics of each video frame included by the target video so as to acquire characteristic correlation of each video frame according to the content characteristics and the user characteristics;
and the device is used for selecting the video frame with the characteristic correlation degree meeting a preset correlation degree condition as the characteristic frame and determining the corresponding characteristic frame indication.
In this example, the means for obtaining the user characteristics of the target user comprises:
means for obtaining video preference information of the target user based on the video play record, the video search record, and the video purchase record of the target user;
means for generating the video preference indication in accordance with video preference information of the target user;
and/or,
the video preference indication at least comprises one of a video type, actor information and a scene type;
the content characteristics at least comprise one of content types, actor information and scene types;
the relevancy condition is that the feature relevancy of the video frame is highest or higher than a preset relevancy threshold.
Optionally, the preview generating unit 3200 includes:
means for determining a target playback position of the feature frame in the target video based on the feature frame indication;
means for capturing a video frame corresponding to a preset video duration from the target video with the target playing position as a center position, and generating a corresponding preview segment;
and/or,
the preview segment is a video segment or an image file conforming to a preset image format.
Optionally, the preview presentation unit 3300 includes:
means for providing a video publishing interface for publishing the target video;
and the device is used for directly displaying the preview segment through the video publishing interface so as to publish the target video.
Optionally, the video distribution apparatus 3000 further includes:
and the device is used for ending the display of the preview segment and switching to play the target video when the playing operation of the target video is received through the video publishing interface.
It will be appreciated by those skilled in the art that the video distribution apparatus 3000 can be implemented in various ways. For example, the video distribution apparatus 3000 may be implemented by configuring a processor with instructions, e.g. by storing the instructions in a ROM and reading them from the ROM into a programmable device when the device starts up. For example, the video distribution apparatus 3000 may be solidified into a dedicated device (e.g., an ASIC). The video distribution apparatus 3000 may be divided into several mutually independent units, or these units may be combined and implemented together. The video distribution apparatus 3000 may be implemented by one of the above implementations, or by a combination of two or more of them.
In the present embodiment, the video distribution apparatus 3000 may be any software application that realizes the video distribution method in the present embodiment, a plug-in or a patch that can be embedded or connected to the software application, a functional module in an operating system, or the like. The video distribution apparatus 3000 may be any electronic device in which any software application or functional module that realizes the video distribution method in the present embodiment is installed.
< electronic apparatus >
In this embodiment, an electronic device 4000 is further provided, as shown in fig. 5, including:
a display device 4100 for displaying a human-computer interaction interface;
a memory 4200 for storing executable instructions;
the processor 4300 is configured to operate the electronic device according to the executable instruction to perform any one of the video distribution methods provided in this embodiment.
In this embodiment, the electronic device 4000 may be any electronic device that can support implementation of the video distribution method according to this embodiment, such as a mobile phone, a handheld computer, a tablet computer, a desktop computer, or a notebook computer. For example, the electronic apparatus 4000 is a mobile phone in which an application that implements the video distribution method of the present embodiment is installed.
In this embodiment, the electronic apparatus 4000 may further include other devices, for example, the electronic apparatus 1000 shown in fig. 1.
The embodiments of the present invention have been described above with reference to the accompanying drawings. According to these embodiments, a video publishing method, an apparatus and an electronic device are provided, in which video characteristic information of a target video to be published is acquired and a preview segment of the target video is generated according to the video characteristic information, so that the preview segment is displayed when the target video is published. For any video to be published, the preview segment displayed at publishing time can thus be acquired automatically without manual extraction, which greatly reduces the labor cost and time cost of video publishing; and since publishing is performed by displaying a preview segment representing the video characteristic information of the target video, a user can quickly and intuitively learn the real content of the published target video, quickly and accurately select a video that meets his or her own needs, and enjoy an improved user experience.
The present invention may be a system, method and/or computer program product. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied therewith for causing a processor to implement various aspects of the present invention.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present invention may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, aspects of the present invention are implemented by personalizing an electronic circuit, such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA), with state information of computer-readable program instructions, which can execute the computer-readable program instructions.
Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, by software, and by a combination of software and hardware are equivalent.
Having described embodiments of the present invention, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the invention is defined by the appended claims.

Claims (10)

1. A video distribution method, comprising:
acquiring video characteristic information of a target video to be published;
wherein the video feature information comprises at least a feature frame indication for indicating a feature frame of the target video;
generating a preview segment of the target video according to the video characteristic information;
and displaying the preview segment when the target video is released.
2. The method of claim 1, wherein the step of obtaining video feature information of the target video comprises:
providing a video feature configuration interface for receiving configuration operation on feature frames of the target video;
and acquiring a feature frame indication corresponding to the configuration operation according to the configuration operation received by the video feature configuration interface.
3. The method of claim 1, wherein the step of obtaining video feature information of the target video comprises:
acquiring a historical play record of the target video;
according to the historical playing record, obtaining the playing heat index of each video frame included in the target video;
wherein the playing popularity index is an index for representing the playing popularity of the video frame;
and selecting the video frame with the playing heat index meeting a preset heat condition as the characteristic frame, and determining the corresponding characteristic frame indication.
4. The method of claim 1, wherein the step of obtaining video feature information of the target video comprises:
acquiring user characteristics of a target user;
wherein the target user is a publishing object of the target video, and the user characteristics comprise at least video preference indications of the user;
respectively acquiring content characteristics of each video frame included in the target video, so as to acquire the characteristic correlation degree of each video frame according to the content characteristics and the user characteristics;
and selecting the video frame with the characteristic correlation degree meeting a preset correlation degree condition as the characteristic frame, and determining the corresponding characteristic frame indication.
5. The method of claim 4, wherein,
the step of obtaining the user characteristics of the target user comprises the following steps:
acquiring video preference information of the target user according to the video playing record, the video searching record and the video purchasing record of the target user;
generating the video preference indication according to the video preference information of the target user;
and/or,
the video preference indication at least comprises one of a video type, actor information and a scene type;
the content characteristics at least comprise one of content types, actor information and scene types;
the relevancy condition is that the feature relevancy of the video frame is highest or higher than a preset relevancy threshold.
6. The method of claim 1, wherein the step of generating the preview segment of the target video according to the video feature information comprises:
determining a target playing position of the characteristic frame in the target video according to the characteristic frame indication;
taking the target playing position as a central position, intercepting a video frame which accords with a preset video duration in the target video, and generating a corresponding preview segment;
and/or,
the preview segment is a video segment or an image file conforming to a preset image format.
7. The method of claim 1, wherein the step of displaying the preview segment when the target video is published comprises:
providing a video publishing interface for publishing the target video;
and directly displaying the preview segment through the video publishing interface so as to publish the target video.
8. The method of claim 7, further comprising:
and when the playing operation of the target video is received through the video publishing interface, ending the display of the preview segment and switching to play the target video.
9. A video publishing apparatus, comprising:
the device comprises a characteristic acquisition unit, a processing unit and a processing unit, wherein the characteristic acquisition unit is used for acquiring video characteristic information of a target video to be issued;
wherein the video feature information comprises at least a feature frame indication for indicating a feature frame of the target video;
the preview generating unit is used for generating a preview segment of the target video according to the video characteristic information;
and the preview display unit is used for displaying the preview segment when the target video is issued.
10. An electronic device, comprising:
the display device is used for displaying a human-computer interaction interface;
a memory for storing executable instructions;
a processor for operating the electronic device under control of the executable instructions to perform the video publishing method of any one of claims 1 to 8.
CN201811088449.2A 2018-09-18 2018-09-18 Video publishing method and device and electronic equipment Pending CN110909204A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811088449.2A CN110909204A (en) 2018-09-18 2018-09-18 Video publishing method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811088449.2A CN110909204A (en) 2018-09-18 2018-09-18 Video publishing method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN110909204A true CN110909204A (en) 2020-03-24

Family

ID=69813497

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811088449.2A Pending CN110909204A (en) 2018-09-18 2018-09-18 Video publishing method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN110909204A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100211636A1 (en) * 2006-09-29 2010-08-19 Michael Ross Starkenburg Management of profiles for interactive media guidance applications
CN104731944A (en) * 2015-03-31 2015-06-24 努比亚技术有限公司 Video searching method and device
CN105721620A (en) * 2016-05-09 2016-06-29 百度在线网络技术(北京)有限公司 Video information push method and device as well as video information display method and device
CN107197381A (en) * 2017-06-12 2017-09-22 深圳Tcl新技术有限公司 Temperature curve generation method, device and the readable storage medium storing program for executing of television video
CN107484019A (en) * 2017-08-03 2017-12-15 乐蜜有限公司 The dissemination method and device of a kind of video file

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114630146A (en) * 2022-05-12 2022-06-14 海看网络科技(山东)股份有限公司 Method for solving problem of progress bar preview jamming in IPTV
CN114630146B (en) * 2022-05-12 2023-01-17 海看网络科技(山东)股份有限公司 Method for solving problem of progress bar preview jamming in IPTV

Similar Documents

Publication Publication Date Title
JP6276344B2 (en) Method and system for extracting and providing highlight video of video content
CN108932253B (en) Multimedia search result display method and device
RU2614137C2 (en) Method and apparatus for obtaining information
US8913171B2 (en) Methods and systems for dynamically presenting enhanced content during a presentation of a media content instance
RU2640632C2 (en) Method and device for delivery of information
CN108153848B (en) Method and device for searching light application data and electronic device
US10440435B1 (en) Performing searches while viewing video content
CN103686344A (en) Enhanced video system and method
US10674183B2 (en) System and method for perspective switching during video access
WO2022247220A9 (en) Interface processing method and apparatus
CN108959320A (en) The method and apparatus of preview video search result
CN111935551A (en) Video processing method and device, electronic equipment and storage medium
CN114025188B (en) Live advertisement display method, system, device, terminal and readable storage medium
CN107515870B (en) Searching method and device and searching device
CN116821475B (en) Video recommendation method and device based on client data and computer equipment
US9152707B2 (en) System and method for creating and providing media objects in a navigable environment
CN112464031A (en) Interaction method, interaction device, electronic equipment and storage medium
CN115190366B (en) Information display method, device, electronic equipment and computer readable medium
US20230209125A1 (en) Method for displaying information and computer device
CN110020106B (en) Recommendation method, recommendation device and device for recommendation
CN113157972B (en) Recommendation method and device for video cover document, electronic equipment and storage medium
CN110753246A (en) Video playing method, client, server and system
CN110909204A (en) Video publishing method and device and electronic equipment
CN108255917B (en) Image management method and device and electronic device
CN117786159A (en) Text material acquisition method, apparatus, device, medium and program product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20200527

Address after: 310051 room 508, floor 5, building 4, No. 699, Wangshang Road, Changhe street, Binjiang District, Hangzhou City, Zhejiang Province

Applicant after: Alibaba (China) Co.,Ltd.

Address before: 100192 A706, 7 / F, block a, floor B-6, Dongsheng Science Park, Zhongguancun, No.66, xixiaokou Road, Haidian District, Beijing

Applicant before: UC MOBILE Co.,Ltd.

RJ01 Rejection of invention patent application after publication

Application publication date: 20200324
