CN110858913A - Multimedia content processing method and device

Multimedia content processing method and device

Info

Publication number
CN110858913A
CN110858913A CN201810966752.1A
Authority
CN
China
Prior art keywords
multimedia content
tone
category
determining
hue
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810966752.1A
Other languages
Chinese (zh)
Inventor
蒋庆明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba China Co Ltd
Original Assignee
Beijing Youku Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Youku Technology Co Ltd filed Critical Beijing Youku Technology Co Ltd
Priority to CN201810966752.1A priority Critical patent/CN110858913A/en
Publication of CN110858913A publication Critical patent/CN110858913A/en
Pending legal-status Critical Current

Classifications

    • H ELECTRICITY
      • H04 ELECTRIC COMMUNICATION TECHNIQUE
        • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
            • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
              • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
                • H04N21/232 Content retrieval operation locally within server, e.g. reading video streams from disk arrays
                • H04N21/234 Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
                  • H04N21/23418 Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics
                • H04N21/239 Interfacing the upstream path of the transmission network, e.g. prioritizing client content requests
                  • H04N21/2393 Interfacing the upstream path of the transmission network, e.g. prioritizing client content requests involving handling client requests
            • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
              • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
                • H04N21/432 Content retrieval operation from a local storage medium, e.g. hard-disk
                  • H04N21/4325 Content retrieval operation from a local storage medium, e.g. hard-disk by playing back content from the storage medium
                • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
                  • H04N21/44008 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
                • H04N21/442 Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
                  • H04N21/44213 Monitoring of end-user related data
                    • H04N21/44218 Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
              • H04N21/45 Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
                • H04N21/4508 Management of client data or end-user data
                  • H04N21/4532 Management of client data or end-user data involving end-user characteristics, e.g. viewer profile, preferences
              • H04N21/47 End-user applications
                • H04N21/482 End-user interface for program selection
                  • H04N21/4826 End-user interface for program selection using recommendation lists, e.g. of programs or channels sorted out according to their score

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure relates to a method and an apparatus for processing multimedia content, including: receiving a multimedia content search request sent by a terminal, wherein the multimedia content search request includes a first tone category; determining multimedia content according to the first tone category, wherein a second tone category to which the multimedia content belongs matches the first tone category; and sending the multimedia content to the terminal. According to the method and the apparatus for processing multimedia content provided by the embodiments of the present disclosure, the manners of searching for multimedia content can be enriched.

Description

Multimedia content processing method and device
Technical Field
The present disclosure relates to the field of multimedia technologies, and in particular, to a method and an apparatus for processing multimedia content.
Background
With the development of multimedia technology, users can view multimedia content on terminals.
Because the amount of multimedia content is huge, a user may search for content of interest among the large amount of multimedia content through keywords; or the user may browse classifications by country, year, and multimedia content category and look for content to watch within a selected classification; or the terminal may make personalized recommendations for the user, and the user may look for content of interest among the content recommended by the terminal.
Disclosure of Invention
In view of this, the present disclosure provides a method and an apparatus for processing multimedia content, so as to enrich the manners of searching for multimedia content.
According to a first aspect of the present disclosure, there is provided a method for processing multimedia content, applied to a server, the method including:
receiving a multimedia content searching request sent by a terminal, wherein the multimedia content searching request comprises a first tone category;
determining multimedia content according to the first tone category, wherein the second tone category to which the multimedia content belongs is matched with the first tone category;
and sending the multimedia content to the terminal.
In one possible implementation, the first and second hue categories are divided according to hue parameters.
In one possible implementation, the method further includes:
obtaining a plurality of sample images from multimedia content;
determining a hue parameter of any of the sample images;
determining a tone parameter of the multimedia content according to the tone parameters of the plurality of sample images;
and determining a second tone category to which the multimedia content belongs according to the tone parameter of the multimedia content.
In one possible implementation, the obtaining a plurality of sample images from multimedia content includes:
a plurality of frame images are acquired from the multimedia content at specified intervals as the plurality of sample images.
In one possible implementation, the determining the color tone parameter of any sample image includes:
determining a plurality of sample pixel points from the sample image;
determining a hue parameter of any sample pixel point;
and determining the tone parameters of the sample image according to the tone parameters of the plurality of sample pixel points.
In a possible implementation manner, the determining, according to the tone parameter of the multimedia content, the second tone category to which the multimedia content belongs includes:
determining a tone parameter difference between the tone parameter of the multimedia content and a tone parameter corresponding to any second tone category;
and when the color tone parameter difference corresponding to the multimedia content and any second color tone category is within a threshold value range, determining that the multimedia content belongs to the second color tone category.
In one possible implementation, the hue parameters include lightness and/or saturation.
According to a second aspect of the present disclosure, there is provided a method for processing multimedia content, applied to a terminal, the method including:
in response to a selection operation for a first tone category, generating a multimedia content lookup request including the first tone category;
sending the multimedia content searching request to a server;
receiving multimedia content sent by the server, wherein the second tone category to which the multimedia content belongs is matched with the first tone category;
displaying a multimedia content list including the multimedia content.
In one possible implementation, in response to a selection operation for a first tone category, generating a multimedia content search request including the first tone category includes:
responding to a selection operation aiming at colors in a color option area, and determining a color corresponding to the selection operation;
determining a first hue category to which the color corresponding to the selection operation belongs;
and generating a multimedia content searching request according to the first tone category.
In one possible implementation, the color option area is a dial or a bar area with a gradient color.
In one possible implementation, the color option area includes emotion description information corresponding to a color.
In one possible implementation, the multimedia content includes a tone parameter difference between the tone parameter of the multimedia content and a tone parameter corresponding to the second tone category,
the displaying a multimedia content list including the multimedia content includes:
and displaying the multimedia contents in the multimedia content list according to the sequence of the tone parameter differences from small to large.
In one possible implementation, the first and second hue categories are divided according to hue parameters.
In one possible implementation, the hue parameters include lightness and/or saturation.
According to a third aspect of the present disclosure, there is provided an apparatus for processing multimedia content, applied to a server, the apparatus including:
the multimedia content searching system comprises a receiving module, a searching module and a searching module, wherein the receiving module is used for receiving a multimedia content searching request sent by a terminal, and the multimedia content searching request comprises a first tone category;
the first determining module is used for determining the multimedia content according to the first hue category, wherein the second hue category to which the multimedia content belongs is matched with the first hue category;
and the sending module is used for sending the multimedia content to the terminal.
In one possible implementation, the first and second hue categories are divided according to hue parameters.
In one possible implementation, the apparatus further includes:
an acquisition module for acquiring a plurality of sample images from multimedia content;
a second determining module for determining a tone parameter of any of the sample images;
a third determining module, configured to determine a color tone parameter of the multimedia content according to the color tone parameters of the plurality of sample images;
and the fourth determining module is used for determining the second tone category to which the multimedia content belongs according to the tone parameter of the multimedia content.
In one possible implementation manner, the obtaining module includes:
an obtaining sub-module for obtaining a plurality of frame images from the multimedia content as the plurality of sample images at specified intervals.
In one possible implementation manner, the second determining module includes:
a first determining submodule, configured to determine a plurality of sample pixel points from the sample image;
the second determining submodule is used for determining the tone parameter of any sample pixel point;
and the third determining submodule is used for determining the color tone parameter of the sample image according to the color tone parameters of the plurality of sample pixel points.
In one possible implementation manner, the fourth determining module includes:
a fourth determining submodule, configured to determine a color tone parameter difference between the color tone parameter of the multimedia content and a color tone parameter corresponding to any of the second color tone categories;
and the fifth determining submodule is used for determining that the multimedia content belongs to the second tone category when the tone parameter difference corresponding to the multimedia content and any second tone category is within a threshold value range.
In one possible implementation, the hue parameters include lightness and/or saturation.
According to a fourth aspect of the present disclosure, there is provided an apparatus for processing multimedia content, applied to a terminal, the apparatus including:
the generating module is used for responding to the selection operation aiming at the first tone category and generating a multimedia content searching request comprising the first tone category;
the sending module is used for sending the multimedia content searching request to a server;
the receiving module is used for receiving the multimedia content sent by the server, wherein the second tone category to which the multimedia content belongs is matched with the first tone category;
and the display module is used for displaying a multimedia content list comprising the multimedia content.
In one possible implementation manner, the generating module includes:
the first determining submodule is used for responding to a selection operation aiming at the color in the color option area and determining the color corresponding to the selection operation;
the second determining submodule is used for determining the first tone category to which the color corresponding to the selection operation belongs;
and the third determining submodule is used for generating a multimedia content searching request according to the first tone class.
In one possible implementation, the color option area is a dial or a bar area with a gradient color.
In one possible implementation, the color option area includes emotion description information corresponding to a color.
In a possible implementation manner, the multimedia content includes a tone parameter difference between a tone parameter of the multimedia content and a tone parameter corresponding to the second tone category, and the display module includes:
and the display sub-module is used for displaying the multimedia contents in the multimedia content list according to the sequence of the tone parameter differences from small to large.
In one possible implementation, the first and second hue categories are divided according to hue parameters.
In one possible implementation, the hue parameters include lightness and/or saturation.
According to another aspect of the present disclosure, there is provided a processing apparatus of multimedia content, including: a processor; a memory for storing processor-executable instructions; wherein the processor is configured to perform the above method.
According to another aspect of the present disclosure, there is provided a non-transitory computer readable storage medium having computer program instructions stored thereon, wherein the computer program instructions, when executed by a processor, implement the above-described method.
In this way, after receiving the multimedia content search request sent by the terminal, the server may determine the multimedia content according to the first color tone category included in the multimedia content search request, and send the multimedia content to the terminal. According to the method and the device for processing the multimedia content, provided by the embodiment of the disclosure, the search mode of the multimedia content can be enriched.
Other features and aspects of the present disclosure will become apparent from the following detailed description of exemplary embodiments, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate exemplary embodiments, features, and aspects of the disclosure and, together with the description, serve to explain the principles of the disclosure.
Fig. 1 illustrates a flow chart of a method of processing multimedia content according to an embodiment of the present disclosure;
fig. 2 illustrates a flow chart of a method of processing multimedia content according to an embodiment of the present disclosure;
fig. 3 illustrates a flow chart of a method of processing multimedia content according to an embodiment of the present disclosure;
FIG. 4 illustrates a flow chart of a method of processing multimedia content according to an embodiment of the present disclosure;
fig. 5 illustrates a flow chart of a method of processing multimedia content according to an embodiment of the present disclosure;
fig. 6 illustrates a flow chart of a method of processing multimedia content according to an embodiment of the present disclosure;
FIG. 7 is a schematic display interface diagram illustrating a method of processing multimedia content according to an embodiment of the present disclosure;
fig. 8 illustrates a schematic structural diagram of a processing apparatus of multimedia content according to an embodiment of the present disclosure;
fig. 9 is a schematic structural diagram of a multimedia content processing apparatus according to an embodiment of the present disclosure;
fig. 10 is a schematic structural diagram of a multimedia content processing apparatus according to an embodiment of the present disclosure;
fig. 11 illustrates a schematic structural diagram of a multimedia content processing apparatus according to an embodiment of the present disclosure;
FIG. 12 is a block diagram illustrating an apparatus 1200 for processing of multimedia content, according to an example embodiment;
fig. 13 is a block diagram illustrating an apparatus 1900 for processing multimedia content according to an example embodiment.
Detailed Description
Various exemplary embodiments, features and aspects of the present disclosure will be described in detail below with reference to the accompanying drawings. In the drawings, like reference numbers can indicate functionally identical or similar elements. While the various aspects of the embodiments are presented in drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
The word "exemplary" is used exclusively herein to mean "serving as an example, embodiment, or illustration. Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
Furthermore, in the following detailed description, numerous specific details are set forth in order to provide a better understanding of the present disclosure. It will be understood by those skilled in the art that the present disclosure may be practiced without some of these specific details. In some instances, methods, means, elements and circuits that are well known to those skilled in the art have not been described in detail so as not to obscure the present disclosure.
Fig. 1 shows a flowchart of a processing method of multimedia content according to an embodiment of the present disclosure, which may be applied to a server. As shown in fig. 1, the method may include:
step 101, receiving a multimedia content search request sent by a terminal, wherein the multimedia content search request comprises a first tone category.
The first hue category may be a hue category capable of mapping the emotion of the user, and different first hue categories may map different emotions of the user. For example: the first hue category may be a red category, and the red category may map a violent emotion of the user; or the first hue category may be a blue category, and the blue category may map a happy emotion of the user; or the first hue category may be an orange category, and the orange category may map an excited emotion of the user; or the first hue category may be a black category, and the black category may map a sad emotion of the user; or the first hue category may be a green category, and the green category may map a calm emotion of the user; and the like.
It should be noted that the mapping relationships between different first hue categories and user emotions above are only examples in the present disclosure and should not be understood as limiting the mapping relationship between the first hue category and the user emotion. In practice, the mapping relationship between the first hue category and the user emotion may be set according to the brightness of the first hue category, for example: the brighter the hue, the more positive the emotion it can map; correspondingly, the darker the hue, the more negative the emotion it can map. The present disclosure does not limit this.
In one possible implementation, the first color tone category is divided according to color tone parameters. The different first hue classes have different hue parameters or different ranges of hue parameters, which may in one possible implementation comprise lightness and/or saturation.
Illustratively, a color option area may be included in the terminal interface, and the color option area may be a dial or a bar-shaped area with gradually changing colors, where each color in the color option area may correspond to a first hue category. For example: if the user is happy at this moment, the user may select blue in the color option area by touching, clicking, or the like. The terminal determines that the first hue category corresponding to blue is the blue category, may generate a multimedia content search request including the blue category, and sends the multimedia content search request to the server.
And 102, determining multimedia content according to the first hue class, wherein the second hue class to which the multimedia content belongs is matched with the first hue class.
After receiving the multimedia content search request, the server may determine the multimedia content according to the first tone category. For example, the multimedia content may have a second tone category, which may be a picture tone category of the multimedia content capable of mapping an emotion of the viewer, with different second tone categories having different tone parameters. In a possible implementation manner, the second tone category is divided according to tone parameters (parameters such as lightness and/or saturation), and the server may determine the second tone category corresponding to the multimedia content according to the tone parameters (parameters such as lightness and/or saturation) corresponding to the multimedia content.
The server may obtain multimedia content whose second tone category matches the first tone category. For example, if the first tone category is a blue category, the server may obtain multimedia content whose second tone category is a blue category. The server may preset a matching rule between the second tone category and the first tone category, for example, an identical second tone category matches the first tone category, or a plurality of similar second tone categories match the same first tone category, and so on, which is not limited by this disclosure.
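As a minimal sketch of such a lookup (the index, the matching rule, and the content identifiers below are illustrative assumptions and not part of the disclosure), the server-side matching of step 102 could be organized as follows:

```python
# Hypothetical pre-computed index: second tone category -> identifiers of multimedia
# content already classified into that category (see steps 104 to 107 below).
TONE_INDEX = {
    "blue":  ["content_12", "content_48"],
    "red":   ["content_07"],
    "green": ["content_33"],
}

# Hypothetical matching rule between first and second tone categories: an identical
# category matches, or several similar second tone categories match one first category.
MATCH_RULE = {
    "blue": ["blue"],
    "red":  ["red", "orange"],
}


def find_content(first_tone_category):
    """Step 102: collect multimedia content whose second tone category matches the first one."""
    results = []
    for second_category in MATCH_RULE.get(first_tone_category, [first_tone_category]):
        results.extend(TONE_INDEX.get(second_category, []))
    return results
```

In this sketch the matching rule is simply a dictionary; the disclosure leaves the concrete rule open.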
And step 103, sending the multimedia content to the terminal.
The server may transmit the multimedia content determined according to the first tone class to the terminal.
In this way, after receiving the multimedia content search request sent by the terminal, the server may determine the multimedia content according to the first color tone category included in the multimedia content search request, and send the multimedia content to the terminal. According to the processing method of the multimedia content provided by the embodiment of the disclosure, the searching mode of the multimedia content can be enriched.
Fig. 2 illustrates a flowchart of a method of processing multimedia content according to an embodiment of the present disclosure.
In a possible implementation manner, referring to fig. 2, the method may further include the following steps:
step 104, obtaining a plurality of sample images from the multimedia content.
For example, the server may randomly acquire a preset number of frame images from the multimedia content as sample images. The preset number may be a preset value; for example, if the preset number is 100, the server may randomly acquire 100 frame images from the multimedia content and use these 100 frame images as sample images.
Alternatively, in one possible implementation, the server may obtain a plurality of frame images as a plurality of sample images from the multimedia content at specified intervals.
The specified interval may be a preset time value or a preset number of frames. For example, assuming that the specified interval is 5 seconds, the server may acquire one frame image from the multimedia content every 5 seconds and may take the acquired frame images as sample images. Assuming that the specified interval is 100 frames, the server may acquire one frame image from the multimedia content every 100 frames and may take the acquired frame images as sample images. The present disclosure does not limit the manner in which the interval is set.
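A minimal sketch of this sampling step, assuming the multimedia content is available to the server as a video file and that OpenCV is used only for decoding (the library choice and the interval value are assumptions; the disclosure does not name any):

```python
import cv2  # assumption: OpenCV is used here purely to decode frames


def sample_frames(video_path, interval_frames=100):
    """Acquire one frame image every `interval_frames` frames as a sample image."""
    capture = cv2.VideoCapture(video_path)
    samples = []
    index = 0
    while True:
        ok, frame = capture.read()
        if not ok:  # end of the multimedia content
            break
        if index % interval_frames == 0:
            samples.append(frame)  # this frame image becomes one sample image
        index += 1
    capture.release()
    return samples
```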
Step 105, determining a tone parameter of any sample image.
The server may determine the hue parameter of the sample image according to the hue parameter (parameters such as brightness and/or saturation) of the pixel point in the sample image.
Fig. 3 illustrates a flowchart of a method of processing multimedia content according to an embodiment of the present disclosure.
In one possible implementation, referring to fig. 3, in the step 105, determining the color tone parameter of any sample image may include the following steps:
step 1051, determining a plurality of sample pixel points from the sample image.
For example, the server may randomly obtain a preset number of pixel points from the sample image as sample pixel points, or the server may uniformly divide the sample image into a preset number of regions, and randomly obtain one pixel point from each region as a sample pixel point, or obtain one pixel point from a designated position in each region as a sample pixel point.
Step 1052, determining a hue parameter of any sample pixel point;
and 1053, determining the color tone parameter of the sample image according to the color tone parameters of the plurality of sample pixel points.
The server can determine the tone parameter of any sample pixel point, and can determine the average value of the tone parameters of all the sample pixel points as the tone parameter of the sample image. For example: the tone parameters comprise lightness and saturation; after the server obtains the N sample pixel points, the mean of the lightness of the N sample pixel points can be determined as the lightness of the sample image, and the mean of the saturation of the N sample pixel points can be determined as the saturation of the sample image.
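The following sketch illustrates steps 1051 to 1053, assuming the sample image is given as a 2-D array of (R, G, B) pixels and that saturation and lightness are read from the HSV representation; the sample count and the use of the standard `colorsys` module are assumptions, not part of the disclosure:

```python
import colorsys
import random


def image_tone_parameters(rgb_pixels, width, height, num_samples=64):
    """Return (lightness, saturation) of one sample image, both in [0, 1]."""
    lightness_values, saturation_values = [], []
    for _ in range(num_samples):
        # Step 1051: determine a sample pixel point (random sampling is one of the options).
        x, y = random.randrange(width), random.randrange(height)
        r, g, b = rgb_pixels[y][x]
        # Step 1052: determine the tone parameters of this sample pixel point.
        _, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
        saturation_values.append(s)
        lightness_values.append(v)
    # Step 1053: the tone parameters of the sample image are the means over the sample pixels.
    n = len(saturation_values)
    return sum(lightness_values) / n, sum(saturation_values) / n
```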
And 106, determining the tone parameters of the multimedia content according to the tone parameters of the plurality of sample images.
For example, after the server determines the tone parameter of each sample image, the server may determine the average of the tone parameters of all the sample images as the tone parameter of the multimedia content.
For example: the server determines the lightness and saturation of the M sample images, then determines the average of the lightness of the M sample images as the lightness of the multimedia content, and determines the average of the saturation of the M sample images as the saturation of the multimedia content.
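Step 106 is then a plain average over the per-image values; a short sketch continuing the hypothetical helper above:

```python
def content_tone_parameters(per_image_parameters):
    """per_image_parameters: list of (lightness, saturation) pairs, one per sample image."""
    m = len(per_image_parameters)
    lightness = sum(p[0] for p in per_image_parameters) / m
    saturation = sum(p[1] for p in per_image_parameters) / m
    return lightness, saturation  # tone parameters of the whole multimedia content
```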
And step 107, determining a second tone category to which the multimedia content belongs according to the tone parameter of the multimedia content.
Fig. 4 illustrates a flowchart of a method of processing multimedia content according to an embodiment of the present disclosure.
In a possible implementation manner, referring to fig. 4, the step 107 of determining the second hue category to which the multimedia content belongs according to the hue parameter of the multimedia content may include the following steps:
step 1071, determining the difference between the tone parameter of said multimedia content and the tone parameter corresponding to any second tone category.
Different tone parameters may be set for different second tone categories. The server may determine the tone parameter difference between the tone parameter of the multimedia content and the tone parameter corresponding to any second tone category. For example: the server may determine a lightness difference between the lightness of the multimedia content and the lightness corresponding to any second tone category, and a saturation difference between the saturation of the multimedia content and the saturation corresponding to any second tone category.
Step 1072, determining that the multimedia content belongs to any second hue class when the difference between the hue parameters of the multimedia content and the second hue class is within a threshold range.
The threshold range is a preset tone parameter difference range, and when the tone parameter difference of the multimedia content is within the threshold range, the server may determine that the multimedia content belongs to the second tone category.
For example: assume that the tone parameters include lightness and saturation and that the threshold ranges corresponding to lightness and saturation are both 10 (in practice, the threshold ranges for lightness and saturation may also differ). Suppose the tone parameter differences between the multimedia content and the red category, which can map the user's violent emotion, are a lightness difference of 5 and a saturation difference of 11; the differences between the multimedia content and the green category, which can map the user's calm emotion, are a lightness difference of 16 and a saturation difference of 11; and the differences between the multimedia content and the blue category, which can map the user's happy emotion, are a lightness difference of 5 and a saturation difference of 6. The server may then determine that the multimedia content belongs to the blue category, which may be suitable for viewing by a user in a happy mood.
The determination may also be made according to a comparison of only one of the lightness difference and the saturation difference with its corresponding threshold, which is not limited by the present disclosure.
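A sketch of steps 1071 and 1072 following the example above (a shared threshold of 10); the reference tone parameter values per second tone category and the 0-100 scale are illustrative assumptions:

```python
# Hypothetical reference tone parameters per second tone category: (lightness, saturation),
# here on a 0-100 scale so that the threshold of 10 from the example above applies directly.
SECOND_TONE_CATEGORIES = {
    "red":   (70, 85),
    "green": (55, 60),
    "blue":  (60, 40),
}
THRESHOLD = 10


def second_categories_for(content_lightness, content_saturation):
    """Step 1072: keep the categories whose tone parameter differences fall within the threshold."""
    matches = []
    for category, (ref_lightness, ref_saturation) in SECOND_TONE_CATEGORIES.items():
        # Step 1071: tone parameter differences against this second tone category.
        lightness_diff = abs(content_lightness - ref_lightness)
        saturation_diff = abs(content_saturation - ref_saturation)
        if lightness_diff <= THRESHOLD and saturation_diff <= THRESHOLD:
            matches.append(category)
    return matches
```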
In this way, the server may determine the color tone parameter of the multimedia content according to the color tone parameter of the sample image obtained from the multimedia content, determine the second color tone category to which the multimedia content belongs according to the color tone parameter of the multimedia content, classify the multimedia content according to the color tone, and then search for the multimedia content having the matching second color tone category according to the first color tone category in the multimedia content search request after receiving the multimedia content search request sent by the terminal. According to the processing method of the multimedia content provided by the embodiment of the disclosure, the search mode of the multimedia content can be enriched.
Fig. 5 shows a flowchart of a processing method of multimedia content according to an embodiment of the present disclosure, which may be applied to a terminal, for example: mobile phones, tablet computers, vehicle-mounted terminals and the like. As shown in fig. 5, the method may include:
step 501, responding to a selection operation aiming at a first tone category, and generating a multimedia content searching request comprising the first tone category.
The first hue category may be a hue category capable of mapping the emotion of the user, and different first hue categories may map different emotions of the user.
For example, a color option area may be included in the terminal interface, wherein the color option area may include a plurality of colors, each color may map a mood, and each color may correspond to a first hue category. In one possible implementation, the color option area may be a dial or a bar-shaped area with a gradient color.
The user can select a color capable of mapping the current emotion of the user in the color option area by clicking, touching or the like. The terminal can respond to the selection operation of the user, determine the color selected by the user, and determine the first tone category corresponding to the color. For example: the user is happy at this time, and the user can select blue that can map a happy emotion by touch-clicking or the like in the color option area. The terminal determines that the first hue category corresponding to the blue color is the blue color category, and the terminal may generate a multimedia content search request including the blue color category.
Step 502, sending the multimedia content search request to a server.
After the terminal generates the multimedia content search request, the multimedia content search request may be sent to the server, so that the server may obtain the multimedia content according to the multimedia content search request (for example, if the first hue category is a blue category, the server may obtain the multimedia content of a blue category (a second hue category) capable of mapping a happy emotion), where the process of obtaining the multimedia content by the server according to the multimedia content search request may refer to the foregoing embodiment, and details of the embodiment of the present disclosure are not repeated herein.
Step 503, receiving the multimedia content sent by the server, wherein the second color tone category to which the multimedia content belongs is matched with the first color tone category.
And the terminal receives the multimedia content sent by the server, wherein the second tone category to which the multimedia content belongs is matched with the first tone category. For example, the first hue category is a blue category representing joyous emotion, the second hue category of the multimedia content is also a blue category representing joyous emotion, the user requests the multimedia content of joyous emotion category through the first hue category, and the server returns the multimedia content of joyous emotion category to the terminal according to the multimedia content searching request.
Step 504, displaying a multimedia content list including the multimedia content.
After receiving the multimedia content sent by the server, the terminal can display a multimedia content list comprising the multimedia content on a display interface for a user to browse, and further select interested multimedia content from the multimedia content list.
In this way, the terminal can generate a multimedia content search request including the first tone category in response to the selection operation of the first tone category and transmit the multimedia content search request to the server. And after receiving the multimedia content which is sent by the server and has the second tone category matched with the first tone category, displaying a multimedia content list comprising the multimedia content on the display interface. According to the multimedia content processing method provided by the embodiment of the disclosure, the terminal can respond to the first tone category selected by the user, and acquire the corresponding multimedia content according to the first tone category, so that the search mode of the multimedia content can be enriched.
Fig. 6 illustrates a flowchart of a method of processing multimedia content according to an embodiment of the present disclosure.
In a possible implementation manner, referring to fig. 6, the step 501, in response to the selection operation for the first hue category, of generating the multimedia content search request including the first hue category may include the following steps:
step 5011, responding to the selection operation aiming at the color in the color option area, and determining the color corresponding to the selection operation.
For example, the color selection area may include a plurality of colors, each of which may represent an emotion of the user. The user may select a color capable of representing the current emotion from the color selection area by clicking, touching, adjusting the direction of the pointer for selecting the color, sliding the position of the slider for selecting the color, and the like. The terminal can respond to the selection operation and determine the color corresponding to the selection operation.
In a possible implementation manner, the color option area may be a dial or a bar-shaped area with a gradient color. For example, the color option area may include colors of the same color family but with different lightness and/or saturation, where colors of the same color family with different lightness and/or saturation may characterize different intensities of the same emotion, for example: the lower the lightness and/or saturation, the lower the intensity of the characterized emotion.
In one possible implementation, the color option area may include emotion description information corresponding to a color.
For example, the region to which a color belongs may include emotion description information corresponding to that color, and the emotion description information may be used to describe the emotion of the user that the color can represent. For example: red may represent a violent emotion of the user, and the red area may include the emotion description information "violent"; blue may represent a happy emotion of the user, and the blue area may include the emotion description information "happy"; orange may represent an excited emotion of the user, and the orange area may include the emotion description information "excited"; black may represent a sad emotion of the user, and the black area may include the emotion description information "sad"; green may represent a calm emotion of the user, and the green area may include the emotion description information "calm"; and the like.
Therefore, the emotion of the user, which can be represented by different colors, can be prompted through the emotion description information, so that the user can accurately select the color capable of expressing the emotion of the user, and the multimedia content corresponding to the emotion of the user can be more accurately acquired.
Step 5012, determining a first hue category to which the color corresponding to the selected operation belongs.
Step 5013, generating a multimedia content search request according to the first color tone category.
After determining the first hue category to which the color corresponding to the selection operation belongs, the terminal may generate a multimedia content search request according to the first hue category, where the multimedia search request may include the first hue category.
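A sketch of steps 5011 to 5013 on the terminal side, assuming the colors on the dial are picked as RGB values and that the nearest representative color decides the first tone category; all color values, labels, and field names below are assumptions:

```python
# Hypothetical representative colors on the dial and the first tone category of each one.
DIAL_COLORS = {
    (220, 40, 40):  "red",     # emotion description: "violent"
    (60, 110, 220): "blue",    # emotion description: "happy"
    (240, 150, 40): "orange",  # emotion description: "excited"
    (20, 20, 20):   "black",   # emotion description: "sad"
    (60, 180, 90):  "green",   # emotion description: "calm"
}


def first_tone_category_for(selected_rgb):
    """Steps 5011-5012: the first tone category of the color picked in the color option area."""
    def squared_distance(color):
        return sum((a - b) ** 2 for a, b in zip(color, selected_rgb))
    nearest = min(DIAL_COLORS, key=squared_distance)
    return DIAL_COLORS[nearest]


def build_search_request(selected_rgb):
    """Step 5013: generate a multimedia content search request carrying the first tone category."""
    return {"type": "multimedia_content_search",
            "first_tone_category": first_tone_category_for(selected_rgb)}
```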
In this way, the terminal can determine the first hue category corresponding to the color, selected by the user, that can represent the user's emotion, and generate a multimedia content search request according to the first hue category, so as to obtain the corresponding multimedia content from the server through the multimedia content search request. The second tone category corresponding to the multimedia content matches the first tone category, and the first tone category corresponds to a color capable of representing the emotion of the user, so the obtained multimedia content has a correspondence with the emotion of the user.
In a possible implementation manner, the multimedia content includes a tone parameter difference between the tone parameter of the multimedia content and the tone parameter corresponding to the second tone category, and the displaying a multimedia content list including the multimedia content in step 504 may include:
and displaying the multimedia contents in the multimedia content list according to the sequence of the tone parameter differences from small to large.
When determining the second tone category to which the multimedia content belongs, the server may record the tone parameter difference between the tone parameter of the multimedia content and the tone parameter corresponding to the second tone category (for the process of determining the tone parameter difference, refer to the foregoing embodiment; details are not repeated here). After the server searches according to the multimedia content search request and obtains the multimedia content corresponding to the first tone category (the second tone category of the multimedia content matches the first tone category), the server obtains the tone parameter difference between the multimedia content and the second tone category and sends the multimedia content and the corresponding tone parameter difference to the terminal. After the terminal receives the multimedia contents and the corresponding tone parameter differences, it can display the multimedia contents in the multimedia list in order of the tone parameter differences from small to large.
Therefore, since a smaller tone parameter difference means that the tone of the multimedia content is more consistent with the second tone category and therefore better matched with the first tone category, placing content with a smaller tone parameter difference earlier in the multimedia list makes the display of the multimedia content more reasonable and can improve the user's search efficiency.
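A sketch of this ordering on the terminal, assuming each item returned by the server carries its tone parameter difference (the field names and sample data are assumptions):

```python
def order_for_display(contents):
    """Smaller tone parameter difference = closer match, so it is shown earlier in the list."""
    return sorted(contents, key=lambda item: item["tone_parameter_difference"])


# Usage with hypothetical data returned by the server:
contents = [
    {"title": "content A", "tone_parameter_difference": 11},
    {"title": "content B", "tone_parameter_difference": 3},
]
print([c["title"] for c in order_for_display(contents)])  # ['content B', 'content A']
```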
Fig. 7 illustrates a display interface diagram of a processing method of multimedia content according to an embodiment of the present disclosure.
The present disclosure is described below by way of specific examples in order to enable those skilled in the art to better understand the embodiments of the present disclosure.
Illustratively, when the terminal opens a client or a webpage for browsing and viewing multimedia content in response to a trigger operation of the user, the terminal shows prompt information on the display interface (for example, at a position such as the lower right corner of the current interface), such as a "little mood movie hall" entry. If the user wants to search for multimedia content based on mood, the user may trigger this prompt. In response to the user's trigger operation on the prompt information, the terminal displays a first page, where a first display area of the first page includes a color option area (in the current example, the color option area is a dial with multiple colors; in practice, each color may also be a gradient from light to dark), and a second display area of the first page is used for displaying a multimedia content list.
The user is currently happy and, through trigger operations such as adjusting the dial pointer, touching, or clicking, selects from the dial the blue color that can represent the user's current emotion. In response to the trigger operation, the terminal determines that blue is selected, determines that the first hue category corresponding to blue is the blue category, generates a multimedia content search request according to the blue category, and sends the multimedia content search request to the server.
After receiving the multimedia content searching request, the server searches for multimedia content 1 to multimedia content 6 with blue categories according to the multimedia content searching request (the color tone of the multimedia content corresponding to the blue categories is brighter), and sends the multimedia content 1 to the multimedia content 6 to the terminal.
After the terminal receives the multimedia contents 1 to 6 in the blue category, a multimedia content list including the multimedia contents 1 to 6 is displayed in the second display area, as shown in fig. 7.
In fact, the above-mentioned dial may also be a nested dial. For example: the terminal displays a dial with two layers, where a first layer located inside the dial may be an emotion option area, and a second layer located outside may be a weather option area (including weather such as wind, frost, rain, snow, and fog). After selecting the corresponding first tone category in the emotion option area, the user may select a first weather from the weather options, and the terminal may generate a multimedia content search request according to the first tone category and the first weather. The server may obtain the multimedia content corresponding to the first tone category and the first weather according to the multimedia content search request, and send the obtained multimedia content to the terminal.
Fig. 8 is a schematic structural diagram illustrating an apparatus for processing multimedia content, which may be applied to a server, according to an embodiment of the present disclosure, and as shown in fig. 8, the apparatus may include:
a receiving module 801, configured to receive a multimedia content search request sent by a terminal, where the multimedia content search request includes a first hue category;
a first determining module 802, configured to determine multimedia content according to the first hue category, where a second hue category to which the multimedia content belongs matches the first hue category;
a sending module 803, configured to send the multimedia content to the terminal.
In this way, after receiving the multimedia content search request sent by the terminal, the server may determine the multimedia content according to the first color tone category included in the multimedia content search request, and send the multimedia content to the terminal. According to the processing device of the multimedia content provided by the embodiment of the disclosure, the searching mode of the multimedia content can be enriched.
In one possible implementation, the first and second hue categories are divided according to hue parameters.
Fig. 9 is a schematic structural diagram of a multimedia content processing apparatus according to an embodiment of the present disclosure.
In a possible implementation manner, referring to fig. 9, the apparatus may further include:
an obtaining module 804, configured to obtain a plurality of sample images from multimedia content;
a second determination module 805, operable to determine a tone parameter of any of the sample images;
a third determining module 806, configured to determine a color tone parameter of the multimedia content according to the color tone parameters of the plurality of sample images;
a fourth determining module 807 configured to determine a second tone category to which the multimedia content belongs according to the tone parameter of the multimedia content.
In a possible implementation manner, referring to fig. 9, the obtaining module 804 may include:
the obtaining sub-module 8041 may be configured to obtain a plurality of frame images from the multimedia content as the plurality of sample images at specified intervals.
In a possible implementation manner, referring to fig. 9, the second determining module 805 may include:
a first determining submodule 8051, configured to determine a plurality of sample pixel points from the sample image;
a second determining submodule 8052, configured to determine a hue parameter of any of the sample pixel points;
the third determining submodule 8053 may be configured to determine the color tone parameter of the sample image according to the color tone parameters of the plurality of sample pixel points.
In a possible implementation manner, referring to fig. 9, the fourth determining module 807 may include:
a fourth determining submodule 8071, configured to determine a difference between a hue parameter of the multimedia content and a hue parameter corresponding to any one of the second hue categories;
the fifth determining sub-module 8072 may be configured to determine that the multimedia content belongs to any second hue category when the difference between the hue parameters of the multimedia content and the second hue category is within a threshold range.
In one possible implementation, the hue parameters may include lightness and/or saturation.
Fig. 10 is a schematic structural diagram illustrating an apparatus for processing multimedia content according to an embodiment of the present disclosure, which may be applied to a terminal, as shown in fig. 10, the apparatus may include:
a generating module 1001, configured to generate a multimedia content search request including a first hue category in response to a selection operation for the first hue category;
a sending module 1002, configured to send the multimedia content search request to a server;
a receiving module 1003, configured to receive a multimedia content sent by the server, where a second color tone category to which the multimedia content belongs is matched with the first color tone category;
a display module 1004 may be configured to display a multimedia content list including the multimedia content.
In this way, the terminal can generate a multimedia content search request including the first tone category in response to the selection operation of the first tone category and transmit the multimedia content search request to the server. And after receiving the multimedia content which is sent by the server and has the second tone category matched with the first tone category, displaying a multimedia content list comprising the multimedia content on the display interface. According to the processing device for the multimedia content provided by the embodiment of the disclosure, the terminal can respond to the first tone category selected by the user, and acquire the corresponding multimedia content according to the first tone category, so that the search mode of the multimedia content can be enriched.
Fig. 11 is a schematic structural diagram of a multimedia content processing apparatus according to an embodiment of the present disclosure.
In one possible implementation manner, referring to fig. 11, the generating module 1001 may include:
a first determining sub-module 10011, configured to, in response to a selection operation for a color in the color option area, determine a color corresponding to the selection operation;
the second determining sub-module 10012 may be configured to determine a first hue category to which a color corresponding to the selected operation belongs;
the third determining sub-module 10013 may be configured to generate a multimedia content search request according to the first color tone class.
In one possible implementation, the color option area is a dial or a bar area with a gradient color.
In one possible implementation, the color option area includes emotion description information corresponding to a color.
In a possible implementation manner, the multimedia content includes a tone parameter difference between a tone parameter of the multimedia content and a tone parameter corresponding to the second tone category, and referring to fig. 11, the display module 1004 may include:
the display sub-module 10041 may be configured to display the multimedia contents in the multimedia content list according to an order of the hue parameter differences from small to large.
In one possible implementation, the first and second hue categories are divided according to hue parameters.
In one possible implementation, the hue parameters include lightness and/or saturation.
Fig. 12 is a block diagram illustrating an apparatus 1200 for processing multimedia content according to an example embodiment. For example, the apparatus 1200 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 12, the apparatus 1200 may include one or more of the following components: processing component 1202, memory 1204, power component 1206, multimedia component 1208, audio component 1210, input/output (I/O) interface 1212, sensor component 1214, and communications component 1216.
The processing component 1202 generally controls overall operation of the apparatus 1200, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 1202 may include one or more processors 1220 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 1202 can include one or more modules that facilitate interaction between the processing component 1202 and other components. For example, the processing component 1202 can include a multimedia module to facilitate interaction between the multimedia component 1208 and the processing component 1202.
The memory 1204 is configured to store various types of data to support operation at the apparatus 1200. Examples of such data include instructions for any application or method operating on the device 1200, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 1204 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power component 1206 provides power to the various components of the apparatus 1200. The power component 1206 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the apparatus 1200.
The multimedia components 1208 include a screen that provides an output interface between the device 1200 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 1208 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the apparatus 1200 is in an operation mode, such as a photographing mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
Audio component 1210 is configured to output and/or input audio signals. For example, audio component 1210 includes a Microphone (MIC) configured to receive external audio signals when apparatus 1200 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 1204 or transmitted via the communication component 1216. In some embodiments, audio assembly 1210 further includes a speaker for outputting audio signals.
The I/O interface 1212 provides an interface between the processing component 1202 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 1214 includes one or more sensors for providing various aspects of state assessment for the apparatus 1200. For example, the sensor assembly 1214 may detect an open/closed state of the apparatus 1200 and the relative positioning of components, such as the display and keypad of the apparatus 1200. The sensor assembly 1214 may also detect a change in the position of the apparatus 1200 or of a component of the apparatus 1200, the presence or absence of user contact with the apparatus 1200, the orientation or acceleration/deceleration of the apparatus 1200, and a change in the temperature of the apparatus 1200. The sensor assembly 1214 may include a proximity sensor configured to detect the presence of a nearby object in the absence of any physical contact. The sensor assembly 1214 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 1214 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communications component 1216 is configured to facilitate communications between the apparatus 1200 and other devices in a wired or wireless manner. The apparatus 1200 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 1216 receives the broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communications component 1216 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 1200 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer readable storage medium, such as the memory 1204, is also provided, including computer program instructions executable by the processor 1220 of the apparatus 1200 to perform the methods described above.
Fig. 13 is a block diagram illustrating an apparatus 1900 for processing multimedia content according to an example embodiment. For example, the apparatus 1900 may be provided as a server. Referring to FIG. 13, the device 1900 includes a processing component 1922, which further includes one or more processors, and memory resources, represented by a memory 1932, for storing instructions (e.g., application programs) executable by the processing component 1922. The application programs stored in the memory 1932 may include one or more modules, each of which corresponds to a set of instructions. Further, the processing component 1922 is configured to execute the instructions to perform the above-described methods.
The device 1900 may also include a power component 1926 configured to perform power management of the device 1900, a wired or wireless network interface 1950 configured to connect the device 1900 to a network, and an input/output (I/O) interface 1958. The device 1900 may operate based on an operating system stored in the memory 1932, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
In an exemplary embodiment, a non-transitory computer readable storage medium, such as the memory 1932, is also provided that includes computer program instructions executable by the processing component 1922 of the apparatus 1900 to perform the above-described methods.
The present disclosure may be systems, methods, and/or computer program products. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied thereon for causing a processor to implement various aspects of the present disclosure.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present disclosure may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk or C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, electronic circuitry, such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA), may execute the computer-readable program instructions by utilizing state information of the computer-readable program instructions to personalize the electronic circuitry, thereby implementing aspects of the present disclosure.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The foregoing description of the embodiments of the present disclosure is intended to be exemplary, not exhaustive, and is not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application, or the technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (30)

1. A method for processing multimedia content, applied to a server, the method comprising:
receiving a multimedia content search request sent by a terminal, wherein the multimedia content search request comprises a first tone category;
determining multimedia content according to the first tone category, wherein the second tone category to which the multimedia content belongs is matched with the first tone category;
and sending the multimedia content to the terminal.
2. The method of claim 1, wherein the first hue category and the second hue category are divided according to hue parameters.
3. The method according to claim 1 or 2, characterized in that the method further comprises:
obtaining a plurality of sample images from multimedia content;
determining a hue parameter of any of the sample images;
determining a tone parameter of the multimedia content according to the tone parameters of the plurality of sample images;
and determining a second tone category to which the multimedia content belongs according to the tone parameter of the multimedia content.
4. The method of claim 3, wherein obtaining the plurality of sample images from the multimedia content comprises:
a plurality of frame images are acquired from the multimedia content at specified intervals as the plurality of sample images.
5. The method of claim 3 or 4, wherein said determining a hue parameter of any of said sample images comprises:
determining a plurality of sample pixel points from the sample image;
determining a hue parameter of any sample pixel point;
and determining the tone parameters of the sample image according to the tone parameters of the plurality of sample pixel points.
6. The method according to any one of claims 3 to 5, wherein the determining the second hue category to which the multimedia content belongs according to the hue parameter of the multimedia content comprises:
determining a tone parameter difference between the tone parameter of the multimedia content and a tone parameter corresponding to any second tone category;
and when the tone parameter difference between the multimedia content and any second tone category is within a threshold range, determining that the multimedia content belongs to the second tone category.
7. The method according to any one of claims 2 to 6, wherein the hue parameters comprise lightness and/or saturation.
8. A method for processing multimedia content, applied to a terminal, the method comprising:
in response to a selection operation for a first tone category, generating a multimedia content search request including the first tone category;
sending the multimedia content search request to a server;
receiving multimedia content sent by the server, wherein the second tone category to which the multimedia content belongs is matched with the first tone category;
displaying a multimedia content list including the multimedia content.
9. The method of claim 8, wherein the generating a multimedia content search request including the first tone category in response to a selection operation for the first tone category comprises:
in response to a selection operation for a color in a color option area, determining the color corresponding to the selection operation;
determining a first hue category to which the color corresponding to the selection operation belongs;
and generating a multimedia content search request according to the first tone category.
10. The method of claim 9, wherein the color option area is a wheel or a bar having a gradient color.
11. The method according to claim 9 or 10, wherein the color option area includes emotion description information corresponding to a color.
12. The method according to any of claims 8 to 11, wherein the multimedia content comprises a tone parameter difference between the tone parameter of the multimedia content and the tone parameter corresponding to the second tone category,
the displaying a multimedia content list including the multimedia content includes:
and displaying the multimedia contents in the multimedia content list in ascending order of the tone parameter differences.
13. The method according to any of claims 8 to 12, wherein the first and second hue categories are divided according to hue parameters.
14. The method according to any one of claims 8 to 13, wherein the hue parameters comprise lightness and/or saturation.
15. An apparatus for processing multimedia content, applied to a server, the apparatus comprising:
the multimedia content searching system comprises a receiving module, a searching module and a searching module, wherein the receiving module is used for receiving a multimedia content searching request sent by a terminal, and the multimedia content searching request comprises a first tone category;
the first determining module is used for determining the multimedia content according to the first hue category, wherein the second hue category to which the multimedia content belongs is matched with the first hue category;
and the sending module is used for sending the multimedia content to the terminal.
16. The apparatus of claim 15, wherein the first hue category and the second hue category are divided according to hue parameters.
17. The apparatus of claim 15 or 16, further comprising:
an acquisition module for acquiring a plurality of sample images from multimedia content;
a second determining module for determining a tone parameter of any of the sample images;
a third determining module, configured to determine a color tone parameter of the multimedia content according to the color tone parameters of the plurality of sample images;
and the fourth determining module is used for determining the second tone category to which the multimedia content belongs according to the tone parameter of the multimedia content.
18. The apparatus of claim 17, wherein the obtaining module comprises:
an obtaining sub-module for obtaining a plurality of frame images from the multimedia content as the plurality of sample images at specified intervals.
19. The apparatus of claim 17 or 18, wherein the second determining module comprises:
a first determining submodule, configured to determine a plurality of sample pixel points from the sample image;
the second determining submodule is used for determining the tone parameter of any sample pixel point;
and the third determining submodule is used for determining the color tone parameter of the sample image according to the color tone parameters of the plurality of sample pixel points.
20. The apparatus of any of claims 17 to 19, wherein the fourth determining module comprises:
a fourth determining submodule, configured to determine a color tone parameter difference between the color tone parameter of the multimedia content and a color tone parameter corresponding to any of the second color tone categories;
and a fifth determining submodule, configured to determine that the multimedia content belongs to the second tone category when the tone parameter difference between the multimedia content and any second tone category is within a threshold range.
21. The apparatus according to any of claims 16 to 20, wherein the hue parameters comprise lightness and/or saturation.
22. An apparatus for processing multimedia contents, applied to a terminal, the apparatus comprising:
the generating module is used for responding to the selection operation aiming at the first tone category and generating a multimedia content searching request comprising the first tone category;
the sending module is used for sending the multimedia content searching request to a server;
the receiving module is used for receiving the multimedia content sent by the server, wherein the second tone category to which the multimedia content belongs is matched with the first tone category;
and the display module is used for displaying a multimedia content list comprising the multimedia content.
23. The apparatus of claim 22, wherein the generating module comprises:
the first determining submodule is used for responding to a selection operation aiming at the color in the color option area and determining the color corresponding to the selection operation;
the second determining submodule is used for determining the first tone category to which the color corresponding to the selected operation belongs;
and the third determining submodule is used for generating a multimedia content searching request according to the first tone class.
24. The apparatus of claim 23, wherein the color option area is a wheel or a bar with a gradient color.
25. The apparatus of claim 23 or 24, wherein the color option area includes emotion description information corresponding to a color.
26. The apparatus according to any one of claims 22 to 25, wherein the multimedia content comprises a difference between a tone parameter of the multimedia content and a tone parameter corresponding to the second tone category, and the display module comprises:
and a display sub-module, configured to display the multimedia contents in the multimedia content list in ascending order of the tone parameter differences.
27. The apparatus according to any of claims 22 to 26, wherein the first and second hue classes are divided according to hue parameters.
28. The apparatus according to any of claims 22 to 27, wherein the hue parameters comprise lightness and/or saturation.
29. An apparatus for processing multimedia content, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the method of any one of claims 1 to 14.
30. A non-transitory computer readable storage medium having computer program instructions stored thereon, wherein the computer program instructions, when executed by a processor, implement the method of any of claims 1 to 14.
CN201810966752.1A 2018-08-23 2018-08-23 Multimedia content processing method and device Pending CN110858913A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810966752.1A CN110858913A (en) 2018-08-23 2018-08-23 Multimedia content processing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810966752.1A CN110858913A (en) 2018-08-23 2018-08-23 Multimedia content processing method and device

Publications (1)

Publication Number Publication Date
CN110858913A true CN110858913A (en) 2020-03-03

Family

ID=69636046

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810966752.1A Pending CN110858913A (en) 2018-08-23 2018-08-23 Multimedia content processing method and device

Country Status (1)

Country Link
CN (1) CN110858913A (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101241595A (en) * 2007-02-06 2008-08-13 中国科学院计算技术研究所 Vision frequency affective communication extraction method
CN101271528A (en) * 2008-04-11 2008-09-24 北京中星微电子有限公司 Method and device for outputting image
US20090271740A1 (en) * 2008-04-25 2009-10-29 Ryan-Hutton Lisa M System and method for measuring user response
CN103577534A (en) * 2013-08-30 2014-02-12 百度在线网络技术(北京)有限公司 Searching method and search engine
CN105261374A (en) * 2015-09-23 2016-01-20 海信集团有限公司 Cross-media emotion correlation method and system
CN105830006A (en) * 2014-01-30 2016-08-03 华为技术有限公司 Emotion modification for image and video content
CN105843922A (en) * 2016-03-25 2016-08-10 乐视控股(北京)有限公司 Multimedia classification recommendation method, apparatus and system
CN106407287A (en) * 2016-08-29 2017-02-15 宇龙计算机通信科技(深圳)有限公司 Multimedia resource pushing method and system
WO2017027204A1 (en) * 2015-08-10 2017-02-16 Google Inc. Privacy aligned and personalized social media content sharing recommendations
CN106599204A (en) * 2016-12-15 2017-04-26 广州酷狗计算机科技有限公司 Method and device for recommending multimedia content
CN106874330A (en) * 2016-07-27 2017-06-20 阿里巴巴集团控股有限公司 A kind of resource supplying method and apparatus
CN109472207A (en) * 2018-10-11 2019-03-15 平安科技(深圳)有限公司 Emotion identification method, apparatus, equipment and storage medium


Similar Documents

Publication Publication Date Title
CN108932253B (en) Multimedia search result display method and device
CN107948708B (en) Bullet screen display method and device
CN107692997B (en) Heart rate detection method and device
CN107729522B (en) Multimedia resource fragment intercepting method and device
US9924226B2 (en) Method and device for processing identification of video file
US20200007944A1 (en) Method and apparatus for displaying interactive attributes during multimedia playback
CN112465843A (en) Image segmentation method and device, electronic equipment and storage medium
CN110858924B (en) Video background music generation method and device and storage medium
CN109947981B (en) Video sharing method and device
CN108924644B (en) Video clip extraction method and device
EP3147802B1 (en) Method and apparatus for processing information
CN108495168B (en) Bullet screen information display method and device
WO2018157630A1 (en) Method and device for recommending associated user
CN108320208B (en) Vehicle recommendation method and device
CN108900903B (en) Video processing method and device, electronic equipment and storage medium
CN106792255B (en) Video playing window frame body display method and device
CN111242303A (en) Network training method and device, and image processing method and device
CN108174269B (en) Visual audio playing method and device
CN107402767B (en) Method and device for displaying push message
CN109756783B (en) Poster generation method and device
CN106331328B (en) Information prompting method and device
CN105677352B (en) Method and device for setting application icon color
CN109151553B (en) Display control method and device, electronic equipment and storage medium
CN109992754B (en) Document processing method and device
CN113032627A (en) Video classification method and device, storage medium and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20200426

Address after: 310052 room 508, floor 5, building 4, No. 699, Wangshang Road, Changhe street, Binjiang District, Hangzhou City, Zhejiang Province

Applicant after: Alibaba (China) Co.,Ltd.

Address before: 100000 room 26, 9 Building 9, Wangjing east garden four, Chaoyang District, Beijing.

Applicant before: BEIJING YOUKU TECHNOLOGY Co.,Ltd.

RJ01 Rejection of invention patent application after publication

Application publication date: 20200303