CN112272317B - Playing parameter determination method and device, electronic equipment and storage medium

Info

Publication number
CN112272317B
Authority
CN
China
Prior art keywords
waveform data
parameter
waveform
playing
video
Prior art date
Legal status
Active
Application number
CN202011119233.5A
Other languages
Chinese (zh)
Other versions
CN112272317A (en)
Inventor
Yang Hao (杨昊)
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202011119233.5A
Publication of CN112272317A
Application granted
Publication of CN112272317B
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25 Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/258 Client or end-user data management, e.g. managing client capabilities, user preferences or demographics, processing of multiple end-users preferences to derive collaborative data
    • H04N21/25866 Management of end-user data
    • H04N21/25891 Management of end-user data being end-user preferences
    • H04N21/262 Content or additional data distribution scheduling, e.g. sending additional data at off-peak times, updating software modules, calculating the carousel transmission frequency, delaying a video stream transmission, generating play-lists
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442 Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/45 Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/462 Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities

Landscapes

  • Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Graphics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)

Abstract

The application discloses a playing parameter determining method and device, electronic equipment and a storage medium. The method comprises the following steps: acquiring first waveform data; the first waveform data represents electroencephalogram data collected in a video playing process; determining a first parameter corresponding to the first waveform data according to the transformation condition of the waveform category in the first waveform data; the first parameter represents a video playing requirement corresponding to electroencephalogram data; and determining the playing parameters based on the first parameters corresponding to the first waveform data.

Description

Playing parameter determination method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of video processing technologies, and in particular, to a method and an apparatus for determining a playing parameter, an electronic device, and a storage medium.
Background
In the related art, to ensure smooth video playback, playing parameters are generally determined based on the network environment and the video quality. As a result, the obtained playing parameters may not match the user's requirements, which reduces the efficiency with which network resources are used.
Disclosure of Invention
In view of this, embodiments of the present application provide a method, an apparatus, an electronic device, and a storage medium for determining a video playing parameter, so as to at least solve the problem in the related art that the playing parameter does not match the user's requirement and thus reduces the usage efficiency of network resources.
The technical solutions of the embodiments of the present application are implemented as follows:
the embodiment of the application provides a method for determining playing parameters, which comprises the following steps:
acquiring first waveform data; the first waveform data represents electroencephalogram data collected in a video playing process;
determining a first parameter corresponding to the first waveform data according to the transformation condition of the waveform category in the first waveform data; the first parameter represents a video playing requirement corresponding to the electroencephalogram data;
and determining the playing parameters based on the first parameters corresponding to the first waveform data.
In the foregoing solution, the determining a first parameter corresponding to first waveform data according to a transformation condition of a waveform class in the first waveform data includes:
segmenting the first waveform data to obtain a first wave band and a second wave band;
determining a first parameter corresponding to first waveform data in a first set relation table according to a waveform type combination consisting of the waveform type of the first waveband and the waveform type of the second waveband; the first setting relation table is used for storing the corresponding relation between each waveform type combination in at least one waveform type combination and the first parameter.
In the foregoing solution, the segmenting the first waveform data to obtain the first band and the second band includes:
dividing the first waveform data based on a first duration corresponding to the first waveform data to obtain the first wave band and the second wave band; the first duration characterizes a duration of the first waveform data.
In the foregoing solution, the determining the playing parameter based on the first parameter corresponding to the first waveform data includes:
determining an adjustment category corresponding to a first parameter corresponding to the first waveform data in a second set relation table; the second set relation table is used for storing the corresponding relation between each adjustment category in at least one adjustment category and the first parameter;
and determining the playing parameters according to the adjustment category corresponding to the first parameters corresponding to the first waveform data and the playing parameters corresponding to the first waveform data when the first waveform data is collected.
In the above solution, the playing parameter is a video bitrate, and the adjustment category includes one of:
maintaining the current video code rate;
setting a video code rate as a first code rate; the first code rate represents the highest video code rate or the lowest video code rate corresponding to the first waveform data in the acquisition process;
adjusting the video code rate according to the second parameter; the second parameter represents an adjustment coefficient of the video code rate.
In the foregoing solution, when acquiring the first waveform data, the method further includes:
and filtering the first waveform data based on a set waveform type.
In the foregoing solution, the acquiring first waveform data includes:
acquiring first waveform data whose duration is greater than or equal to a set duration.
In the above scheme, the method further comprises:
generating a first request according to the playing parameters; the first request is used for requesting a server to acquire video data corresponding to the playing parameters;
and receiving and loading the video data which is returned by the server based on the first request and accords with the playing parameters.
An embodiment of the present application further provides a device for determining a playing parameter, including:
an acquisition unit configured to acquire first waveform data; the first waveform data represents electroencephalogram data collected in a video playing process;
the first determining unit is used for determining a first parameter corresponding to the first waveform data according to the transformation condition of the waveform type in the first waveform data; the first parameter represents a video playing requirement corresponding to the electroencephalogram data;
a second determining unit, configured to determine a playing parameter based on the first parameter.
An embodiment of the present application further provides an electronic device, including: a processor and a memory for storing a computer program capable of running on the processor,
wherein the processor is configured to perform the steps of any of the above methods when running the computer program.
Embodiments of the present application also provide a storage medium having a computer program stored thereon, where the computer program is executed by a processor to implement the steps of any one of the above methods.
In the embodiments of the application, first waveform data are obtained, where the first waveform data represent electroencephalogram data collected during video playing. A first parameter corresponding to the first waveform data is determined according to the change in waveform category within the first waveform data, where the first parameter represents the video playing requirement corresponding to the electroencephalogram data, and the playing parameter is determined based on the first parameter. In this way, the parameters of video playing can be determined based on the requirements of the user, so that the played video meets the video playing requirement; at the same time, the resource cost of video playing is reduced and the viewing experience of the user is improved.
Drawings
Fig. 1 is a schematic view illustrating an implementation flow of a playing parameter determining method according to an embodiment of the present application;
fig. 2 is a schematic diagram of electroencephalogram data according to an embodiment of the present application;
fig. 3 is a schematic flow chart illustrating an implementation of a playing parameter determining method according to an embodiment of the present application;
fig. 4 is a schematic diagram of emotional states corresponding to different waveform types according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a first set relationship table according to an embodiment of the present application;
fig. 6 is a schematic flow chart illustrating an implementation of a playing parameter determining method according to an embodiment of the present application;
fig. 7 is a schematic diagram of the correspondence between video playing requirements and adjustment categories according to an embodiment of the present application;
fig. 8 is a schematic flow chart illustrating an implementation of a playing parameter determining method according to an embodiment of the present application;
fig. 9 is a schematic flowchart of a playing parameter determination process provided in an application embodiment of the present application;
fig. 10 is a schematic flow chart illustrating a playing parameter determination according to another application embodiment of the present application;
fig. 11 is a schematic structural diagram of a playback parameter determining apparatus according to an embodiment of the present application;
fig. 12 is a schematic diagram of a hardware component structure of an electronic device according to an embodiment of the present disclosure.
Detailed Description
The present application will be described in further detail with reference to the following drawings and specific embodiments.
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
The technical means described in the embodiments of the present application may be arbitrarily combined without conflict.
In addition, in the embodiments of the present application, "first", "second", and the like are used for distinguishing similar objects, and are not necessarily used for describing a particular order or sequence.
An embodiment of the present application provides a method for determining a playing parameter, and fig. 1 is a schematic flowchart of the method for determining a playing parameter according to the embodiment of the present application. As shown in fig. 1, the method includes:
s101: acquiring first waveform data; the first waveform data represents electroencephalogram data collected in a video playing process.
Here, first waveform data are acquired, where the first waveform data are brain wave data collected during video playing and can reflect the emotion changes of a user while watching a video. In practical applications, electroencephalogram data of the user can be collected while the user watches a video. The first waveform data can be acquired through a customized wearable device; for example, the user wears a customized earphone while watching a video and the electroencephalogram data are acquired through the earphone, or the electroencephalogram data are acquired through worn glasses. The wearable device sends the collected electroencephalogram data to an electronic device, and the electronic device analyzes the first waveform data.
In one embodiment, when the first waveform data is acquired, the method further includes:
and filtering the first waveform data based on a set waveform type.
In practical applications, the collected brain wave data may contain invalid data, such as noise or brain waves with special waveforms, and filtering out the invalid data improves the accuracy of the first parameter. Among electroencephalogram data, the four waveform types of delta wave, beta wave, alpha wave and theta wave constitute effective waveform data, so when the first waveform data is filtered, waveform components that do not belong to these four types can be discarded.
In the above embodiment, the first waveform data is filtered based on the set waveform type, so that the first waveform data contains effective waveform data, and the video playing requirement can be accurately determined according to the electroencephalogram data of the user.
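For illustration only, the Python sketch below shows one possible way to realize this filtering step: it keeps only the spectral content of the four wave types named above and discards everything else. The numeric band boundaries and the 256 Hz sampling rate are conventional EEG values assumed here, not values given in this application.

```python
import numpy as np

# Conventional EEG frequency bands (Hz); the text names the wave types but
# gives no numeric boundaries, so these values are assumptions.
VALID_BANDS = {"delta": (0.5, 4.0), "theta": (4.0, 8.0),
               "alpha": (8.0, 13.0), "beta": (13.0, 30.0)}

def filter_waveform(samples: np.ndarray, fs: float = 256.0) -> np.ndarray:
    """Zero out spectral content that belongs to none of the set wave types."""
    spectrum = np.fft.rfft(samples)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    keep = np.zeros_like(freqs, dtype=bool)
    for low, high in VALID_BANDS.values():
        keep |= (freqs >= low) & (freqs < high)
    spectrum[~keep] = 0.0  # discard noise and "special" waveform components
    return np.fft.irfft(spectrum, n=len(samples))
```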
In one embodiment, the acquiring the first waveform data includes:
acquiring first waveform data whose duration is greater than or equal to a set duration.
Here, waveform data whose duration is greater than or equal to a set duration is selected from the collected brain wave data and determined as the first waveform data. The duration is the length of the brain wave data; for example, a duration of 1 s indicates that the brain wave data describes 1 s of waveform data. When the duration of the first waveform data is too short, for example when it is only a transient waveform, the emotion of the user while watching the video cannot be accurately recorded. In practical applications, the set duration may be 3 s, and first waveform data with a duration of 3 s is acquired. As shown in fig. 2, which is a schematic diagram of brain wave data, the brain wave data is divided into 6 unit sub-bands, each corresponding to waveform data with a duration of 1 s; if first waveform data with a duration of 3 s is acquired, 3 unit sub-bands of the brain wave data constitute the first waveform data. In practical applications, since the brain wave data of the user is collected while the video is played, the start time of the video is obtained and the first waveform data can be acquired once every set duration, counted from the start time of the video.
In the embodiment, the first waveform data with the duration time longer than or equal to the set duration time is acquired, so that the emotion of the user in the process of watching the video can be accurately fed back, and the corresponding video playing requirement can be more accurately determined according to the emotion of the user.
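As an illustration of how fixed-length first waveform data could be extracted from a continuous EEG stream, a minimal Python sketch follows; the 256 Hz sampling rate is an assumption, and the 3 s window simply mirrors the example above.

```python
import numpy as np

def eeg_windows(samples: np.ndarray, fs: float = 256.0,
                set_duration_s: float = 3.0):
    """Yield consecutive windows whose duration equals the set duration
    (3 s here), counted from the video start time; leftover data shorter
    than the set duration is not used as first waveform data."""
    window = int(set_duration_s * fs)
    for start in range(0, len(samples) - window + 1, window):
        yield samples[start:start + window]
```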
S102: determining a first parameter corresponding to the first waveform data according to the transformation condition of the waveform category in the first waveform data; the first parameter represents a video playing requirement corresponding to the brain wave data.
After the first waveform data is acquired, it is analyzed, and the first parameter corresponding to the first waveform data is determined according to the change in waveform category within the first waveform data, where the first parameter represents the video playing requirement corresponding to the electroencephalogram data. In practical applications, the emotion of a user may change while watching a video. This emotion change is recorded in the waveform transformation of the first waveform data; for example, it can be reflected by the waveform amplitude and the waveform type in the first waveform data. Since the emotion change reflects the user's satisfaction with the video, the first parameter can be determined according to the subjective feeling of the user. Illustratively, when the change of the waveform category in the first waveform data indicates that the user's perception of the video fluctuates between general and poor, the current video playing requirement is to improve the playing quality of the video.
In an embodiment, as shown in fig. 3, the determining the first parameter corresponding to the first waveform data according to the transformation condition of the waveform class in the first waveform data includes:
s301: and dividing the first waveform data to obtain a first wave band and a second wave band.
Here, the first waveform data is divided into a first band and a second band; splicing the first band and the second band reproduces the first waveform data. In practical applications, the first waveform data includes at least one of the four waveform types δ wave, β wave, α wave and θ wave, and after the division the first band and the second band likewise each include at least one of these four waveform types. In practical applications, the first waveform data may be divided according to its duration.
In an embodiment, the dividing the first waveform data to obtain the first band and the second band includes:
dividing the first waveform data based on a first duration corresponding to the first waveform data to obtain the first wave band and the second wave band; the first duration characterizes a duration of the first waveform data.
Here, when the first waveform data is divided, the first duration corresponding to the first waveform data is used to obtain the first band and the second band, where the first duration represents the duration of the first waveform data. In practical applications, the first waveform data is divided uniformly according to the first duration, so that the obtained first band and second band have the same duration. For example, when the first duration corresponding to the first waveform data is 3 s, the 1.5 s mark is taken as the division point: the first 1.5 s of the first waveform data is determined as the first band, and the last 1.5 s is determined as the second band.
In the embodiment, the first waveform data is segmented based on the first duration corresponding to the first waveform data to obtain the first wave band and the second wave band, and the first duration represents the duration of the first waveform data, so that the emotion change of the user in the process of watching a video can be accurately determined through segmenting the brain wave data and through two different wave bands, and the requirement of the user for watching the video can be favorably determined.
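A minimal sketch of this midpoint split is given below; it assumes the first waveform data is a one-dimensional sample array and simply halves it, matching the 1.5 s / 1.5 s example above.

```python
import numpy as np

def split_bands(first_waveform: np.ndarray):
    """Divide the first waveform data at its midpoint (e.g. 1.5 s of a 3 s
    window) into a first band and a second band of equal duration."""
    midpoint = len(first_waveform) // 2
    return first_waveform[:midpoint], first_waveform[midpoint:]
```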
S302: determining a first parameter corresponding to first waveform data in a first set relation table according to a waveform type combination consisting of the waveform type of the first waveband and the waveform type of the second waveband; the first setting relation table is used for storing the corresponding relation between each waveform type combination in at least one waveform type combination and the first parameter.
Here, the waveform type of the first band and the waveform type of the second band are determined, and together they constitute a waveform type combination; for example, when the waveform type of the first band is a delta wave and the waveform type of the second band is a beta wave, the waveform type combination is delta wave + beta wave. The first parameter corresponding to the first waveform data is then determined in a first set relationship table according to this waveform type combination, where the first set relationship table stores the correspondence between each of at least one waveform type combination and the first parameter. In practical applications, each waveform type corresponds to a different emotion, as shown in fig. 4, which is a schematic diagram of the emotional states corresponding to different waveform types. When the emotion of a user changes while watching a video, the waveform type in the corresponding first waveform data also changes. To better judge this, the first waveform data is divided and the waveform types of the first band and the second band are compared, so that the change of the user's emotion can be determined; transitions between different waveform types represent different emotions and therefore correspond to different first parameters. As shown in fig. 5, which is a schematic diagram of a first set relationship table, the table stores the correspondence between different waveform type combinations and first parameters, so that the corresponding first parameter can be determined for different emotion changes. In practical applications, video playing requirements can be measured through the user's perception and divided into eight requirements: the first requirement corresponds to a perception that stays good; the second to a perception that stays poor; the third to a perception fluctuating between good and general; the fourth to a perception fluctuating between good and poor; the fifth to a perception fluctuating between general and poor; the sixth to a perception that is not strong; and the seventh to a perception that is lost. When the user's perception fluctuates between general and poor, the video playing requirement is to improve the video playing quality. For example, when the waveform type combination formed by the waveform type of the first band and the waveform type of the second band is theta wave + alpha wave, the user's perception fluctuates between general and poor while watching the video, which indicates that the current video parameters cannot meet the user's requirements on the video; the corresponding first parameter indicates that the playing parameter needs to be increased so that the output video can meet the user's requirements.
In the above embodiment, the first waveform data is divided to obtain the first band and the second band, and the first parameter corresponding to the first waveform data is determined in the first set relationship table according to the waveform type combination composed of the waveform type of the first band and the waveform type of the second band, where the first set relationship table stores the correspondence between each of at least one waveform type combination and the first parameter. The user's requirement on the video can thus be determined from the waveform transformation of the electroencephalogram data, so that the video parameters can be adjusted accurately and network resources can be used reasonably.
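As a hedged illustration, the sketch below labels each band by its dominant wave type and looks the combination up in a first set relationship table. The band boundaries repeat the assumptions of the earlier filtering sketch, and only the theta + alpha row (perception fluctuating between general and poor) comes from the text; the other table entries and requirement names are placeholders standing in for Fig. 5.

```python
import numpy as np

# Same conventional band boundaries (Hz) as assumed in the filtering sketch.
VALID_BANDS = {"delta": (0.5, 4.0), "theta": (4.0, 8.0),
               "alpha": (8.0, 13.0), "beta": (13.0, 30.0)}

def dominant_wave_type(band_samples: np.ndarray, fs: float = 256.0) -> str:
    """Label a band by the wave type that carries the most spectral power."""
    power = np.abs(np.fft.rfft(band_samples)) ** 2
    freqs = np.fft.rfftfreq(len(band_samples), d=1.0 / fs)
    per_type = {name: power[(freqs >= lo) & (freqs < hi)].sum()
                for name, (lo, hi) in VALID_BANDS.items()}
    return max(per_type, key=per_type.get)

# First set relationship table: waveform type combination -> first parameter
# (video playing requirement).  Only the theta+alpha row is described in the
# text; the remaining entries are illustrative placeholders for Fig. 5.
FIRST_SET_RELATION_TABLE = {
    ("theta", "alpha"): "fluctuates_general_poor",
    ("beta", "beta"): "stays_good",
}

def first_parameter(first_band: np.ndarray, second_band: np.ndarray,
                    fs: float = 256.0) -> str:
    combo = (dominant_wave_type(first_band, fs),
             dominant_wave_type(second_band, fs))
    return FIRST_SET_RELATION_TABLE.get(combo, "stays_good")
```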
S103: and determining the playing parameters based on the first parameters corresponding to the first waveform data.
Here, the playing parameter is determined based on the first parameter corresponding to the first waveform data, so that the playing parameter can be adaptively adjusted according to the subjective requirements of the user while watching the video and the played video can meet the user's requirements.
In the above embodiment, the first waveform data is obtained, where the first waveform data represents electroencephalogram data acquired during video playing; the first parameter corresponding to the first waveform data is determined according to the change in waveform category within the first waveform data, where the first parameter represents the video playing requirement corresponding to the electroencephalogram data; and the playing parameter is determined based on the first parameter. In this way, the parameters of video playing can be determined according to the requirements of the user, which helps improve the utilization of network resources, saves the resource cost of video playing, and also improves the user's video-watching experience.
In an embodiment, as shown in fig. 6, the determining the playing parameter based on the first parameter corresponding to the first waveform data includes:
s601: determining an adjustment category corresponding to a first parameter corresponding to the first waveform data in a second set relation table; the second setting relationship table is used for storing the corresponding relationship between each adjustment category in at least one adjustment category and the first parameter.
Here, after the first parameter corresponding to the first waveform data is determined, the adjustment category corresponding to that first parameter is determined in a second set relationship table, where the adjustment category is an adjustment rule by which the playing parameter is determined, and the second set relationship table stores the correspondence between each of at least one adjustment category and the first parameter. In practical applications, the emotion change of a user while watching a video reflects the user's video playing requirement, and different video playing requirements correspond to different playing parameters, so the corresponding adjustment category needs to be determined according to the first parameter. In practical applications, the adjustment categories at least include maintaining the playing parameter, increasing the playing parameter, decreasing the playing parameter, and the like.
S602: and determining the playing parameters according to the adjustment category corresponding to the first parameters corresponding to the first waveform data and the playing parameters corresponding to the first waveform data when the first waveform data is collected.
Here, the playing parameter of the current video is determined according to the adjustment category corresponding to the first parameter corresponding to the first waveform data and the playing parameter corresponding to the time of acquiring the first waveform data, and in practical application, the playing parameter can be obtained after the adjustment corresponding to the adjustment category is performed on the playing parameter corresponding to the time of acquiring the first waveform data. Illustratively, when the adjustment category is to maintain the playback parameters, the determined playback parameters are the same as the corresponding playback parameters when the first waveform data was acquired.
In the above embodiment, the adjustment category corresponding to the first parameter corresponding to the first waveform data is determined in the second setting relationship table, and the second setting relationship table is used to store the corresponding relationship between each adjustment category of the at least one adjustment category and the first parameter, and determine the playing parameter according to the adjustment category corresponding to the first parameter corresponding to the first waveform data and the playing parameter corresponding to the time of acquiring the first waveform data, so that the playing parameter can be adjusted according to the video playing requirement of the user, the played video can meet the requirement of the user, and the use efficiency of the network resource can be improved.
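A minimal sketch of step S601 follows. The mapping shown is only a placeholder patterned on the requirement names used in the earlier sketch; the actual correspondence between video playing requirements and adjustment categories is what Fig. 7 defines.

```python
# Second set relationship table: first parameter (video playing requirement)
# -> adjustment category.  All entries are illustrative placeholders.
SECOND_SET_RELATION_TABLE = {
    "stays_good": "keep_current_bitrate",
    "perception_lost": "keep_current_bitrate",
    "fluctuates_general_poor": "set_to_first_bitrate",
    "fluctuates_good_poor": "set_to_first_bitrate",
    "stays_poor": "scale_by_coefficient",
}

def adjustment_category(first_param: str) -> str:
    """S601: look up the adjustment category for the determined first parameter."""
    return SECOND_SET_RELATION_TABLE.get(first_param, "keep_current_bitrate")
```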
In an embodiment, the playing parameter is a video bitrate, and the adjustment category includes one of:
maintaining the current video code rate;
setting a video code rate as a first code rate; the first code rate represents the highest video code rate or the lowest video code rate corresponding to the first waveform data in the acquisition process;
adjusting the video code rate according to the second parameter; the second parameter represents an adjustment coefficient of the video code rate.
Here, when the playing parameter to be adjusted is the video bitrate, there are three adjustment categories in total. The first adjustment category is to maintain the current video bitrate: when the user's perception stays good or is lost, the current video bitrate already meets the user's video playing requirement or the user has no requirement on the video playing quality, and in this case the current video bitrate is maintained. The second adjustment category is to set the video bitrate to a first bitrate, where the first bitrate is the highest or lowest video bitrate corresponding to the collection process of the first waveform data. When the user's perception fluctuates between good and poor, or between general and poor, the video alternates between meeting and not meeting the user's video playing requirement; in this case the video bitrate is set to the highest video bitrate corresponding to the collection process of the first waveform data, so that the output video meets the user's video playing requirement. The third adjustment category is to adjust the video bitrate according to a second parameter, where the second parameter is an adjustment coefficient of the video bitrate; the video bitrate in effect during the collection of the first waveform data is adjusted by the second parameter to obtain the new video bitrate, for example when the user's perception stays poor. As shown in fig. 7, which illustrates the correspondence between video playing requirements and adjustment categories, the video bitrate corresponding to the collection process of the first waveform data is adjusted according to the different video playing requirements.
In the above embodiment, the playing parameter is the video bitrate, and the adjustment category includes one of: maintaining the current video bitrate; setting the video bitrate to a first bitrate, where the first bitrate represents the highest or lowest video bitrate corresponding to the collection process of the first waveform data; and adjusting the video bitrate according to a second parameter, where the second parameter represents an adjustment coefficient of the video bitrate. In this way, the video bitrate can be adaptively adjusted according to different video playing requirements, and video data meeting the video playing requirement can be output while the minimum amount of network resources is used.
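The sketch below applies the three bitrate adjustment categories described above (step S602). The 1.2 coefficient is a placeholder for the second parameter, and whether the first bitrate is the highest or the lowest value seen during collection depends on the requirement, as noted in the text.

```python
def adjust_bitrate(category: str, bitrate_at_collection: int,
                   highest_bitrate: int, coefficient: float = 1.2) -> int:
    """S602: derive the new video bitrate from the adjustment category and the
    bitrate in effect while the first waveform data was being collected."""
    if category == "keep_current_bitrate":
        return bitrate_at_collection
    if category == "set_to_first_bitrate":
        # First bitrate: here the highest bitrate seen during collection; the
        # lowest could be substituted when the requirement calls for it.
        return highest_bitrate
    if category == "scale_by_coefficient":
        # Second parameter: an adjustment coefficient applied to the bitrate.
        return int(bitrate_at_collection * coefficient)
    return bitrate_at_collection
```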
In an embodiment, as shown in fig. 8, the method further comprises:
s801: generating a first request according to the playing parameters; the first request is used for requesting a server to acquire video data corresponding to the playing parameters.
After the playing parameters are determined, a first request is generated according to the playing parameters, and the first request is used for requesting the server to acquire the video data corresponding to the playing parameters. In practical application, when the playing parameter needs to be changed, a first request is sent to the server, and the playing parameter is carried in the first request, so that the video data conforming to the playing parameter can be acquired from the server. In practical application, when the first request is generated, the description file of the media file is analyzed to generate the first request.
S802: and receiving and loading the video data which is returned by the server based on the first request and accords with the playing parameters.
Here, the video data that the server returns based on the first request and that conforms to the playing parameter is received and loaded, so that the played video data conforms to the video playing requirement.
In the above embodiment, the first request is generated according to the playing parameters, and the first request is used to request the server to acquire the video data corresponding to the playing parameters, and receive and load the video data which is returned by the server based on the first request and conforms to the playing parameters, so that the output video data can be ensured to conform to the video playing requirement, and the resource cost of video playing is saved.
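For illustration, the sketch below builds such a first request carrying the playing parameter and loads the returned video data. The URL layout, query parameter name, and use of an HTTP GET are assumptions for a hypothetical segment server; they are not specified by this application.

```python
import requests  # a hypothetical DASH/HLS-style segment server is assumed

def fetch_video_data(server_url: str, video_id: str, bitrate_kbps: int) -> bytes:
    """Send the first request with the determined playing parameter and
    return the video data the server sends back."""
    response = requests.get(
        f"{server_url}/videos/{video_id}/segments",  # illustrative endpoint
        params={"bitrate": bitrate_kbps},            # illustrative parameter name
        timeout=10,
    )
    response.raise_for_status()
    return response.content  # video data conforming to the playing parameter
```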
An application embodiment is further provided in the present application, as shown in fig. 9, where fig. 9 shows a schematic flow chart of video playing parameter determination.
S901: the brain wave data are collected through the customization equipment, and the collected brain wave data are sent to the mobile terminal.
S902: and filtering the electroencephalogram data based on the set waveform type.
S903: and acquiring first waveform data with the duration time being more than or equal to the set time length from the filtered electroencephalogram data.
S904: determining a first parameter according to the first waveform data.
S905: and determining the playing parameters according to the first parameters.
S906: and sending a first request to the server according to the playing parameters.
S907: and the server returns the video data which accords with the playing parameters to the terminal.
S908: and loading the video data which accord with the playing parameters.
The application also provides another application embodiment, as shown in fig. 10, which is a schematic flow chart of playing parameter determination. While a user watches a video, brain wave data of the user is collected through a customized earphone and transmitted to a mobile terminal. The mobile terminal processes the collected brain wave data to obtain effective first waveform data, analyzes the first waveform data to determine the video playing requirement, and determines the playing parameter that meets this requirement. The terminal then sends a first request to a server to obtain video data conforming to the playing parameter, and after receiving the first request the server returns such video data to the terminal, so that the terminal can play video at the playing parameter that meets the video playing requirement.
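Tying the earlier sketches together, the following outline shows how one collected EEG window could flow through to an adjusted playing parameter; it simply chains the helper functions sketched in the preceding sections and inherits all of their assumptions.

```python
def on_eeg_window(samples, fs, bitrate_now, highest_bitrate):
    """From one collected EEG window to the adjusted playing parameter,
    reusing the helpers sketched above (roughly S902-S905)."""
    cleaned = filter_waveform(samples, fs)                       # filter by wave type
    first_band, second_band = split_bands(cleaned)               # S301
    requirement = first_parameter(first_band, second_band, fs)   # S302 / S904
    category = adjustment_category(requirement)                  # S601
    return adjust_bitrate(category, bitrate_now, highest_bitrate)  # S602 / S905
```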
In order to implement the method according to the embodiment of the present application, an embodiment of the present application further provides a device for determining a playing parameter, where as shown in fig. 11, the device includes:
an acquisition unit 1101 configured to acquire first waveform data; the first waveform data represents electroencephalogram data collected in a video playing process;
a first determining unit 1102, configured to determine a first parameter corresponding to first waveform data according to a transformation condition of a waveform class in the first waveform data; the first parameter represents a video playing requirement corresponding to electroencephalogram data;
a second determining unit 1103, configured to determine a playing parameter based on the first parameter.
In an embodiment, the determining, by the first determining unit 1102, a first parameter corresponding to first waveform data according to a transform condition of a waveform class in the first waveform data includes:
segmenting the first waveform data to obtain a first wave band and a second wave band;
determining a first parameter corresponding to first waveform data in a first set relation table according to a waveform type combination consisting of the waveform type of the first waveband and the waveform type of the second waveband; the first setting relation table is used for storing the corresponding relation between each waveform type combination in at least one waveform type combination and the first parameter.
In an embodiment, the dividing the first waveform data by the first determining unit 1102 to obtain the first band and the second band includes:
dividing the first waveform data based on a first duration corresponding to the first waveform data to obtain the first wave band and the second wave band; the first duration characterizes a duration of the first waveform data.
In an embodiment, the second determining unit 1103 determines the playing parameter based on the first parameter corresponding to the first waveform data, including:
determining an adjustment category corresponding to a first parameter corresponding to the first waveform data in a second set relationship table; the second set relation table is used for storing the corresponding relation between each adjustment category in at least one adjustment category and the first parameter;
and determining the playing parameters according to the adjustment category corresponding to the first parameters corresponding to the first waveform data and the playing parameters corresponding to the first waveform data when the first waveform data is collected.
In an embodiment, the playing parameter is a video bitrate, and the adjustment category determined by the second determining unit 1103 includes one of:
maintaining the current video code rate;
setting a video code rate as a first code rate; the first code rate represents the highest video code rate or the lowest video code rate corresponding to the first waveform data in the acquisition process;
adjusting the video code rate according to the second parameter; the second parameter represents an adjustment coefficient of the video code rate.
In an embodiment, when the acquiring unit 1101 acquires the first waveform data, the method further includes:
and filtering the first waveform data based on a set waveform type.
In an embodiment, the acquiring unit 1101 acquires first waveform data, including:
acquiring first waveform data whose duration is greater than or equal to a set duration.
In one embodiment, the apparatus further comprises:
the generating unit is used for generating a first request according to the playing parameters; the first request is used for requesting a server to acquire video data corresponding to the playing parameters;
and the loading unit is used for receiving and loading the video data which is returned by the server based on the first request and accords with the playing parameters.
In actual application, the obtaining unit 1101, the first determining unit 1102, and the second determining unit 1103 may be implemented by a processor in the playback parameter determining apparatus. Of course, the processor needs to run the program stored in the memory to realize the functions of the above-described program modules.
It should be noted that, when the playing parameter determining apparatus provided in the embodiment of fig. 11 determines the playing parameter, the division of each program module is merely used as an example, and in practical applications, the processing may be distributed and completed by different program modules as needed, that is, the internal structure of the apparatus may be divided into different program modules to complete all or part of the processing described above. In addition, the playing parameter determining apparatus and the playing parameter determining method provided in the above embodiments belong to the same concept, and specific implementation processes thereof are described in the method embodiments and are not described herein again.
Based on the hardware implementation of the program module, and in order to implement the method according to the embodiment of the present application, an embodiment of the present application further provides an electronic device, and fig. 12 is a schematic diagram of a hardware composition structure of the electronic device according to the embodiment of the present application, and as shown in fig. 12, the electronic device includes:
a communication interface 1 capable of information interaction with other devices such as network devices and the like;
and the processor 2 is connected with the communication interface 1 to realize information interaction with other devices, and is configured to execute the playing parameter determination method provided by one or more of the above technical solutions when running a computer program, the computer program being stored in the memory 3.
In practice, of course, the various components in the electronic device are coupled together by the bus system 4. It will be appreciated that the bus system 4 is used to enable connection communication between these components. The bus system 4 comprises, in addition to a data bus, a power bus, a control bus and a status signal bus. But for clarity of illustration the various buses are labeled as bus system 4 in figure 12.
The memory 3 in the embodiment of the present application is used to store various types of data to support the operation of the electronic device. Examples of such data include: any computer program for operating on an electronic device.
It will be appreciated that the memory 3 can be either volatile memory or nonvolatile memory, and can include both volatile and nonvolatile memory. The nonvolatile memory may be a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Ferromagnetic Random Access Memory (FRAM), a flash memory, a magnetic surface memory, an optical disc, or a Compact Disc Read-Only Memory (CD-ROM); the magnetic surface memory may be disk memory or tape memory. The volatile memory can be a Random Access Memory (RAM), which acts as an external cache. By way of illustration and not limitation, many forms of RAM are available, such as Static Random Access Memory (SRAM), Synchronous Static Random Access Memory (SSRAM), Dynamic Random Access Memory (DRAM), Synchronous Dynamic Random Access Memory (SDRAM), Double Data Rate Synchronous Dynamic Random Access Memory (DDRSDRAM), Enhanced Synchronous Dynamic Random Access Memory (ESDRAM), SyncLink Dynamic Random Access Memory (SLDRAM), and Direct Rambus Random Access Memory (DRRAM). The memory 3 described in the embodiments of the present application is intended to comprise, without being limited to, these and any other suitable types of memory.
The method disclosed in the above embodiment of the present application may be applied to the processor 2, or implemented by the processor 2. The processor 2 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by instructions in the form of hardware integrated logic circuits or software in the processor 2. The processor 2 described above may be a general purpose processor, DSP, or other programmable logic device, discrete gate or transistor logic device, discrete hardware component, or the like. The processor 2 may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. The general purpose processor may be a microprocessor or any conventional processor or the like. The steps of the method disclosed in the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software modules may be located in a storage medium located in the memory 3, and the processor 2 reads the program in the memory 3 and in combination with its hardware performs the steps of the aforementioned method.
When the processor 2 executes the program, the corresponding processes in the methods according to the embodiments of the present application are realized, and for brevity, are not described herein again.
In an exemplary embodiment, the present application further provides a storage medium, i.e., a computer storage medium, specifically a computer readable storage medium, for example, including a memory 3 storing a computer program, where the computer program is executable by a processor 2 to perform the steps of the foregoing method. The computer readable storage medium may be Memory such as FRAM, ROM, PROM, EPROM, EEPROM, flash Memory, magnetic surface Memory, optical disk, or CD-ROM.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus, terminal and method may be implemented in other manners. The above-described device embodiments are only illustrative, for example, the division of the unit is only one logical function division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on multiple network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may be separately regarded as one unit, or two or more units may be integrated into one unit; the integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that: all or part of the steps for implementing the method embodiments may be implemented by hardware related to program instructions, and the program may be stored in a computer readable storage medium, and when executed, the program performs the steps including the method embodiments; and the aforementioned storage medium includes: a removable storage device, a ROM, a RAM, a magnetic or optical disk, or various other media that can store program code.
Alternatively, the integrated units described above in the present application may be stored in a computer-readable storage medium if they are implemented in the form of software functional modules and sold or used as independent products. Based on such understanding, the technical solutions of the embodiments of the present application or portions thereof that contribute to the prior art may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for enabling an electronic device (which may be a personal computer, a server, or a network device) to execute all or part of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a removable storage device, a ROM, a RAM, a magnetic or optical disk, or various other media capable of storing program code.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (11)

1. A method for determining playback parameters, comprising:
acquiring first waveform data; the first waveform data represents electroencephalogram data collected in a video playing process; the first waveform data is used for recording emotion changes generated by a user based on video playing quality;
determining a first parameter corresponding to the first waveform data according to the transformation condition of the waveform type in the first waveform data; the first parameter represents the quality requirement of video playing corresponding to the electroencephalogram data;
determining a playing parameter based on a first parameter corresponding to the first waveform data;
wherein, the determining a first parameter corresponding to the first waveform data according to the transformation condition of the waveform type in the first waveform data includes:
determining a first parameter corresponding to the first waveform data according to a waveform type combination composed of a waveform type of a first waveband and a waveform type of a second waveband in the first waveform data and a corresponding relation between the waveform type combination and the first parameter;
the determining the playing parameters based on the first parameters corresponding to the first waveform data includes:
determining an adjustment category corresponding to a first parameter according to the corresponding relation between the adjustment category and the first parameter;
determining the playing parameters according to the adjustment category and corresponding playing parameters when the first waveform data is collected; the adjustment category comprises at least one adjustment rule defining playback parameters.
2. The method for determining playing parameters according to claim 1, wherein the determining the first parameters corresponding to the first waveform data according to the waveform type combination composed of the waveform type of the first wave band and the waveform type of the second wave band in the first waveform data and the corresponding relationship between the waveform type combination and the first parameters comprises:
segmenting the first waveform data to obtain a first wave band and a second wave band;
determining a first parameter corresponding to first waveform data in a first set relation table according to a waveform type combination consisting of the waveform type of the first waveband and the waveform type of the second waveband; the first setting relation table is used for storing the corresponding relation between each waveform type combination in at least one waveform type combination and the first parameter.
3. The playback parameter determination method of claim 2, wherein the dividing the first waveform data into the first band and the second band comprises:
dividing the first waveform data based on a first duration corresponding to the first waveform data to obtain the first wave band and the second wave band; the first duration characterizes a duration of the first waveform data.
4. The playing parameter determination method according to claim 1, wherein the determining the adjustment category corresponding to the first parameter according to the correspondence between adjustment categories and first parameters comprises:
determining, in a second set relationship table, the adjustment category corresponding to the first parameter corresponding to the first waveform data; wherein the second set relationship table stores a correspondence between each of at least one adjustment category and a corresponding first parameter.
5. The playing parameter determination method according to claim 4, wherein the playing parameter is a video code rate, and the adjustment category comprises one of:
keeping a current video code rate;
setting the video code rate to a first code rate; wherein the first code rate represents the highest video code rate or the lowest video code rate used while the first waveform data was being collected; and
adjusting the video code rate according to a second parameter; wherein the second parameter represents an adjustment coefficient of the video code rate.
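Illustrative example (not part of the claims): a sketch of the three claim 5 adjustment categories. The category names and the choice of selecting the first code rate from the code rates observed while the first waveform data was being collected are assumptions.

    def adjust_code_rate(category, current_rate, rates_during_acquisition, second_parameter=None):
        """Apply one of the claim 5 adjustment categories to the video code rate."""
        if category == "keep_current":
            return current_rate
        if category == "set_highest":      # first code rate = highest rate seen during acquisition
            return max(rates_during_acquisition)
        if category == "set_lowest":       # first code rate = lowest rate seen during acquisition
            return min(rates_during_acquisition)
        if category == "scale":            # second parameter = adjustment coefficient
            return int(current_rate * second_parameter)
        raise ValueError(f"unknown adjustment category: {category}")

    print(adjust_code_rate("scale", 2_000_000, [1_000_000, 2_000_000, 4_000_000], second_parameter=1.5))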
6. The playing parameter determination method according to claim 1, wherein, when acquiring the first waveform data, the method further comprises:
filtering the first waveform data based on a set waveform type.
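Illustrative example (not part of the claims): claim 6 only states that the acquired data is filtered based on a set waveform type; one simple reading, assuming each sample carries a type label, is sketched below.

    def filter_waveform(samples_with_type, excluded_types=frozenset({"artifact"})):
        """Drop samples whose labelled waveform type matches the set (excluded) waveform type.
        samples_with_type is assumed to be a list of (value, type_label) pairs."""
        return [(value, label) for value, label in samples_with_type if label not in excluded_types]

    raw = [(0.2, "alpha"), (5.0, "artifact"), (0.3, "beta")]
    print(filter_waveform(raw))            # the "artifact" sample is removed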
7. The playing parameter determination method according to claim 1, wherein the acquiring the first waveform data comprises:
acquiring first waveform data whose duration is greater than or equal to a set duration.
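Illustrative example (not part of the claims): a sketch of the claim 7 duration gate, assuming a 5-second set duration and a 256 Hz sampling rate; both values are illustrative.

    SET_DURATION_S = 5.0                   # assumed threshold; the claim only requires "a set duration"

    def accept_first_waveform_data(samples, sample_rate_hz=256):
        """Return the waveform data only if it lasts at least the set duration, otherwise None."""
        duration_s = len(samples) / sample_rate_hz
        return samples if duration_s >= SET_DURATION_S else None

    print(accept_first_waveform_data([0.0] * 2560) is not None)   # 10 s of data -> accepted
    print(accept_first_waveform_data([0.0] * 256) is not None)    # 1 s of data  -> rejected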
8. The playing parameter determination method according to claim 1, further comprising:
generating a first request according to the playing parameter; wherein the first request is used for requesting, from a server, video data corresponding to the playing parameter; and
receiving and loading the video data that conforms to the playing parameter and is returned by the server based on the first request.
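Illustrative example (not part of the claims): a sketch of claim 8 using the Python standard library HTTP client; the endpoint path and the bitrate query parameter are assumptions, as the claims do not specify the request format.

    from urllib import request as urlrequest

    def build_first_request(base_url, playing_parameter_bps):
        """Generate a first request asking the server for video data at the given code rate."""
        return f"{base_url}/video?bitrate={playing_parameter_bps}"

    def fetch_and_load(base_url, playing_parameter_bps, load_segment):
        """Send the first request and hand the returned video data to the player's loading routine."""
        url = build_first_request(base_url, playing_parameter_bps)
        with urlrequest.urlopen(url) as response:   # server returns data matching the playing parameter
            segment = response.read()
        load_segment(segment)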
9. A playing parameter determination apparatus, comprising:
an acquisition unit configured to acquire first waveform data; wherein the first waveform data represents electroencephalogram data collected during video playing, and the first waveform data is used for recording emotion changes of a user in response to video playing quality;
a first determining unit configured to determine a first parameter corresponding to the first waveform data according to a change of waveform type in the first waveform data; wherein the first parameter represents a video playing quality requirement corresponding to the electroencephalogram data;
a second determining unit configured to determine a playing parameter based on the first parameter;
wherein the first determining unit is further configured to determine the first parameter corresponding to the first waveform data according to a waveform type combination consisting of a waveform type of a first wave band and a waveform type of a second wave band in the first waveform data, and a correspondence between waveform type combinations and first parameters; and
the second determining unit is further configured to determine an adjustment category corresponding to the first parameter according to a correspondence between adjustment categories and first parameters, and to determine the playing parameter according to the adjustment category and a playing parameter in use when the first waveform data was collected; wherein the adjustment category comprises at least one adjustment rule defining the playing parameter.
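Illustrative example (not part of the claims): a structural sketch of the claim 9 apparatus as three cooperating units; the class name, method names, and injected lookup tables are hypothetical.

    class PlayingParameterDeterminationApparatus:
        """Acquisition unit, first determining unit, and second determining unit of claim 9."""

        def __init__(self, first_table, second_table):
            self.first_table = first_table     # waveform type combination -> first parameter
            self.second_table = second_table   # first parameter -> adjustment rule (callable)

        def acquire(self, eeg_source):
            """Acquisition unit: obtain first waveform data collected during video playing."""
            return eeg_source.read()           # eeg_source is a hypothetical sensor interface

        def determine_first_parameter(self, band1_type, band2_type):
            """First determining unit: map the waveform type combination to a first parameter."""
            return self.first_table[(band1_type, band2_type)]

        def determine_playing_parameter(self, first_parameter, current_rate):
            """Second determining unit: look up the adjustment rule and apply it to the playing
            parameter in use when the first waveform data was collected."""
            adjustment_rule = self.second_table[first_parameter]
            return adjustment_rule(current_rate)

    apparatus = PlayingParameterDeterminationApparatus(
        first_table={("alpha", "beta"): "getting_worse"},
        second_table={"getting_worse": lambda rate: int(rate * 1.25)},
    )
    print(apparatus.determine_playing_parameter(
        apparatus.determine_first_parameter("alpha", "beta"), 2_000_000))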
10. An electronic device, comprising: a processor and a memory configured to store a computer program executable on the processor,
wherein the processor is configured to perform the steps of the method according to any one of claims 1 to 8 when running the computer program.
11. A storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method according to any one of claims 1 to 8.
CN202011119233.5A 2020-10-19 2020-10-19 Playing parameter determination method and device, electronic equipment and storage medium Active CN112272317B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011119233.5A CN112272317B (en) 2020-10-19 2020-10-19 Playing parameter determination method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011119233.5A CN112272317B (en) 2020-10-19 2020-10-19 Playing parameter determination method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112272317A CN112272317A (en) 2021-01-26
CN112272317B true CN112272317B (en) 2023-02-17

Family

ID=74338218

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011119233.5A Active CN112272317B (en) 2020-10-19 2020-10-19 Playing parameter determination method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112272317B (en)

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105163180A (en) * 2015-08-21 2015-12-16 小米科技有限责任公司 Play control method, play control device and terminal
CN107205193B (en) * 2017-07-28 2019-06-04 京东方科技集团股份有限公司 A kind of control method for playing back, earphone and smart machine
CN107837089B (en) * 2017-12-05 2018-11-23 清华大学 A kind of video cardton limit value measurement method based on brain wave
CN108495191A (en) * 2018-02-11 2018-09-04 广东欧珀移动通信有限公司 Video playing control method and related product
CN108429853A (en) * 2018-02-12 2018-08-21 广东欧珀移动通信有限公司 Electronic device, method for switching network and Related product
TWI746924B (en) * 2019-01-14 2021-11-21 瑞軒科技股份有限公司 Video recommending system and video recommending method
CN110353705B (en) * 2019-08-01 2022-10-25 秒针信息技术有限公司 Method and device for recognizing emotion

Also Published As

Publication number Publication date
CN112272317A (en) 2021-01-26

Similar Documents

Publication Publication Date Title
US10687155B1 (en) Systems and methods for providing personalized audio replay on a plurality of consumer devices
US10297265B2 (en) Personal audio assistant device and method
CN111381954B (en) Audio data recording method, system and terminal equipment
CN110312146B (en) Audio processing method and device, electronic equipment and storage medium
CN105120321A (en) Video searching method, video storage method and related devices
EP3678388A1 (en) Customized audio processing based on user-specific and hardware-specific audio information
US20170083262A1 (en) System and method for controlling memory frequency using feed-forward compression statistics
CN111274415B (en) Method, device and computer storage medium for determining replacement video material
CN111182315A (en) Multimedia file splicing method, device, equipment and medium
US20220358941A1 (en) Audio encoding and decoding method and audio encoding and decoding device
CN106331089A (en) Video play control method and system
US10755707B2 (en) Selectively blacklisting audio to improve digital assistant behavior
CN110570348A (en) Face image replacement method and device
CN112272317B (en) Playing parameter determination method and device, electronic equipment and storage medium
CN107613356A (en) Media and vibrations synchronous broadcast method and device, electronic equipment and storage medium
CN108647102B (en) Service request processing method and device of heterogeneous system and electronic equipment
CN112463391B (en) Memory control method, memory control device, storage medium and electronic equipment
JP6985355B2 (en) Interference avoidance processing methods, devices, storage media, and programs
CN105847990A (en) Media file playing method and apparatus
CN110581918B (en) Voice equipment, control method thereof, server and computer storage medium
CN114339325B (en) Multi-engine dynamic wallpaper playing method and device based on android system
CN112653896B (en) House source information playback method and device with viewing assistant, electronic equipment and medium
CN111447267B (en) Information synchronization method, device, computer readable storage medium and equipment
CN115473927A (en) Volume synchronization method and device, electronic equipment and storage medium
CN107357547B (en) Audio control method, audio control device and audio equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant