CN110839180A - Video playing progress control method and device and electronic equipment

Info

Publication number
CN110839180A
CN110839180A (application CN201910940155.6A)
Authority
CN
China
Prior art keywords
video
control information
video playing
time point
playing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910940155.6A
Other languages
Chinese (zh)
Inventor
王东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN201910940155.6A
Publication of CN110839180A

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47217End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00Speech recognition
    • G10L15/22Procedures used during a speech recognition process, e.g. man-machine dialogue
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L25/00Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
    • G10L25/48Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use
    • G10L25/51Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination
    • G10L25/57Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination for processing of video signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42201Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] biosensors, e.g. heat sensor for presence detection, EEG sensors or any limb activity sensors worn by the user
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/435Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream

Abstract

The embodiment of the invention discloses a video playing progress control method and device and an electronic device, relating to the technical field of video control. The method comprises the following steps: receiving video playing control information; querying a video control file corresponding to the video according to the video playing control information, and acquiring, from the video control file, a first time point on the video playing time axis corresponding to the video playing control information, the video control file storing the mapping relationship between video playing control information and time points on the video playing time axis; and sending the first time point to a video playing module, so that the video playing module jumps the currently playing video from the current playing time point to the first time point and continues playback from there. The invention is suitable for controlling, editing and applying videos in fields such as commerce, entertainment, teaching and scientific research.

Description

Video playing progress control method and device and electronic equipment
Technical Field
The invention relates to the technical field of video control, in particular to a video playing progress control method and device and electronic equipment.
Background
Video editing and video use are increasingly common in commerce, entertainment, teaching and scientific research. At present, when a video is played, if a user wants to jump to the time point corresponding to certain playing content, the user has to manually drag the player's progress bar and search point by point to locate the desired content. This way of controlling the video playing progress is cumbersome and cannot quickly locate the playing time point corresponding to the video content to be watched.
Disclosure of Invention
In view of this, embodiments of the present invention provide a method and an apparatus for controlling a video playing progress, and an electronic device, which can easily and quickly locate a playing time point corresponding to a desired video content.
In a first aspect, an embodiment of the present invention provides a method for controlling a video playing progress, where the method includes:
receiving video playing control information;
inquiring a video control file corresponding to the video according to the video playing control information, and acquiring a first time point on a video playing time axis corresponding to the video playing control information from the video control file; the video control file stores the mapping relation between the video playing control information and the time point on the video playing time axis;
and sending the first time point to a video playing module, so that the video playing module jumps the currently playing video from the current playing time point to the first time point and continues playing from there.
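As an illustration only, the flow of the first aspect can be sketched in a few lines of Python. The class and method names (VideoControlFile, DummyPlayer.seek) and the JSON layout are assumptions made for the sketch, not structures defined in this application.

```python
import json


class VideoControlFile:
    """Illustrative store of the mapping between control information and time points."""

    def __init__(self, path):
        with open(path, encoding="utf-8") as f:
            # e.g. {"skip the opening": 15.0, "jump to the climax": 312.5}
            self.mapping = json.load(f)

    def first_time_point(self, control_info):
        return self.mapping.get(control_info)


class DummyPlayer:
    """Stand-in for the video playing module."""

    def __init__(self):
        self.current_time = 0.0

    def seek(self, time_point):
        self.current_time = time_point  # jump from the current point to the target


def handle_control_info(control_info, control_file, player):
    """Receive control info, query the control file, and jump the player."""
    time_point = control_file.first_time_point(control_info)
    if time_point is not None:
        player.seek(time_point)
```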
With reference to the first aspect, in a first implementation manner of the first aspect, the video playing control information is voice control information;
the receiving video playing control information includes:
receiving voice control information input by a user;
recognizing the voice control information to obtain a command keyword in the voice control information;
the querying, according to the video playing control information, a video control file corresponding to the video, and obtaining a first time point on a video playing time axis corresponding to the video playing control information from the video control file, includes:
and inquiring a video control file corresponding to the video according to the command keyword, and acquiring a first time point on a video playing time axis corresponding to the voice control information from the video control file.
With reference to the first aspect and the first implementation manner of the first aspect, in a second implementation manner of the first aspect, the video playing control information is motion control information;
the receiving video playing control information includes:
receiving an action sensing signal acquired by a sensor; the sensor is arranged on the limb of a video viewer, and the motion sensing signal is triggered by the limb motion of the video viewer, or the sensor is arranged on the first electronic equipment, and the motion sensing signal is triggered by the motion of the video viewer touching the sensor of the electronic equipment;
the querying, according to the video playing control information, a video control file corresponding to the video, and obtaining a first time point on a video playing time axis corresponding to the video playing control information from the video control file, includes:
and inquiring a video control file corresponding to the video according to the action sensing signal, and acquiring a first time point on a video playing time axis corresponding to the action sensing signal from the video control file.
With reference to the first aspect and any one of the first to the second implementation manners of the first aspect, in a third implementation manner of the first aspect, the video playing control information is human body physiological index control information;
the receiving video playing control information includes:
receiving a physiological index sensing signal acquired by a sensor; the sensor is arranged on the body of a video viewer, and the physiological sensing signal is triggered by the change of the physiological index of the video viewer;
analyzing the physiological index sensing signal to obtain a physiological index value;
determining the number of the threshold range interval where the physiological index value is located;
the querying, according to the video playing control information, a video control file corresponding to the video, and obtaining a first time point on a video playing time axis corresponding to the video playing control information from the video control file, includes:
and inquiring a video control file corresponding to the video according to the threshold range interval number of the physiological index sensing signal, and acquiring a first time point on a video playing time axis corresponding to the threshold range interval number of the physiological index sensing signal from the video control file.
With reference to the first aspect and any one of the first to third embodiments of the first aspect, in a fourth embodiment of the first aspect, the video playback control information is brain wave control information;
the receiving video playing control information includes:
receiving brain waves of a user;
analyzing the brain waves to obtain the attention concentration reading;
determining a threshold range interval number for the concentration reading;
the querying, according to the video playing control information, a video control file corresponding to the video, and obtaining a first time point on a video playing time axis corresponding to the video playing control information from the video control file, includes:
and inquiring a video control file corresponding to the video according to the number of the threshold range interval where the attention concentration reading is located, and acquiring a first time point on a video playing time axis corresponding to the number of the threshold range interval where the attention concentration reading is located from the video control file.
In a second aspect, an embodiment of the present invention provides a video playing progress control device, where the device includes:
the receiving module is used for receiving video playing control information;
the query acquisition module is used for querying a video control file corresponding to the video according to the video playing control information and acquiring a first time point on a video playing time axis corresponding to the video playing control information from the video control file; the video control file stores the mapping relation between the video playing control information and the time point on the video playing time axis;
and the sending module is used for sending the first time point to the video playing module, so that the video playing module jumps the currently playing video from the current playing time point to the first time point and continues playing from there.
With reference to the second aspect, in a first implementation manner of the second aspect, the video playing control information is voice control information;
the receiving module includes:
the first receiving unit is used for receiving voice control information input by a user;
the voice recognition unit is used for recognizing the voice control information to obtain a command keyword in the voice control information;
the query obtaining module is specifically configured to query a video control file corresponding to the video according to the command keyword, and obtain a first time point on a video playing time axis corresponding to the voice control information from the video control file.
The video playing control information is voice control information;
the receiving module includes:
the first receiving unit is used for receiving voice control information input by a user;
the voice recognition unit is used for recognizing the voice control information to obtain a command keyword in the voice control information;
the matching unit is used for matching the command keywords with a pre-stored standard keyword list to obtain corresponding standard keywords; the standard keyword list stores a mapping relation between the command keywords and the standard keywords;
the query obtaining module is specifically configured to query a video control file corresponding to the video according to the standard keyword, and obtain a first time point on a video playing time axis corresponding to the voice control information from the video control file.
With reference to the second aspect and the first implementation manner of the second aspect, in a second implementation manner of the second aspect, the video playing control information is motion control information;
the receiving module includes:
the second receiving unit is used for receiving the action sensing signals collected by the sensor; the sensor is arranged on the limb of a video viewer, and the motion sensing signal is triggered by the limb motion of the video viewer, or the sensor is arranged on the first electronic equipment, and the motion sensing signal is triggered by the motion of the video viewer touching the sensor of the electronic equipment;
the query acquisition module is specifically configured to query a video control file corresponding to the video according to the motion sensing signal, and acquire a first time point on a video playing time axis corresponding to the motion sensing signal from the video control file.
With reference to the second aspect, the first or second implementation manner of the second aspect, in a third implementation manner of the second aspect, the video playback control information is human body physiological index control information;
the receiving module includes:
the third receiving unit is used for receiving the physiological index sensing signal acquired by the sensor; the sensor is arranged on the body of a video viewer, and the physiological sensing signal is triggered by the change of the physiological index of the video viewer;
the physiological index analysis unit is used for analyzing the physiological index sensing signal to obtain a physiological index value;
the first determining unit is used for determining the number of the threshold range interval where the physiological index value is located;
the query acquisition module is specifically configured to query a video control file corresponding to the video according to the threshold range interval number where the physiological index sensing signal is located, and acquire a first time point on a video playing time axis corresponding to the threshold range interval number where the physiological index sensing signal is located from the video control file.
With reference to the second aspect, the first or third implementation manner of the second aspect, in a fourth implementation manner of the second aspect, the video playing control information is brain wave control information;
the receiving module includes:
a fourth receiving unit for receiving brain waves of the user;
the brain wave analysis unit is used for analyzing the brain waves to obtain the attention concentration reading;
the second determining unit is used for determining the threshold range interval number of the attention concentration reading;
the query acquisition module is specifically configured to query a video control file corresponding to the video according to the threshold range interval number where the attention concentration reading is located, and acquire, from the video control file, a first time point on a video playing time axis corresponding to the threshold range interval number where the attention concentration reading is located.
In a third aspect, an embodiment of the present invention provides an electronic device, where the electronic device includes: the device comprises a shell, a processor, a memory, a circuit board and a power circuit, wherein the circuit board is arranged in a space enclosed by the shell, and the processor and the memory are arranged on the circuit board; a power supply circuit for supplying power to each circuit or device of the electronic apparatus; the memory is used for storing executable program codes; the processor reads the executable program code stored in the memory to run a program corresponding to the executable program code, so as to execute the video playing progress control method according to any one of the first aspect.
The embodiment of the invention provides a video playing progress control method and device and an electronic device. Video playing control information is received; a video control file corresponding to the video is queried according to the video playing control information, and a first time point on the video playing time axis corresponding to the video playing control information is acquired from the video control file; the first time point is sent to a video playing module, so that the video playing module jumps the currently playing video from the current playing time point to the first time point and continues playing from there. Therefore, the playing time point corresponding to the video content to be watched can be located simply and quickly, without manually dragging the progress bar and searching point by point, which improves the user experience of watching and editing videos in different application scenarios.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of a video playing progress control method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram illustrating a video playing progress control process based on voice control information according to an embodiment of the present invention;
FIG. 3 is a signaling diagram of a video playing progress control process based on voice control information according to an embodiment of the present invention;
FIG. 4 is a schematic diagram illustrating a video playing progress control process based on motion control information according to an embodiment of the present invention;
FIG. 5 is a signaling diagram of a video playing progress control process based on motion control information according to an embodiment of the present invention;
FIG. 6 is a diagram illustrating a video control file including motion control information according to an embodiment of the present invention;
FIG. 7 is a diagram illustrating a video control file including motion control information according to another embodiment of the present invention;
FIG. 8 is a flowchart illustrating a method for generating a video control file according to an embodiment of the present invention;
fig. 9A to 9D are schematic diagrams of an embodiment of a graphical user interface of a first electronic device having a video playing window according to an embodiment of the invention;
FIG. 10A is a signaling diagram illustrating a video control file generation process according to an embodiment of the present invention;
FIG. 10B is a signaling diagram illustrating a video control file generation process according to another embodiment of the present invention;
fig. 11 is a block diagram of a video playing progress control device according to an embodiment of the present invention;
fig. 12 is a schematic structural diagram of an electronic device according to an embodiment of the invention.
Detailed Description
Embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
It should be understood that the described embodiments are only some embodiments of the invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The video playing progress control method, the video playing progress control device and the electronic equipment are suitable for occasions of controlling, editing and applying videos in the fields of commerce, entertainment, teaching, scientific research and the like.
Example one
Fig. 1 is a flowchart of a video playing progress control method according to an embodiment of the present invention. As shown in Fig. 1, the method may be applied to a first electronic device having a video player; the first electronic device includes a mobile phone, a computer, an MP4 player, and the like. The method comprises the following steps:
step 101, receiving video playing control information.
In this embodiment, the video playing control information may be issued as sound, such as voice, i.e. voice control information; as something visible, such as a motion or posture, i.e. motion control information; as something measurable, such as a human physiological index, i.e. human physiological index control information; or as energy, such as brain waves, i.e. brain wave control information. For each of these carriers, a corresponding recognition device may be built into the first electronic device, or may be set up separately and send its output to the first electronic device. For example, when the information is issued as voice, a voice recognition module may be provided to recognize the voice information; when it is issued visually, such as a gesture, an image recognition module may be provided to recognize the motion pattern; when a motion can be sensed by a sensor, a sensor may be provided, for example a vibration sensor that senses a human touch on the electronic device and identifies the motion pattern; when the information is a human physiological index, such as heart rate, pulse rate, respiration rate, body temperature or blood pressure, a heart rate sensor, pulse sensor, respiration sensor, electronic thermometer, sphygmomanometer or the like may be used for acquisition, and the resulting electric signal is sent as the video playing control information; and when the information is issued as energy such as brain waves, a brain wave sensor may be used to collect the brain wave information, and a brain wave analysis module may be provided to analyze it.
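For illustration, routing the different carriers of control information to a matching recognizer could look like the following sketch. The recognizer functions are placeholders standing in for a speech recognition module, an image/vibration recognition module, sensor-signal parsing and brain wave analysis; they are not components named by this application.

```python
def recognize_speech(audio_bytes):
    return "skip the opening"            # placeholder for a speech recognition module


def classify_motion(sensor_samples):
    return "jump_gesture"                # placeholder for an image/vibration recognizer


def parse_physiological(sensor_samples):
    return 72.0                          # placeholder: e.g. a heart rate in bpm


def analyze_brainwave(eeg_samples):
    return 35                            # placeholder: attention concentration reading


RECOGNIZERS = {
    "voice": recognize_speech,
    "motion": classify_motion,
    "physiology": parse_physiological,
    "brainwave": analyze_brainwave,
}


def receive_control_info(raw_input, carrier):
    """Dispatch the raw input to the recognizer matching its carrier."""
    try:
        return RECOGNIZERS[carrier](raw_input)
    except KeyError:
        raise ValueError(f"unsupported control information carrier: {carrier}")
```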
The video playing control information may be received passively, as information sent by another device, or acquired or collected actively.
When the method is applied, a specified video file can be selected for playing from a video file library, which may be pre-stored locally or obtained from an internet cloud or the like.
And step 102, inquiring a video control file corresponding to the video according to the video playing control information, and acquiring a first time point on a video playing time axis corresponding to the video playing control information from the video control file.
The video control file stores mapping relation between video playing control information and time points on a video playing time axis.
The video playing control information may be expressed in Arabic numeral codes, graphic symbols (including action graphics such as a fist or clapping), English letters, Chinese characters, or any other notation that can serve as a label. For example, the video playing control information may be 001, 002, ...; or graphic symbols; or a, b, ...; or phrases such as "skip the opening" or "jump to the female lead's first appearance". These examples are only intended to help understand how the video playing control information of this embodiment may be expressed; since they cannot be exhaustive, those skilled in the art may derive further expressions from the technical concept of this embodiment.
The mapping relationship between the video playing control information and the time point on the video playing time axis can be stored in a table form, and can also be stored in a program form.
In an optional embodiment, the video control file further stores a mapping relationship between the video playing control information and other playing actions that the video can execute, for example, the video playing control information respectively corresponds to a video pause action, a volume up action, a volume down action, and the like.
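The two storage forms mentioned above can be pictured as follows; the keys, time values and action names are invented for illustration and are not a format defined by this application.

```python
# "Table form": a plain mapping from control information to a time point or action.
CONTROL_TABLE = {
    "skip the opening": {"seek": 15.0},
    "jump to the climax": {"seek": 312.5},
    "pause": {"action": "pause"},
    "volume up": {"action": "volume", "delta": 0.10},
}


# "Program form": the same mapping expressed directly as code.
def control_entry(control_info):
    if control_info == "skip the opening":
        return {"seek": 15.0}
    if control_info == "pause":
        return {"action": "pause"}
    return None
```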
Step 103, sending the first time point to a video playing module, so that the video playing module jumps to the first time point from the current playing time point and plays the video currently played.
The first time point can be sent to the time axis control plug-in, so that the time axis control plug-in controls the video played by the video playing module to jump from the current playing time point to the first time point and play the corresponding video content. Therefore, the playing time point corresponding to the video content to be watched can be located conveniently and quickly, without manually dragging the video progress bar and searching point by point.
The embodiment of the invention provides a video playing progress control method: video playing control information is received; a video control file corresponding to the video is queried according to the video playing control information, and a first time point on the video playing time axis corresponding to the video playing control information is acquired from the video control file; and the first time point is sent to a video playing module, so that the video playing module jumps the currently playing video from the current playing time point to the first time point and continues playing from there. Therefore, the playing time point corresponding to the video content to be watched can be located conveniently and quickly, without manually dragging the video progress bar and searching point by point, which improves the user experience in different application scenarios.
The application scenario may be a video editing scenario, a video playing scenario, a human-computer interaction scenario, and so on. To aid understanding, take video editing as an example: if an editor wants to clip the content before a certain video frame, in the prior art the editor has to find the corresponding frame by dragging the playing progress bar in the video playing window and searching position by position. With the scheme of the embodiment of the invention, the editor only needs to input video playing control information, such as voice control information. When the video control file is built into the video player, the player looks up the corresponding first playing time point in the video control file based on the control information and sends it to the time axis control plug-in (or the plug-in actively acquires it); the time axis control plug-in then automatically jumps the currently playing video to that time point. The editor does not need to search manually, so the time point corresponding to the video frame to be clipped can be located quickly and automatically, making the clipping process simple and efficient. The idea is essentially the same in video playing and other scenarios and is not illustrated again here.
In this embodiment, the video control file may be integrated into a video player having a video playing module, or stored in a separate control device. To enhance the interest of human-computer interaction, a method for generating a video control file with simple operations is also provided, which reduces the difficulty of producing such a file and lets a user customize it easily. The specific method for generating the video control file is described in Embodiment Two.
FIG. 2 is a schematic diagram illustrating a video playing progress control process based on voice control information according to an embodiment of the present invention; FIG. 3 is a signaling diagram of a video playing progress control process based on voice control information according to an embodiment of the present invention; referring to fig. 2 and 3, specifically, when the video playing control information is voice control information; the receiving video playing control information includes:
receiving voice control information input by a user; and identifying the voice control information to obtain a command keyword in the voice control information.
The command keyword may be a number, a letter and/or Chinese characters, etc. The command keywords shown in Fig. 2 are Chinese characters, but other representations, such as 001 or 002, may also be used.
The querying, according to the video playing control information, a video control file corresponding to the video, and obtaining a first time point on a video playing time axis corresponding to the video playing control information from the video control file, includes:
and inquiring a video control file corresponding to the video according to the command keyword, and acquiring a first time point on a video playing time axis corresponding to the voice control information from the video control file.
When querying the video control file, in some cases complete matching can be carried out on the command keyword: this is suitable when the voice control information input by the user exactly matches the voice control information stored in the video control file, and it improves matching efficiency and accuracy.
In other cases, fuzzy matching can be carried out on the command keyword: for example, the voice control information input by the user does not exactly match what is stored in the video control file. Illustratively, the stored voice control information is "jump to the female lead's first appearance" while the user only says "female lead's first appearance"; fuzzy matching is suitable here, so the user's voice input can be simplified while still achieving the intended control.
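A minimal sketch of the two matching strategies, assuming the stored keywords are plain strings; the similarity cutoff is an arbitrary choice, not a value taken from this application.

```python
import difflib


def match_keyword(command_keyword, stored_keywords, fuzzy=True, cutoff=0.6):
    # Complete matching: the recognized keyword equals a stored keyword exactly.
    if command_keyword in stored_keywords:
        return command_keyword
    if not fuzzy:
        return None
    # Fuzzy matching: substring containment first, then the closest match by similarity.
    for stored in stored_keywords:
        if command_keyword in stored or stored in command_keyword:
            return stored
    close = difflib.get_close_matches(command_keyword, stored_keywords, n=1, cutoff=cutoff)
    return close[0] if close else None
```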
The video control file shown in Fig. 2 contains the mapping relationship between command keywords and playing time point information. A specific example of a video control file for voice control information is shown in Table 1.

User input of voice control information | Player execution action
Skip the opening                        | Jump to time 1
Jump to the leading role's appearance   | Jump to time 2
Climax                                  | Jump to time 3
Run-away segment                        | Jump to time 4
Jump to the ending                      | Jump to time 5

TABLE 1
In some embodiments, the video control file may further contain mapping relationships between the voice control information and other actions performed by the player, for example a mapping between voice control information and pause. Another specific example of a voice-control video control file is shown in Table 2:

User input of voice control information | Player execution action
Skip the opening                        | Jump to time 1
Jump to the leading role's appearance   | Jump to time 2
Climax                                  | Jump to time 3
Run-away segment                        | Jump to time 4
Jump to the ending                      | Jump to time 5
Play                                    | Play
Pause                                   | Pause
Fast forward                            | Fast forward 10 seconds
Fast backward                           | Fast backward 10 seconds
Volume up                               | Increase the volume by 10%
Volume down                             | Decrease the volume by 10%

TABLE 2
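A sketch of how the non-jump rows of Table 2 could be executed alongside time-point jumps; the player interface (seek, pause, set_volume) is assumed for illustration and is not defined by this application.

```python
def execute_entry(player, entry, current_time):
    """Carry out one control-file entry: either a jump or another playback action."""
    if "seek" in entry:
        player.seek(entry["seek"])                         # jump to the mapped time point
    elif entry.get("action") == "pause":
        player.pause()
    elif entry.get("action") == "fast_forward":
        player.seek(current_time + 10)                     # fast forward 10 seconds
    elif entry.get("action") == "rewind":
        player.seek(max(0.0, current_time - 10))           # fast backward 10 seconds
    elif entry.get("action") == "volume":
        player.set_volume(player.volume + entry["delta"])  # raise or lower the volume
```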
In addition, as can be seen from the foregoing description, the voice control information may also be represented by numbers, letters, and the like, which are only used for facilitating understanding and are not to be considered as exclusive limitations.
The video playing progress control method provided by the embodiment of the invention controls the video playing progress based on voice control information, and can automatically locate the playing time corresponding to the video frame to be played conveniently, quickly and accurately. According to the video control file generation method mentioned above, the specific mapping relationship in the video control file can be customized (the generation method for voice-controlled video playing is described in Embodiment Two and Fig. 10A), which in some application scenarios adds interest to the control. For example, a video shows a singer performing, and a user wants to learn the song being sung in the video, such as "you ever". In a video control file generated according to the generation method, the voice control information is expressed by the song's lyrics; for instance, the lyric line "once dreamed of taking up a sword and roaming to the skyline" corresponds to the playing time point 0:10. The video control file thus contains the mapping relationship between lyric lines and the playing time points at which the song reaches those lyrics; see Table 3. By speaking the corresponding lyric line, the user can quickly jump to the matching playing time point and learn to sing that passage, which makes learning and listening convenient and makes the learning more engaging.
TABLE 3 (reproduced as an image in the original publication; it maps lyric lines of the song to the corresponding playing time points)
It should be understood that the technical effects and the schemes of the embodiments of the present invention are illustrated only by learning songs, and the lines of characters in a video may be used as voice control information in other video application scenarios, so as to enhance the interest and convenience of human-computer interaction and learning. The method can be particularly used for quickly finding the corresponding segment in scenes such as VR (virtual reality) playing, teaching, video segment commercial display or application and the like. Based on the conception of the invention, more application examples can be derived and are within the protection scope of the invention.
FIG. 4 is a schematic diagram illustrating a video playing progress control process based on motion control information according to an embodiment of the present invention; FIG. 5 is a signaling diagram of a video playing progress control process based on motion control information according to an embodiment of the present invention; referring to fig. 4 and 5, in an embodiment of the present invention, when the video playback control information is motion control information, the receiving the video playback control information includes:
and receiving the action sensing signal collected by the sensor.
In one embodiment, the sensor is arranged on the limb of the video viewer and is used for acquiring the action mode of the video viewer; the motion sensing signal is triggered by a limb motion of a video viewer. For example, a sensor such as a gyroscope or an accelerometer is attached to a movable part of a human body, and an operation pattern such as rotation or movement of the human body can be acquired.
In another embodiment, the sensor is an image sensor disposed where the user can conveniently be captured; for example, it is mounted on the video player or set up separately from it, is used to acquire the action mode of the user or of an electronic device used by the user, and communicates with the video player by cable, Bluetooth, a local area network or the like.
Alternatively, the sensor is arranged on an electronic device such as an adult product, a human body model or a robot, and the motion sensing signal is triggered when the video viewer touches the sensor on the device. For example, when the electronic device is a mannequin for acupuncture practice, a plurality of pressure sensors are longitudinally arranged at each acupuncture point; during needling, the motion sensing signals collected by the pressure sensors are triggered by the user touching the sensors, and the sensed action mode is used as the video playing control information.
When the electronic device is an airplane cup, two or more pressure sensors are arranged at intervals along the axial direction of its inner wall. When a body part contacts the first pressure sensor near the mouth of the cup, a first motion sensing signal is generated; contacting the second pressure sensor generates a second motion sensing signal. From the interval between the two signals, the user's action mode, for example linear piston motion and its rhythm (speed), can be determined and used as the video playing control information.
When the electronic device is an adult product shaped like a doll, a vibration sensor is arranged at the corresponding part; when that part is patted, the vibration sensor is triggered and collects a motion sensing signal of the patting action mode, which is used as the video playing control information.
The above examples are provided to aid understanding of the embodiments of the present invention, and should not be construed as limiting the technical solutions of the embodiments.
The querying, according to the video playing control information, a video control file corresponding to the video, and obtaining a first time point on a video playing time axis corresponding to the video playing control information from the video control file, includes: and inquiring a video control file corresponding to the video according to the action sensing signal, and acquiring a first time point on a video playing time axis corresponding to the action sensing signal from the video control file.
In some embodiments, the video control file comprises a mapping relationship between motion patterns and video playing time points; the action mode can be customized according to the video control file generation method. To help understand the video play progress control scheme of the present embodiment, an exemplary video control file is shown in fig. 6.
In the foregoing embodiment, after the action mode is identified based on the action sensing signal sent by the sensor, the corresponding play time point can be obtained by querying the video control file based on the action mode, and the play time point is sent to the video play module, so that the corresponding content is played by jumping to the play time point.
In other embodiments, the video control file may also be as shown in FIG. 7.
In this embodiment, after the action mode is identified from the motion sensing signal sent by the sensor, a preset mapping table between action modes and command keywords may be queried based on that action mode; the command keyword corresponding to the action mode is determined, the video control file is queried based on the command keyword to obtain the corresponding playing time point, and the playing time point is sent to the video playing module, which jumps to that playing time point to play the corresponding content.
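The two-stage lookup of this variant (action mode → command keyword → playing time point) could be sketched as below; both tables are invented examples.

```python
ACTION_TO_KEYWORD = {
    "jump_gesture": "jump to the jumping scene",
    "clap": "skip the opening",
}

KEYWORD_TO_TIME = {
    "jump to the jumping scene": 95.0,
    "skip the opening": 15.0,
}


def time_point_for_action(action_mode):
    """Map an identified action mode to its playing time point, if any."""
    keyword = ACTION_TO_KEYWORD.get(action_mode)
    return KEYWORD_TO_TIME.get(keyword) if keyword else None
```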
The video playing progress control method provided by the embodiment of the invention controls the video playing progress based on motion control information, and can automatically locate the playing time corresponding to the video frame to be played conveniently, quickly and accurately. According to the video control file generation method mentioned above, the specific mapping relationship in the video control file can be customized (the generation method is described in Embodiment Two and Fig. 10B), which in some application scenarios adds interest to the control. For example, Fig. 4 shows the leading role's entrance, jumping and running actions and the time points at which they occur in the video. If, while watching, the user wants to skip the entrance and watch the jumping directly, the user only needs to make the corresponding jumping action. The sensor detects the action, the action mode in the user's motion control information is identified, and the video control file is queried based on that mode. Because the mapping between motion control information (or action modes) and playing time points is preset in the video control file, the playing time point corresponding to the action mode is obtained and sent to the video playing module, which jumps to that time and plays the video frames of the jumping content. In this way, the video content to be watched can be located quickly, and the control becomes more engaging.
With the development of medical science, many physiological indexes of the human body have become quantifiable; for example, heart rate, pulse rate, respiratory rate, body temperature and blood pressure can all be measured. With the application of computer science in medicine, the combination of measurement hardware and software has made intelligent monitoring and output of human physiological indexes possible.
At present, the application of human physiological indexes is largely limited to health prevention and examination, and there are few other applications. If these measurable indexes can be combined with other technologies, new application fields can be opened up and better technical results obtained, which is significant for interdisciplinary research and development. In one embodiment of this application, human physiological indexes are creatively applied to the technical field of control to control video playing.
Specifically, the video playing control information is human body physiological index control information.
The receiving video playing control information includes:
receiving a physiological index sensing signal acquired by a sensor; the sensor is arranged on the body of a video viewer, and the physiological index sensing signal is triggered by a change in the viewer's physiological index. The physiological indexes include the respiratory frequency, blood pressure, pulse rate, heart rate, body temperature and the like of the human body. For example, a respiration sensor can be worn on the body to acquire the respiratory frequency; a sphygmomanometer can be worn to collect blood pressure; a thermometer can be used to collect body temperature; and the indexes can also be collected through smart wearable devices such as a smart watch.
Analyzing the physiological index sensing signal to obtain a physiological index value; and determining the number of the threshold range interval in which the physiological index value is positioned.
In this embodiment, in order to control video playing based on physiological indicators, the video control file specifically stores: the corresponding relation between the labels of the multiple threshold range intervals of the physiological indexes and the video playing time points; therefore, after the threshold range interval number where the physiological index value is located is determined, the corresponding playing time point can be obtained by inquiring the video control file, and the video playing progress can be controlled.
Specifically, the querying a video control file corresponding to the video according to the video playing control information, and acquiring a first time point on a video playing time axis corresponding to the video playing control information from the video control file includes:
and inquiring a video control file corresponding to the video according to the threshold range interval number of the physiological index sensing signal, and acquiring a first time point on a video playing time axis corresponding to the threshold range interval number of the physiological index sensing signal from the video control file.
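A sketch of determining the threshold range interval number for a physiological index value and looking up the mapped time point; the interval boundaries (here a heart rate in bpm) and the time points are assumptions for illustration.

```python
import bisect

HEART_RATE_BOUNDARIES = [60, 80, 100, 120]                           # interval edges in bpm
INTERVAL_TO_TIME = {0: 0.0, 1: 120.0, 2: 240.0, 3: 360.0, 4: 480.0}  # interval number -> time point


def interval_number(value, boundaries=HEART_RATE_BOUNDARIES):
    # Interval 0: value < 60; interval 1: 60 <= value < 80; and so on.
    return bisect.bisect_right(boundaries, value)


def time_point_for_index(value):
    return INTERVAL_TO_TIME.get(interval_number(value))
```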
In the embodiment of the invention, the human body physiological indexes are used as control information, and the video frame content to be watched can be quickly positioned through the steps.
In addition, when human physiological indexes are used as control information, the video can be controlled differently for different people in some application scenarios, because different people respond to the same stimulus with different physiological indexes. For example, when a patient is treated with somatosensory music therapy, a specially produced somatosensory music video is played; while listening, the patient's physiological indexes, such as heartbeat and blood pressure, change at a preset frequency with the vibration of the music, so that physiological and psychological adjustment and treatment are achieved. To automatically control which music section is played according to the patient's physiological changes during treatment, the technical scheme of this embodiment can be adopted: a heart rate sensor and a sphygmomanometer are arranged on the patient, or the patient wears a device that can measure heartbeat and blood pressure, such as a smart watch, to monitor changes in the physiological indexes. Suppose the patient's blood pressure is high and the therapy aims to keep it within a preset range by playing relaxing music, but when the 20th piece is played its vibration frequency is high, the patient's blood pressure and heartbeat accelerate, and the blood pressure leaves the preset range. The blood pressure and heartbeat values collected at that moment are sent as control signals to a controller, or to a video player with a controller. After receiving the physiological index sensing signals collected by the sensors, the controller analyzes them to obtain the physiological index values and determines the threshold range interval numbers in which values such as blood pressure and heartbeat fall. The video control file corresponding to the specially produced music video is then queried according to those interval numbers, and the first time point on the video playing time axis corresponding to them is acquired from the video control file, so that a music video segment matched to the current physiological state can be found and played quickly, helping to bring the patient's blood pressure, heartbeat and so on back within the preset range.
It should be understood that the foregoing examples are provided to help understand technical solutions of the embodiments of the present invention, and are not to be construed as limiting specific application scenarios of the embodiments of the present invention.
With the development of science and technology, research on human neurology has deepened, and brain wave sensors have emerged, providing material support for studying the activity of the human brain. In one embodiment of the present invention, the video playing control information is brain wave control information, and the receiving of video playing control information includes: receiving brain waves of a user.
A micro-current sensor can be implanted under the cerebral cortex or attached to the scalp to detect the tiny current changes produced during brain activity, and a micro-magnetic sensor can detect the magnetic field changes produced by brain currents; the latter works without contact and therefore has broader application prospects. Brain waves comprise several types of waves of different frequencies; those commonly used to study human brain activity are α waves, β waves, γ waves and the like. The brain wave sensor can send the brain wave control information to the control module via Bluetooth or a similar link.
Analyzing the brain waves to obtain the attention concentration reading; determining a threshold range interval number for the concentration reading.
The attention concentration reading ranges from 0 to 100 and is divided into numbered threshold range intervals corresponding to different degrees of concentration. For example, 0-20 usually indicates that attention is not concentrated and the viewer is restless; 20-40 indicates slight inattention and slight distraction; 40-60 indicates a generally concentrated state; and above 60 indicates concentrated attention.
It will be appreciated that when the user is watching a video, the higher the detected concentration, the more interested the user is in the currently playing content, and accordingly no fast forward (or essentially none) is required; conversely, when a lower concentration is detected, the video is automatically fast-forwarded.
The querying, according to the video playing control information, a video control file corresponding to the video, and obtaining a first time point on a video playing time axis corresponding to the video playing control information from the video control file, includes:
and inquiring a video control file corresponding to the video according to the number of the threshold range interval where the attention concentration reading is located, and acquiring a first time point on a video playing time axis corresponding to the number of the threshold range interval where the attention concentration reading is located from the video control file.
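A sketch that numbers the concentration intervals described above and maps them to a jump target; the target time points (and the choice to keep playing when attention is high) are illustrative assumptions.

```python
def concentration_interval(reading):
    """Return the threshold range interval number for a 0-100 concentration reading."""
    if reading < 20:
        return 0      # not concentrated / restless
    if reading < 40:
        return 1      # slightly distracted
    if reading < 60:
        return 2      # generally concentrated
    return 3          # concentrated


# None means: stay at the current playing time point (no jump).
INTERVAL_TO_JUMP_TARGET = {0: 600.0, 1: 480.0, 2: None, 3: None}


def brainwave_jump_target(reading):
    return INTERVAL_TO_JUMP_TARGET[concentration_interval(reading)]
```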
In this embodiment, the user's attention concentration is detected, and the video playing module can be controlled to jump automatically to a video playing time point as the user's attention changes. This control mode is novel and makes human-computer interaction more engaging.
Example two
In Embodiment One, the video control file may be integrated into a video player having a video playing module, or stored in a separate control device. To enhance the interest of human-computer interaction, this embodiment provides a method for generating a video control file with simple operations, which reduces the difficulty of producing such a file and lets a user customize it easily.
Referring to Figs. 8 to 10B, the video control file generation method is applied to a first electronic device with a video player; the first electronic device may be a mobile phone, a computer, a tablet, VR glasses, a holographic video projector, a television, an MP4 player, and the like, and the video player is integrated with a data file reading plug-in or a timeline control plug-in. The method comprises the following steps:
201. Play the specified video file. The video file may be pre-stored locally or in the cloud, or acquired from the internet; the content of the video file is not specifically limited.
202. Receive a selection operation for a first time point on the time axis of the video file.
When the video file is played on the player, a video playing time progress bar is arranged on the playing window and used for recording video playing time progress information, and a user can drag the video playing time progress bar or click a certain time point position of the video playing time progress bar according to needs so that the player can directly play video frames corresponding to the time point.
In this embodiment, receiving a selection operation for a first time point on the time axis of the video file specifically includes: after a first button triggering event is detected, determining the trigger time of the first button as the selection of the first time point on the time axis of the video file. The first button is a function key for generating video playing control information. For example, while the player is playing the video file, or after the user has adjusted playback to a certain time point through the play button and the video playing time progress bar, the content played at the current time point is to serve as the content played at the playing time point associated with a piece of video playing control information. The user, for example an editor, then presses the first button; when the electronic device detects the trigger event of the first button, the current playing time point on the progress bar is selected and recorded at the moment the button is triggered.
203. Associating the first time point with video playing control information to generate the video control file.
In this embodiment, the selected first time point is associated with one piece of action code and then stored as the video control file. The video control file contains associations between video playing progress time points and video playing control information. Repeating the above operations yields multiple associations, so that the control file holds a mapping between multiple pieces of video playing control information and multiple playing time points. The video control file may also include other information, for example mappings between video playing control information and operations such as play, pause, fast forward, rewind, volume up, or volume down.
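A minimal sketch of this association step, assuming the control file is a plain mapping from video playing control information to playing time points serialized as JSON (the file name and entry values are hypothetical):

```python
import json

# Assumed structure: video playing control information -> playing time point (seconds).
control_file = {}

def associate(control_info: str, first_time_point: float) -> None:
    """Record one association each time the editor selects a time point."""
    control_file[control_info] = first_time_point

# Repeating the operation accumulates multiple associations.
associate("123", 15.0)                  # a numeric code
associate("the character jumps", 30.0)  # a command word

# Persist the accumulated mapping as the video control file.
with open("video_control_file.json", "w", encoding="utf-8") as f:
    json.dump(control_file, f, ensure_ascii=False, indent=2)
```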
According to the method for generating a video control file provided by this embodiment of the invention, a user selects a first time point on the time axis of a video file while it is being played, thereby obtaining the current video playing time point, and associates that time point with one piece of video playing control information to generate the video control file. The whole generation process is simple to operate. Moreover, when the generated video control file is stored in a video control device, a video player, or another device and used to control the video playing progress, the playing time point corresponding to the video content to be watched can be located simply, conveniently, and quickly, which improves the user experience.
To help the reader clearly understand the technical solution and technical effects of the invention, a detailed description follows in conjunction with the specific process of generating a video control file on a mobile phone:
The mobile phone has video playing and editing functions. Referring to fig. 2 or fig. 4, during video playing, a character appears and performs actions at corresponding playing time points. To control the subsequent video playing progress, specifically, to locate the time point corresponding to a video frame to be played conveniently and quickly, a video control file needs to be prepared in advance. In the prior art, such a video control file generally has to be programmed by a technician with professional programming knowledge; an ordinary editor without that knowledge cannot easily obtain the control file through programming, so the video playing progress control scheme is difficult to popularize among ordinary users. With the control file generation scheme provided by this embodiment of the invention, while watching the video the editor only needs to select the playing time point of, for example, a character's jump in the video, say 15 s. The mobile phone obtains the playing time point corresponding to the jump as the 15th second and associates it with video playing control information, which may be the aforementioned numeric code, a command word formed by Chinese characters, a series of action symbols, and so on. A video control file containing the video playing control information associated with the playing time point can then be generated and stored, for example, as video playing control information (such as "123" or "the character jumps"): 15 s. By selecting in turn the playing time points corresponding to other video frames of characters in the video, a video control file containing multiple pieces of video playing control information associated with multiple playing time points of the video is generated. This generation method is simple to operate: an ordinary user can edit the video control file and achieve personalized customization of it, and controlling the video playing progress based on a personalized video control file enhances the interest of video playing progress control. The video playing progress control scheme provided by this embodiment of the invention is therefore well suited to popularization and dissemination in various application scenarios.
FIG. 9A is a schematic diagram of an embodiment of a graphical user interface of a first electronic device with a video playback window according to the present invention. Referring to fig. 3, in an embodiment of the present invention, associating the first time point with video playing control information to generate the video control file includes: after a first button triggering event is detected, acquiring the video playing control information corresponding to the first button based on a preset first mapping relation table of buttons and video playing control information; and associating the first time point with the acquired video playing control information to generate the video control file.
A plurality of first buttons may be preset on the first electronic device, each first button corresponding to one piece of video playing control information and each piece of video playing control information corresponding to one playing time point. According to the content played at a given time point, the user only needs to press the corresponding button for the video control file to be edited and generated automatically; the operation is simple, and the difficulty of editing and generating the video control file is reduced.
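The button-based workflow might look roughly like the sketch below, where a hypothetical first mapping relation table ties each preset button to one piece of video playing control information, and each button press records the current playing time; the identifiers and example time points are assumptions for illustration only.

```python
# Assumed first mapping relation table: button -> video playing control information.
BUTTON_TO_CONTROL_INFO = {
    "button_1": "action control information 1",
    "button_2": "action control information 2",
}

control_file = {}  # video playing control information -> playing time point (seconds)

def on_button_pressed(button_id: str, current_play_time: float) -> None:
    """Handle a first-button trigger event: look up its control information and
    associate it with the playing time at which the button was pressed."""
    control_info = BUTTON_TO_CONTROL_INFO[button_id]
    control_file[control_info] = current_play_time

# For example, the editor presses button 1 at 15 s and button 2 at 30 s.
on_button_pressed("button_1", 15.0)
on_button_pressed("button_2", 30.0)
```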
Referring to fig. 9B and 9C, in another embodiment of the present invention, a dialog window pops up after the first button triggering event is detected. A text input box and/or a voice dialog box is arranged in the dialog window. Video playing control information input by the user through the dialog window is received, and the first time point is associated with that information to generate the video control file.
In another embodiment of the present invention, after associating the first time point with video playback control information and generating a video control file, the method further comprises: storing the video control file locally in the first electronic equipment; or, packaging the video control file into a video file; or storing the video control file in a cloud server.
In order to facilitate understanding of the present invention, a technical solution of the embodiment of the present invention is described in detail with reference to a specific application scenario:
Referring to fig. 9A to 9D, assume that the graphical user interface with the video playing function of the electronic device in fig. 9D is a computer software client integrating a video playing module, robotic arm action editor buttons ① and ②, a timeline controller, and a robotic arm action simulation demonstrator (i.e., a playing window). It should be noted that the graphical user interfaces shown in fig. 9A, 9B, and 9C can also achieve the technical effects of the embodiments of the present invention; only the manner of inputting the video playing control information differs, and the basic concept is the same.
The video control file generation process comprises the following steps:
In this scenario, an editor selects, through the computer client, a video clip in which two palace maids stand on either side of an emperor and pat his back, and clicks the play button to play the video. Based on the video playing progress, the playing time points corresponding to the back-patting motions in the video are noted: for example, the first back pat by a maid (which can serve as motion control information 1, corresponding to button ①) occurs at playing time point 00:15 s, and the second back pat (motion control information 2, corresponding to button ②) occurs at playing time point 00:30 s.
While watching the video, the editor presses button ① when the first back-patting action of the palace maid occurs and presses button ② when the second back-patting action occurs. Each time button ① or button ② is pressed, the robot action editor automatically generates a mapping between the video playing time point label and the video playing control information (action control information), stores these mappings as the video control file, and saves the file locally on the robot or on a cloud server.
The video playing progress control process comprises the following steps:
In a video editing scenario, the robot action simulation demonstration window plays a video of a mechanical arm patting a back, which the editor uses for reference.
With the technical solution of this embodiment of the invention, an image sensor or another sensor capable of recognizing actions collects the back-patting action performed by the robot. The action is used as control information and sent to the video playing module; the video playing module queries the video control file based on the action control information, determines the corresponding playing time point, and jumps to that playing time point for playing, so that the time point corresponding to the video frame to be played is located simply and quickly.
Example three
Fig. 11 is a block diagram of a video playing progress control device according to an embodiment of the present invention, and referring to fig. 11, the video playing progress control device includes: a receiving module 111, configured to receive video playing control information; a query obtaining module 112, configured to query, according to the video playing control information, a video control file corresponding to the video, and obtain, from the video control file, a first time point on a video playing time axis corresponding to the video playing control information; the video control file stores the mapping relation between the video playing control information and the time point on the video playing time axis; a sending module 113, configured to send the first time point to a video playing module, so that the video playing module jumps to the first time point from the current playing time point and plays the currently played video.
An embodiment of the present invention provides an electronic device comprising a receiving module, a query acquisition module, and a sending module. The receiving module receives video playing control information; the query acquisition module queries a video control file corresponding to the video according to the video playing control information and obtains from the video control file a first time point on the video playing time axis corresponding to the video playing control information; and the sending module sends the first time point to the video playing module, so that the video playing module jumps from the current playing time point to the first time point and plays the currently played video from there. The playing time point corresponding to the video content to be watched can thus be located conveniently and quickly, without manually dragging the video progress bar and searching point by point, which improves the experience of watching and editing videos in different application scenarios.
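A minimal sketch of these three modules, assuming the video playing module exposes seek and play operations (the StubPlayer class and its method names are invented for illustration, not part of any real player API):

```python
class StubPlayer:
    """Hypothetical stand-in for the video playing module."""
    def seek_to(self, time_point: float) -> None:
        print(f"jump to {time_point} s")
    def play(self) -> None:
        print("playing")

class VideoProgressController:
    """Mirrors the receiving, query acquisition, and sending modules."""
    def __init__(self, video_control_file: dict, player: StubPlayer):
        # video_control_file: video playing control information -> time point (seconds)
        self.video_control_file = video_control_file
        self.player = player

    def receive(self, control_info: str) -> None:
        """Receiving module: accept video playing control information."""
        first_time_point = self.query(control_info)
        if first_time_point is not None:
            self.send(first_time_point)

    def query(self, control_info: str):
        """Query acquisition module: look up the first time point in the control file."""
        return self.video_control_file.get(control_info)

    def send(self, first_time_point: float) -> None:
        """Sending module: tell the video playing module to jump and play."""
        self.player.seek_to(first_time_point)
        self.player.play()

controller = VideoProgressController({"123": 15.0}, StubPlayer())
controller.receive("123")  # jumps to 15 s and plays
```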
In an embodiment of the present invention, the video playing control information is voice control information;
the receiving module includes:
the first receiving unit is used for receiving voice control information input by a user;
the voice recognition unit is used for recognizing the voice control information to obtain a command keyword in the voice control information;
the query obtaining module is specifically configured to query a video control file corresponding to the video according to the command keyword, and obtain a first time point on a video playing time axis corresponding to the voice control information from the video control file.
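For illustration only, a naive version of the voice-controlled lookup might extract the command keyword by simple substring matching against the keys of the control file; real speech recognition and keyword extraction are far more involved, and the keywords and time points below are assumptions.

```python
# Assumed control file: command keyword -> first time point (seconds).
video_control_file = {"jump": 15.0, "bow": 30.0}

def keyword_from_transcript(transcript: str):
    """Return the first command keyword found in the recognized speech text."""
    for keyword in video_control_file:
        if keyword in transcript:
            return keyword
    return None

def first_time_point_for_speech(transcript: str):
    """Query the control file with the extracted command keyword."""
    keyword = keyword_from_transcript(transcript)
    return video_control_file.get(keyword) if keyword else None

print(first_time_point_for_speech("please jump to the next scene"))  # 15.0
```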
In another embodiment of the present invention, the video playing control information is motion control information;
the receiving module includes:
the second receiving unit is used for receiving the action sensing signals collected by the sensor; the sensor is arranged on the limb of a video viewer, and the motion sensing signal is triggered by the limb motion of the video viewer, or the sensor is arranged on the electronic equipment, and the motion sensing signal is triggered by the motion of the video viewer touching the sensor of the electronic equipment;
the query acquisition module is specifically configured to query a video control file corresponding to the video according to the motion sensing signal, and acquire a first time point on a video playing time axis corresponding to the motion sensing signal from the video control file.
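Similarly, a rough sketch of the motion-controlled variant could classify a sensed signal into a named action and use that label to query the control file; the threshold-based classification, action names, and time points are purely illustrative assumptions.

```python
# Assumed control file: recognized action -> first time point (seconds).
video_control_file = {"raise_arm": 12.0, "clap": 47.5}

def classify_motion(acceleration_peak: float):
    """Very rough stand-in for recognizing an action from a motion sensing signal."""
    if acceleration_peak > 2.5:
        return "clap"
    if acceleration_peak > 1.2:
        return "raise_arm"
    return None

def first_time_point_for_motion(acceleration_peak: float):
    """Query the control file with the recognized action label."""
    action = classify_motion(acceleration_peak)
    return video_control_file.get(action) if action else None
```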
In another embodiment of the present invention, the video playing control information is brain wave control information;
the receiving module includes:
a fourth receiving unit for receiving brain waves of the user;
the brain wave analysis unit is used for analyzing the brain waves to obtain the attention concentration reading;
a second determination unit that determines a threshold range section number of the attention concentration reading;
the query acquisition module is specifically configured to query a video control file corresponding to the video according to the threshold range interval number where the attention concentration reading is located, and acquire a first time point on a video playing time axis corresponding to the threshold range interval number where the attention concentration reading is located from the video control file.
The video playing progress control method and device provided by the embodiments of the invention are suitable for editing, controlling, and other applications of multimedia files such as audio and video, for example playback control of teaching videos, playback control when watching movies, playback control in commercial advertisement display, fast searching for a particular video frame when an editor produces a video, and interactive control between people or equipment and video. The video frame to be displayed can be located quickly, the workload of manually controlling the video playing progress is reduced, and the user's hands are freed.
In addition, it should be noted that the methods and apparatuses of the foregoing embodiments are based on the same inventive concept; their implementation schemes and technical effects are similar, and the embodiments may be referred to for each other.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof.
In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
The embodiment of the invention also provides electronic equipment, and the electronic equipment comprises the device in any one of the embodiments.
Fig. 12 is a schematic structural diagram of an embodiment of an electronic device of the present invention, which may implement the flow of any one of the first and second embodiments of the present invention. As shown in fig. 12, the electronic device may include: a shell 41, a processor 42, a memory 43, a circuit board 44, and a power supply circuit 45. The circuit board 44 is arranged inside the space enclosed by the shell 41, and the processor 42 and the memory 43 are arranged on the circuit board 44; the power supply circuit 45 supplies power to each circuit or device of the electronic device; the memory 43 stores executable program code; and the processor 42, by reading the executable program code stored in the memory 43, runs a program corresponding to that code, so as to execute the video playing progress control method or the video control file generation method of any one of the foregoing embodiments.
For the specific execution process of the above steps by the processor 42 and the steps further executed by the processor 42 by running the executable program code, reference may be made to the description in the first and second embodiments of the present invention, which is not described herein again.
The electronic device exists in a variety of forms, including but not limited to:
(1) a mobile communication device: such devices are characterized by mobile communications capabilities and are primarily targeted at providing voice, data communications. Such terminals include: smart phones (e.g., iphones), multimedia phones, functional phones, and low-end phones, among others.
(2) Ultra mobile personal computer device: the equipment belongs to the category of personal computers, has calculation and processing functions and generally has the characteristic of mobile internet access. Such terminals include: PDA, MID, and UMPC devices, etc., such as ipads.
(3) A portable entertainment device: such devices can display and play multimedia content. This type of device comprises: audio, video players (e.g., ipods), handheld game consoles, electronic books, and smart toys and portable car navigation devices.
(4) A server: the device for providing the computing service comprises a processor, a hard disk, a memory, a system bus and the like, and the server is similar to a general computer architecture, but has higher requirements on processing capacity, stability, reliability, safety, expandability, manageability and the like because of the need of providing high-reliability service.
(5) And other electronic equipment with data interaction function.
It is further noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element. All the embodiments in the present specification are described in a related manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments.
In particular, for the apparatus embodiment, since it is substantially similar to the method embodiment, the description is relatively simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
For convenience of description, the above devices are described separately in terms of functional division into various units/modules. Of course, the functionality of the units/modules may be implemented in one or more software and/or hardware implementations of the invention.
From the above description of the embodiments, it is clear to those skilled in the art that the present invention can be implemented by software plus necessary general hardware platform. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which may be stored in a storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method according to the embodiments or some parts of the embodiments.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
The above description is only for the specific embodiment of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention are included in the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. A video playing progress control method is characterized by comprising the following steps:
receiving video playing control information;
inquiring a video control file corresponding to the video according to the video playing control information, and acquiring a first time point on a video playing time axis corresponding to the video playing control information from the video control file; the video control file stores the mapping relation between the video playing control information and the time point on the video playing time axis;
and sending the first time point to a video playing module so that the video playing module jumps to the first time point from the current playing time point and plays the video which is played currently.
2. The video playback progress control method according to claim 1, wherein the video playback control information is voice control information;
the receiving video playing control information includes:
receiving voice control information input by a user;
recognizing the voice control information to obtain a command keyword in the voice control information;
the querying, according to the video playing control information, a video control file corresponding to the video, and obtaining a first time point on a video playing time axis corresponding to the video playing control information from the video control file, includes:
and inquiring a video control file corresponding to the video according to the command keyword, and acquiring a first time point on a video playing time axis corresponding to the voice control information from the video control file.
3. The video playback progress control method according to claim 1, wherein the video playback control information is action control information;
the receiving video playing control information includes:
receiving an action sensing signal acquired by a sensor; the sensor is arranged on the limb of a video viewer, and the motion sensing signal is triggered by the limb motion of the video viewer, or the sensor is arranged on the electronic equipment, and the motion sensing signal is triggered by the body of the video viewer contacting the sensor of the electronic equipment;
the querying, according to the video playing control information, a video control file corresponding to the video, and obtaining a first time point on a video playing time axis corresponding to the video playing control information from the video control file, includes:
and inquiring a video control file corresponding to the video according to the action sensing signal, and acquiring a first time point on a video playing time axis corresponding to the action sensing signal from the video control file.
4. The video playback progress control method according to claim 1, wherein the video playback control information is human physiological index control information;
the receiving video playing control information includes:
receiving a physiological index sensing signal acquired by a sensor; the sensor is arranged on the body of a video viewer, and the physiological sensing signal is triggered by the change of the physiological index of the video viewer;
analyzing the physiological index sensing signal to obtain a physiological index value;
determining the number of the threshold range interval where the physiological index value is located;
the querying, according to the video playing control information, a video control file corresponding to the video, and obtaining a first time point on a video playing time axis corresponding to the video playing control information from the video control file, includes:
and inquiring a video control file corresponding to the video according to the threshold range interval number of the physiological index sensing signal, and acquiring a first time point on a video playing time axis corresponding to the threshold range interval number of the physiological index sensing signal from the video control file.
5. The video playback progress control method according to claim 1, characterized in that the video playback control information is brain wave control information;
the receiving video playing control information includes:
receiving brain waves of a user;
analyzing the brain waves to obtain the attention concentration reading;
determining a threshold range interval number in which the concentration reading is located;
the querying, according to the video playing control information, a video control file corresponding to the video, and obtaining a first time point on a video playing time axis corresponding to the video playing control information from the video control file, includes:
and inquiring a video control file corresponding to the video according to the number of the threshold range interval where the attention concentration reading is located, and acquiring a first time point on a video playing time axis corresponding to the number of the threshold range interval where the attention concentration reading is located from the video control file.
6. A video playback progress control apparatus, characterized in that the apparatus comprises:
the receiving module is used for receiving video playing control information;
the query acquisition module is used for querying a video control file corresponding to the video according to the video playing control information and acquiring a first time point on a video playing time axis corresponding to the video playing control information from the video control file; the video control file stores the mapping relation between the video playing control information and the time point on the video playing time axis;
and the sending module is used for sending the first time point to the video playing module so that the video playing module skips the currently played video from the current playing time point to the first time point and plays the video.
7. The apparatus of claim 6, wherein the video playback control information is voice control information;
the receiving module includes:
the first receiving unit is used for receiving voice control information input by a user;
the voice recognition unit is used for recognizing the voice control information to obtain a command keyword in the voice control information;
the query obtaining module is specifically configured to query a video control file corresponding to the video according to the command keyword, and obtain a first time point on a video playing time axis corresponding to the voice control information from the video control file.
8. The apparatus of claim 6, wherein the video playback control information is motion control information;
the receiving module includes:
the second receiving unit is used for receiving the action sensing signals collected by the sensor; the sensor is arranged on the limb of a video viewer, and the motion sensing signal is triggered by the limb motion of the video viewer, or the sensor is arranged on the electronic equipment, and the motion sensing signal is triggered by the motion of the video viewer touching the sensor of the electronic equipment;
the query acquisition module is specifically configured to query a video control file corresponding to the video according to the motion sensing signal, and acquire a first time point on a video playing time axis corresponding to the motion sensing signal from the video control file.
9. The apparatus of claim 6, wherein the video playback control information is human physiological index control information;
the receiving module includes:
the third receiving unit is used for receiving the physiological index sensing signal acquired by the sensor; the sensor is arranged on the body of a video viewer, and the physiological sensing signal is triggered by the change of the physiological index of the video viewer;
the physiological index analysis unit is used for analyzing the physiological index sensing signal to obtain a physiological index value;
the first determining unit is used for determining the number of the threshold range interval where the physiological index value is located;
the query module is specifically configured to query a video control file corresponding to the video according to the threshold range interval number where the physiological indicator sensing signal is located, and acquire a first time point on a video playing time axis corresponding to the threshold range interval number where the physiological indicator sensing signal is located from the video control file.
10. The apparatus of claim 6, wherein the video playback control information is brain wave control information;
the receiving module includes:
a fourth receiving unit for receiving brain waves of the user;
the brain wave analysis unit is used for analyzing the brain waves to obtain the attention concentration reading;
a second determination unit that determines a threshold range interval number in which the attention concentration reading is located;
the query acquisition module is specifically configured to query a video control file corresponding to the video according to the threshold range interval number where the attention concentration reading is located, and acquire a first time point on a video playing time axis corresponding to the threshold range interval number where the attention concentration reading is located from the video control file.

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (Application publication date: 20200225)