CN111510785B - Video playing control method, device, terminal and computer readable storage medium - Google Patents


Info

Publication number
CN111510785B
CN111510785B (application CN202010302812.7A)
Authority
CN
China
Prior art keywords
terminal
earphone
target image
controlling
video picture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010302812.7A
Other languages
Chinese (zh)
Other versions
CN111510785A (en)
Inventor
李赟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202010302812.7A priority Critical patent/CN111510785B/en
Publication of CN111510785A publication Critical patent/CN111510785A/en
Application granted granted Critical
Publication of CN111510785B publication Critical patent/CN111510785B/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/442 Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N 21/44213 Monitoring of end-user related data
    • H04N 21/44218 Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F 1/26 Power supply means, e.g. regulation thereof
    • G06F 1/32 Means for saving power
    • G06F 1/3203 Power management, i.e. event-based initiation of a power-saving mode
    • G06F 1/3206 Monitoring of events, devices or parameters that trigger a change in power modality
    • G06F 1/3231 Monitoring the presence, absence or movement of users
    • G06F 1/3234 Power saving characterised by the action undertaken

Abstract

The present application belongs to the technical field of video playing, and in particular relates to a video playing control method, apparatus, terminal and computer-readable storage medium. The method includes: detecting the wearing state of an earphone connected to a terminal, and acquiring a target image captured by a camera of the terminal; performing face recognition on the target image to obtain a face recognition result; and separately controlling the playing and pausing of the terminal's video picture and audio content according to the wearing state of the earphone and the face recognition result of the target image, thereby improving the flexibility of the terminal's video playing control.

Description

Video playing control method, device, terminal and computer readable storage medium
Technical Field
The present application belongs to the field of video playing technologies, and in particular, to a method, an apparatus, a terminal, and a computer-readable storage medium for controlling video playing.
Background
As terminals such as mobile phones provide users with ever richer video content, more and more users watch video on such terminals, and their expectations of the video playing function rise accordingly.
However, at present, when a user watches a video on a terminal, playing and pausing can only be controlled through the playback control key of the video playing application, so the flexibility of video playing control is poor.
Disclosure of Invention
The embodiment of the application provides a video playing control method, a video playing control device, a terminal and a computer readable storage medium, which can improve the flexibility of video playing control of the terminal.
A first aspect of an embodiment of the present application provides a method for controlling video playing, including:
detecting the wearing state of an earphone connected with a terminal, and acquiring a target image acquired by a camera of the terminal;
carrying out face recognition on the target image to obtain a face recognition result of the target image;
and respectively controlling the playing and the pausing of the video picture and the audio content of the terminal according to the wearing state of an earphone connected with the terminal and the face recognition result of the target image.
A second aspect of the embodiments of the present application provides a control apparatus for video playing, including:
the detection unit is used for detecting the wearing state of an earphone connected with the terminal and acquiring a target image acquired by a camera of the terminal;
the recognition unit is used for carrying out face recognition on the target image to obtain a face recognition result of the target image;
and the control unit is used for respectively controlling the playing and the pausing of the video picture and the audio content of the terminal according to the wearing state of the earphone connected with the terminal and the face recognition result of the target image.
A third aspect of the embodiments of the present application provides a terminal, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the method when executing the computer program.
A fourth aspect of the embodiments of the present application provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the computer program implements the steps of the above method.
In the embodiments of the application, the wearing state of the earphone connected to the terminal is detected, face recognition is performed on the target image captured by the terminal's camera, and the playing and pausing of the terminal's video picture and audio content are then controlled separately according to the wearing state and the face recognition result. This automates play/pause control during video playback and, crucially, controls the video picture and the audio content independently rather than synchronously. For example, the terminal can be controlled to pause the video picture while continuing to play the audio content, or to continue the video picture while pausing the audio content, instead of always playing or pausing both simultaneously. This improves the flexibility of the terminal's video playing control and, at the same time, reduces the terminal's energy consumption to a certain degree.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings required by the embodiments are briefly introduced below. It should be understood that the following drawings illustrate only some embodiments of the present application and should therefore not be regarded as limiting the scope; those skilled in the art can derive other related drawings from them without inventive effort.
Fig. 1 is a schematic flow chart illustrating an implementation process of a control method for video playing provided in an embodiment of the present application;
fig. 2 is a schematic flowchart illustrating a specific implementation of step 103 of a video playing control method according to an embodiment of the present application;
fig. 3 is a schematic flowchart of a first specific implementation of step 101 of a video playing control method according to an embodiment of the present application;
fig. 4 is a schematic flowchart of a second specific implementation of step 101 of a video playing control method according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a control apparatus for video playing according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a terminal according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
In order to explain the technical means of the present application, the following description will be given by way of specific examples.
At present, when a user watches a video on a terminal, playing and pausing can only be controlled through the playback control key of the video playing application, and the video picture and the corresponding audio content can only be controlled synchronously; that is, the video picture and audio content can only be played or paused simultaneously, and independent control of the two is not possible, so the flexibility of video playing control is poor.
Based on this, embodiments of the present application provide a method, an apparatus, a terminal, and a computer-readable storage medium for controlling video playing, which can improve flexibility of video playing control of a terminal and reduce energy consumption of the terminal.
Fig. 1 shows a schematic flowchart of a first implementation of the video playing control method provided in an embodiment of the present application. The method may be applied to a terminal and executed by a video playing control apparatus configured on the terminal, or applied to an earphone and executed by a video playing control apparatus configured on the earphone. In this embodiment, the terminal may be a mobile phone, a tablet computer, or any other terminal capable of playing video.
Specifically, the control method for playing the video may include steps 101 to 103.
Step 101, detecting a wearing state of an earphone connected with a terminal, and acquiring a target image acquired by a camera of the terminal.
In the embodiment of the application, when the application running in the foreground of the terminal is playing a video and an earphone is connected, the wearing state of the earphone can be detected and the target image captured by the terminal's camera can be acquired. The playing and pausing of the terminal's video picture and audio content are then controlled separately according to the wearing state of the earphone and the face recognition result of the target image, improving the flexibility of the terminal's video playing control.
In the embodiment of the present application, the wearing state of the earphone connected to the terminal may be either a normal wearing state or an unworn state. The normal wearing state is the state in which the earphone is positioned at the user's ear; the unworn state is the state in which it is not.
And 102, carrying out face recognition on the target image to obtain a face recognition result of the target image.
In this embodiment of the application, the target image is captured by the camera located on the display-screen side of the terminal, and performing face recognition on the target image determines whether the user is watching the video picture on the terminal.
Specifically, when a user is watching a video picture on a terminal, the face recognition result of the target image is that the target image contains a target face; when the user does not watch the video picture on the terminal, the face recognition result of the target image is that the target image does not contain the target face.
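The patent does not prescribe a particular face-recognition algorithm for this step. The following is a minimal, illustrative sketch with the detector abstracted as a pluggable callback; the names `has_target_face` and `stub_detector` are assumptions, not part of the disclosure.

```python
from typing import Callable, List, Tuple

# A detector returns bounding boxes (x, y, w, h) for faces found in a frame.
Detector = Callable[[bytes], List[Tuple[int, int, int, int]]]

def has_target_face(image: bytes, detector: Detector) -> bool:
    """Face recognition result for the target image: True if at least one
    face is found, i.e. the user is presumed to be watching the screen."""
    return len(detector(image)) > 0

# Stub standing in for a real face recognizer (e.g. a mobile vision API);
# it "finds" one face whenever the frame is non-empty, for demonstration.
def stub_detector(image: bytes) -> List[Tuple[int, int, int, int]]:
    return [(10, 10, 64, 64)] if image else []
```

In a real terminal the detector would be backed by the platform's face-detection service; only the boolean result feeds into the control logic of step 103.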
And 103, respectively controlling the playing and the pausing of the video picture and the audio content of the terminal according to the wearing state of the earphone connected with the terminal and the face recognition result of the target image.
Generally, when the terminal plays a video while an earphone is connected, the audio content corresponding to the video is output through the earphone. When the earphone is in the unworn state, the user is not listening to the audio content, so the terminal can be controlled to pause audio playback in order to reduce energy consumption. When the earphone is in the normal wearing state, the user is listening to the audio content through the earphone, so the terminal should keep playing the audio normally.
In addition, when the target image is recognized as containing the target face, the user is watching the video picture on the terminal, so the terminal should keep playing the video picture normally; when the target image is recognized as not containing the target face, the user is not watching the video picture, so the terminal can be controlled to pause the video picture in order to reduce energy consumption.
Therefore, in the embodiment of the application, the wearing state of the earphone connected to the terminal is detected, face recognition is performed on the target image captured by the terminal's camera, and the playing and pausing of the terminal's video picture and audio content are then controlled separately according to the two results. This automates play/pause control during video playback, realizes independent control of the video picture and the audio content, improves the flexibility of the terminal's video playing control, and reduces the terminal's energy consumption.
For example, as shown in fig. 2, in the embodiment of the present application, in the step 103, respectively controlling playing and pausing of the video picture and the audio content of the terminal according to the wearing state of the headset connected to the terminal and the face recognition result of the target image may include: step 201 to step 204.
Step 201, if the earphone connected with the terminal is not worn and the target image includes a target face, controlling the terminal to play a video image and controlling the terminal to pause playing the audio content corresponding to the video image.
For example, a user may want to rest their ears, or may need to hear external sounds, without interrupting the video. The user simply takes off the earphone; when the terminal or the earphone then detects that the earphone is unworn while the target image contains the target face, the terminal is controlled to keep playing the video picture normally and to pause the corresponding audio content. This enables flexible control of video picture and audio playback during video playing while reducing the terminal's energy consumption.
Step 202, if the earphone connected with the terminal is not worn and the target image does not contain the target face, controlling the terminal to pause playing the video picture and controlling the terminal to pause playing the audio content corresponding to the video picture.
For example, when a user has to interrupt video playback to deal with something else, the user can simply take off the earphone and turn away. When the terminal or the earphone detects that the earphone is unworn and the target image does not contain the target face, the terminal is directly controlled to pause both the video picture and the corresponding audio content.
And 203, if the earphone connected with the terminal is in a normal wearing state and the target image contains a target face, controlling the terminal to play a video picture and controlling the terminal to play audio content corresponding to the video picture.
For example, when the user has finished dealing with other matters and wants to continue watching, the terminal or the earphone detects that the earphone is in the normal wearing state and that the target image contains the target face, and the terminal is controlled to resume playing the video picture and the corresponding audio content, without the user having to press the playback control key of the video playing application. This improves the flexibility of play control during video playback.
And 204, if the earphone connected with the terminal is in a normal wearing state and the target image does not contain the target face, controlling the terminal to play audio content and controlling the terminal to pause playing the video picture corresponding to the audio content.
For example, the user may want to rest their eyes without interrupting the video, putting the terminal aside and following the content through the earphone alone. The terminal or the earphone detects that the earphone is in the normal wearing state and that the target image does not contain the target face, and the terminal is controlled to continue playing the audio content while pausing the corresponding video picture, enabling flexible control of playback while reducing the terminal's energy consumption.
As can be seen from the embodiments shown in fig. 1 and fig. 2, the playing and pausing of the terminal's video picture and audio content can be controlled independently rather than synchronously according to the wearing state of the earphone and the face recognition result of the target image: pausing or resuming the video picture does not affect the audio content, and pausing or resuming the audio content does not affect the video picture. This effectively improves the flexibility of the terminal's video playing control and reduces the terminal's energy consumption to a certain extent.
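The four cases of steps 201 to 204 reduce to two independent conditions: the face recognition result drives the video picture, and the wearing state drives the audio content. A minimal sketch of that decision table follows; the names (`Action`, `decide_playback`, etc.) are illustrative, not from the patent.

```python
from enum import Enum
from typing import NamedTuple

class Action(Enum):
    PLAY = "play"
    PAUSE = "pause"

class PlaybackDecision(NamedTuple):
    video: Action  # what to do with the video picture
    audio: Action  # what to do with the audio content

def decide_playback(headset_worn: bool, face_detected: bool) -> PlaybackDecision:
    """Map (earphone wearing state, face recognition result) to independent
    video/audio actions, per steps 201-204: the face result controls the
    video picture, the wearing state controls the audio content."""
    video = Action.PLAY if face_detected else Action.PAUSE
    audio = Action.PLAY if headset_worn else Action.PAUSE
    return PlaybackDecision(video, audio)

# Step 201: earphone off, face present -> video keeps playing, audio pauses.
assert decide_playback(False, True) == PlaybackDecision(Action.PLAY, Action.PAUSE)
```

Because the two outputs depend on disjoint inputs, changing one never affects the other, which is exactly the independent (rather than synchronous) control the embodiments describe.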
In some embodiments of the present application, to further reduce power consumption, after the terminal is controlled to pause the video picture, the method may further include: lowering the display brightness of the terminal to the minimum, and restoring the display brightness of the terminal's screen once the terminal resumes playing the video picture.
For example, after the terminal is controlled to pause playing of the video picture, if the target image is detected to contain the target face, the playing of the video picture can be resumed, and the display brightness of the display screen of the terminal is resumed.
In addition, in order to reduce the energy consumption of the terminal, as shown in fig. 3, in some embodiments of the present application, the detecting, in step 101, a wearing state of an earphone connected to the terminal and acquiring a target image captured by a camera of the terminal may further include: step 301 to step 302.
Step 301, detecting whether the motion state of the terminal changes.
In some embodiments of the present application, the detecting whether the motion state of the terminal changes may include: the method comprises the steps of detecting whether the acceleration of the terminal is larger than an acceleration threshold value or not by using an acceleration sensor, if the acceleration of the terminal is larger than the acceleration threshold value, determining that the motion state of the terminal is changed, and if the acceleration of the terminal is smaller than or equal to the acceleration threshold value, determining that the motion state of the terminal is not changed.
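The acceleration check above can be sketched as a small helper. The threshold value and function name below are illustrative assumptions, since the patent specifies neither; the input is assumed to be a gravity-compensated (linear) acceleration reading.

```python
import math

# Illustrative threshold in m/s^2; the patent does not give a value.
ACCEL_THRESHOLD = 1.5

def motion_state_changed(ax: float, ay: float, az: float) -> bool:
    """Step 301: the terminal's motion state is considered changed when the
    magnitude of the (linear) acceleration exceeds the threshold; otherwise
    the motion state is considered unchanged."""
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    return magnitude > ACCEL_THRESHOLD
```

Only when this returns True would the terminal go on to detect the earphone's wearing state and capture the target image (step 302), which is how the scheme saves energy during steady viewing.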
Step 302, if the motion state of the terminal changes, detecting the wearing state of an earphone connected with the terminal, and acquiring a target image acquired by a camera of the terminal.
A user watching a video normally tends to keep the terminal in a relatively fixed motion state, for example holding it so that it stays roughly still; in that case the user generally does not want the playback state to change. Therefore, to reduce energy consumption, when no change in the terminal's motion state is detected, the detection of the earphone's wearing state and the acquisition of the camera's target image can be suspended. Only when a change in the motion state is detected are the wearing state of the connected earphone detected and the target image captured by the terminal's camera acquired.
In some embodiments of the present application, as shown in fig. 4, the detecting a wearing state of an earphone connected to the terminal may specifically include: step 401 to step 403.
Step 401, obtaining an earphone temperature value detected by a temperature sensor of the earphone.
The temperature sensor of the earphone is arranged on the inner side of the earphone (the side that contacts the user's ear).
And 402, judging whether the earphone temperature value is in a preset human body temperature threshold interval.
Step 403, if the earphone temperature value is outside the preset human body temperature threshold value interval, determining that the earphone connected with the terminal is detected to be in an unworn state.
In the embodiment of the present application, temperature values within the preset human body temperature threshold interval are close to the user's body temperature; for example, the interval may be (35 °C, 39 °C). When the earphone temperature value falls within this interval, the earphone is detecting a temperature close to body temperature, so it is most likely worn on the user's ear (normal wearing state). When the detected temperature falls outside the interval, it differs substantially from body temperature, so with high probability the earphone is not being worn (unworn state).
Therefore, in the embodiment of the application, whether the earphone temperature value lies within the preset human body temperature threshold interval can be judged, and when it lies outside the interval, the earphone connected to the terminal is determined to be in the unworn state. When the earphone temperature value lies within the interval, as shown in fig. 4, the judgment of step 402 may be followed by further steps: step 404 to step 407.
Step 404, if the earphone temperature value is within the preset human body temperature threshold interval, acquiring an environment temperature value detected by a temperature sensor of the terminal, and calculating a difference value between the earphone temperature value and the environment temperature value.
Step 405, if the difference is smaller than a preset difference threshold, obtaining a distance value detected by a distance sensor of the earphone and/or a light intensity value detected by a light sensor of the earphone.
Step 406, if the distance value is greater than a preset distance threshold or the light intensity value is greater than a preset light intensity threshold, determining that the headset connected with the terminal is detected to be in an unworn state.
Step 407, if the difference is greater than or equal to the preset difference threshold, or the distance value is less than or equal to the preset distance threshold and the light intensity value is less than or equal to the preset light intensity threshold, it is determined that the headset connected to the terminal is detected to be in a normal wearing state.
In the embodiment of the application, when the earphone temperature value is within the preset human body temperature threshold interval and its difference from the environment temperature value is smaller than the preset difference threshold, the ambient temperature is itself close to body temperature. The temperature detected by the earphone could then be either the ambient temperature or the human body temperature, so the wearing state cannot be determined from the temperature judgments alone. The distance value detected by the earphone's distance sensor and/or the light intensity value detected by its light sensor are therefore used to assist in determining the wearing state of the earphone connected to the terminal.
When the earphone is worn on the user's ear, both the distance value detected by the distance sensor and the light intensity value detected by the light sensor are small. Accordingly, when the distance value is greater than the preset distance threshold or the light intensity value is greater than the preset light intensity threshold, the earphone connected to the terminal is determined to be in an unworn state; when the distance value is less than or equal to the preset distance threshold and the light intensity value is less than or equal to the preset light intensity threshold, the earphone connected to the terminal is determined to be in a normal wearing state.
When the difference between the earphone temperature value and the ambient temperature value is greater than or equal to the preset difference threshold, the temperature of the environment where the terminal is located differs markedly from the temperature detected by the earphone. The detected earphone temperature is therefore not the ambient temperature but is close to human body temperature, so when the difference is greater than or equal to the preset difference threshold, the earphone connected to the terminal is determined to be in a normal wearing state.
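The determination logic of steps 405 to 407 can be sketched as follows. This is a minimal illustration only: the threshold constants and the function name are hypothetical and not taken from the patent, which leaves the concrete values to the implementation.

```python
# Hypothetical threshold values; a real device would tune these.
BODY_TEMP_RANGE = (35.0, 38.0)   # preset human body temperature interval (deg C)
DIFF_THRESHOLD = 2.0             # preset difference threshold (deg C)
DISTANCE_THRESHOLD = 5.0         # preset distance threshold (mm)
LIGHT_THRESHOLD = 10.0           # preset light intensity threshold (lux)

def earphone_worn(ear_temp, ambient_temp, distance, light):
    """Return True if the earphone is in a normal wearing state."""
    low, high = BODY_TEMP_RANGE
    # Temperature outside the body-temperature interval -> unworn.
    if not (low <= ear_temp <= high):
        return False
    # Temperature clearly differs from ambient -> the sensor is reading
    # body heat, so the earphone is worn (the first branch of step 407).
    if abs(ear_temp - ambient_temp) >= DIFF_THRESHOLD:
        return True
    # Ambiguous temperature: fall back to the distance/light sensors
    # (steps 405-406); either reading being large means unworn.
    if distance > DISTANCE_THRESHOLD or light > LIGHT_THRESHOLD:
        return False
    return True
```

The fallback sensors are consulted only in the ambiguous case where ambient temperature is near body temperature, mirroring the order of steps 404 to 407.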
It should be noted that, for simplicity of description, the foregoing method embodiments are presented as a series of combined acts; those skilled in the art will appreciate, however, that the present application is not limited by the order of the acts described, since some steps may be performed in other orders.
Fig. 5 shows a schematic structural diagram of a control apparatus 500 for video playing provided in an embodiment of the present application, which includes a detection unit 501, a recognition unit 502, and a control unit 503.
The detection unit is used for detecting the wearing state of an earphone connected with the terminal and acquiring a target image acquired by a camera of the terminal;
the recognition unit is used for carrying out face recognition on the target image to obtain a face recognition result of the target image;
and the control unit is used for respectively controlling the playing and the pausing of the video picture and the audio content of the terminal according to the wearing state of the earphone connected with the terminal and the face recognition result of the target image.
In some embodiments of the present application, the control unit 503 is further specifically configured to:
if the earphone connected with the terminal is in an unworn state and the target image contains a target face, controlling the terminal to play a video picture and controlling the terminal to pause playing of audio content corresponding to the video picture;
if the earphone connected with the terminal is in an unworn state and the target image does not contain a target face, controlling the terminal to pause playing of a video picture and controlling the terminal to pause playing of audio content corresponding to the video picture;
if the earphone connected with the terminal is in a normal wearing state and the target image contains a target face, controlling the terminal to play a video picture and controlling the terminal to play audio content corresponding to the video picture;
and if the earphone connected with the terminal is in a normal wearing state and the target image does not contain the target face, controlling the terminal to play audio content and controlling the terminal to pause playing of a video picture corresponding to the audio content.
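The four control branches above amount to a simple mapping from the pair (wearing state, face present) to the play/pause decision for the video picture and the audio content. A minimal sketch under that reading (the function name and tuple convention are illustrative, not from the patent):

```python
def playback_decision(earphone_worn: bool, face_present: bool):
    """Return (play_video, play_audio) according to the four branches above."""
    if not earphone_worn and face_present:
        return (True, False)    # play video, pause the corresponding audio
    if not earphone_worn and not face_present:
        return (False, False)   # pause both video picture and audio content
    if earphone_worn and face_present:
        return (True, True)     # play video picture and audio content
    return (False, True)        # worn but no face: audio only, pause video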
In some embodiments of the present application, the control unit 503 is further specifically configured to: and after the terminal is controlled to pause playing of the video picture, adjusting the display brightness of the terminal to be the lowest display brightness.
In some embodiments of the present application, the detecting unit 501 is further specifically configured to: detecting whether the motion state of the terminal is changed; and if the motion state of the terminal changes, detecting the wearing state of an earphone connected with the terminal, and acquiring a target image acquired by a camera of the terminal.
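The motion-state gate described above means the relatively expensive wearing-state and camera checks run only when the terminal's motion state changes, rather than continuously. A sketch of that gating, assuming a discretized motion state (class and method names are hypothetical):

```python
class MotionTriggeredDetector:
    """Run the (expensive) wearing-state + face-recognition check only
    when the terminal's motion state changes between samples."""

    def __init__(self, check_fn):
        self._last_state = None   # no motion state observed yet
        self._check_fn = check_fn # e.g. reads sensors and the camera
        self.checks_run = 0

    def on_motion_sample(self, motion_state):
        # Only a *change* in motion state triggers the detection step.
        if motion_state != self._last_state:
            self._last_state = motion_state
            self.checks_run += 1
            return self._check_fn()
        return None  # unchanged state: skip the expensive check
```

For example, feeding the samples "still", "still", "moving", "moving" would invoke the check twice, on the first sample and on the transition to "moving".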
In some embodiments of the present application, the detecting unit 501 is further specifically configured to:
acquiring an earphone temperature value detected by a temperature sensor of the earphone;
judging whether the earphone temperature value is within a preset human body temperature threshold interval or not;
and if the earphone temperature value is outside the preset human body temperature threshold value interval, determining that the earphone connected with the terminal is detected to be in an unworn state.
In some embodiments of the present application, the detecting unit 501 is further specifically configured to: after judging whether the earphone temperature value is in a preset human body temperature threshold interval or not, if the earphone temperature value is in the preset human body temperature threshold interval, acquiring an environment temperature value detected by a temperature sensor of the terminal, and calculating a difference value between the earphone temperature value and the environment temperature value; if the difference is smaller than a preset difference threshold value, acquiring a distance value detected by a distance sensor of the earphone and/or a light intensity value detected by a light sensor of the earphone; and if the distance value is greater than a preset distance threshold value or the light intensity value is greater than a preset light intensity threshold value, determining that the earphone connected with the terminal is detected to be in an unworn state.
In some embodiments of the present application, the detecting unit 501 is further specifically configured to: after the ambient temperature value detected by the temperature sensor of the terminal is obtained and the difference between the earphone temperature value and the ambient temperature value is calculated, if the difference is greater than or equal to the preset difference threshold value, or the distance value is less than or equal to the preset distance threshold value and the light intensity value is less than or equal to the preset light intensity threshold value, it is determined that the earphone connected with the terminal is detected to be in a normal wearing state.
For convenience and brevity of description, the specific working process of the control apparatus 500 for video playing described above may refer to the corresponding process of the method described in fig. 1 to fig. 4, and is not repeated herein.
As shown in fig. 6, the present application provides a terminal for implementing the above-mentioned video playing control method, where the terminal may include: a processor 61, a memory 62, one or more input devices 63 (only one shown in fig. 6), and one or more output devices 64 (only one shown in fig. 6). The processor 61, memory 62, input device 63 and output device 64 are connected by a bus 65.
It should be understood that, in this embodiment of the present application, the processor 61 may be a central processing unit (CPU), or may be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The input device 63 may include a virtual keyboard, a touch pad, a fingerprint sensor (for collecting fingerprint information of a user and direction information of the fingerprint), a microphone, etc., and the output device 64 may include a display, a speaker, etc.
The memory 62 may include a read-only memory and a random access memory, and provides instructions and data to the processor 61. Some or all of the memory 62 may also include non-volatile random access memory. For example, the memory 62 may also store device type information.
The memory 62 stores a computer program executable by the processor 61, for example a program implementing the video playing control method. When executing the computer program, the processor 61 implements the steps in the embodiments of the video playing control method, for example steps 101 to 103 shown in fig. 1. Alternatively, when executing the computer program, the processor 61 may implement the functions of the units in the apparatus embodiment, for example the functions of units 501 to 503 shown in fig. 5.
The computer program may be divided into one or more modules/units, which are stored in the memory 62 and executed by the processor 61 to implement the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, the instruction segments being used to describe the execution process of the computer program in the terminal for controlling video playing. For example, the computer program may be divided into a detection unit, a recognition unit, and a control unit, and each unit may specifically function as follows:
the detection unit is used for detecting the wearing state of an earphone connected with the terminal and acquiring a target image acquired by a camera of the terminal;
the recognition unit is used for carrying out face recognition on the target image to obtain a face recognition result of the target image;
and the control unit is used for respectively controlling the playing and the pausing of the video picture and the audio content of the terminal according to the wearing state of the earphone connected with the terminal and the face recognition result of the target image.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned functions may be distributed as different functional units and modules according to needs, that is, the internal structure of the apparatus may be divided into different functional units or modules to implement all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
An embodiment of the present application provides a computer program product which, when run on a terminal device, causes the terminal device to implement the steps of the video playing control method in the foregoing embodiments.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal are merely illustrative, and for example, the division of the above-described modules or units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units described above, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer readable storage medium. Based on this understanding, all or part of the flow of the methods in the above embodiments may be implemented by a computer program instructing related hardware; the computer program may be stored in a computer readable storage medium and, when executed by a processor, implements the steps of each of the method embodiments described above. The computer program includes computer program code, which may be in source code form, object code form, executable file form, or some intermediate form. The computer readable medium may include any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, a software distribution medium, and the like. It should be noted that the content contained in the computer readable medium may be appropriately increased or decreased as required by legislation and patent practice in the jurisdiction; for example, in some jurisdictions, computer readable media do not include electrical carrier signals and telecommunication signals.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (9)

1. A method for controlling video playback, comprising:
detecting the wearing state of an earphone connected with a terminal, and acquiring a target image acquired by a camera of the terminal;
carrying out face recognition on the target image to obtain a face recognition result of the target image;
respectively controlling the playing and pausing of the video picture and the audio content of the terminal according to the wearing state of an earphone connected with the terminal and the face recognition result of the target image;
the controlling the playing and the pausing of the video picture and the audio content of the terminal respectively according to the wearing state of the earphone connected with the terminal and the face recognition result of the target image comprises the following steps:
if the earphone connected with the terminal is not worn and the target image contains a target face, controlling the terminal to play a video picture and controlling the terminal to pause playing of audio content corresponding to the video picture;
if the earphone connected with the terminal is in an unworn state and the target image does not contain a target face, controlling the terminal to pause playing of a video picture and controlling the terminal to pause playing of audio content corresponding to the video picture;
if the earphone connected with the terminal is in a normal wearing state and the target image contains a target face, controlling the terminal to play a video picture and controlling the terminal to play audio content corresponding to the video picture;
and if the earphone connected with the terminal is in a normal wearing state and the target image does not contain the target face, controlling the terminal to play audio content and controlling the terminal to pause playing of a video picture corresponding to the audio content.
2. The control method according to claim 1, wherein after the controlling the terminal to pause playing of the video picture, the method comprises:
and adjusting the display brightness of the terminal to be the lowest display brightness.
3. The control method according to any one of claims 1 to 2, wherein the detecting a wearing state of an earphone connected to a terminal and acquiring a target image captured by a camera of the terminal includes:
detecting whether the motion state of the terminal is changed;
and if the motion state of the terminal changes, detecting the wearing state of an earphone connected with the terminal, and acquiring a target image acquired by a camera of the terminal.
4. The control method according to any one of claims 1 to 2, wherein the detecting of the wearing state of the headset connected to the terminal includes:
acquiring an earphone temperature value detected by a temperature sensor of the earphone;
judging whether the earphone temperature value is within a preset human body temperature threshold interval or not;
and if the earphone temperature value is outside the preset human body temperature threshold value interval, determining that the earphone connected with the terminal is detected to be in an unworn state.
5. The control method according to claim 4, wherein after the determining whether the earphone temperature value is within a preset human body temperature threshold interval, the method further comprises:
if the earphone temperature value is within the preset human body temperature threshold value interval, acquiring an environment temperature value detected by a temperature sensor of the terminal, and calculating a difference value between the earphone temperature value and the environment temperature value;
if the difference is smaller than a preset difference threshold value, acquiring a distance value detected by a distance sensor of the earphone and/or a light intensity value detected by a light sensor of the earphone;
and if the distance value is greater than a preset distance threshold value or the light intensity value is greater than a preset light intensity threshold value, determining that the earphone connected with the terminal is detected to be in an unworn state.
6. The control method according to claim 5, wherein after the obtaining an ambient temperature value detected by a temperature sensor of the terminal and calculating a difference between the earphone temperature value and the ambient temperature value, the method comprises:
if the difference is greater than or equal to the preset difference threshold, or the distance value is less than or equal to the preset distance threshold and the light intensity value is less than or equal to the preset light intensity threshold, it is determined that the earphone connected with the terminal is detected to be in a normal wearing state.
7. A control apparatus for video playback, comprising:
the detection unit is used for detecting the wearing state of an earphone connected with the terminal and acquiring a target image acquired by a camera of the terminal;
the recognition unit is used for carrying out face recognition on the target image to obtain a face recognition result of the target image;
the control unit is used for respectively controlling the playing and the pausing of the video picture and the audio content of the terminal according to the wearing state of an earphone connected with the terminal and the face recognition result of the target image;
the control unit is specifically configured to, when playing and pausing of the video picture and the audio content of the terminal is respectively controlled according to a wearing state of an earphone connected to the terminal and a face recognition result of the target image:
if the earphone connected with the terminal is in an unworn state and the target image does not contain a target face, controlling the terminal to pause playing of a video picture and controlling the terminal to pause playing of audio content corresponding to the video picture;
if the earphone connected with the terminal is in a normal wearing state and the target image contains a target face, controlling the terminal to play a video picture and controlling the terminal to play audio content corresponding to the video picture;
and if the earphone connected with the terminal is in a normal wearing state and the target image does not contain the target face, controlling the terminal to play audio content and controlling the terminal to pause playing of a video picture corresponding to the audio content.
8. A terminal comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 6 when executing the computer program.
9. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 6.
CN202010302812.7A 2020-04-16 2020-04-16 Video playing control method, device, terminal and computer readable storage medium Active CN111510785B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010302812.7A CN111510785B (en) 2020-04-16 2020-04-16 Video playing control method, device, terminal and computer readable storage medium


Publications (2)

Publication Number Publication Date
CN111510785A CN111510785A (en) 2020-08-07
CN111510785B true CN111510785B (en) 2022-01-28

Family

ID=71877510

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010302812.7A Active CN111510785B (en) 2020-04-16 2020-04-16 Video playing control method, device, terminal and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN111510785B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114257754A (en) * 2020-09-21 2022-03-29 Oppo广东移动通信有限公司 Video synthesis method, device, terminal and storage medium
CN112584083B (en) * 2020-11-02 2022-05-27 广州视源电子科技股份有限公司 Video playing method, system, electronic equipment and storage medium
TWI762257B (en) * 2021-03-29 2022-04-21 中華電信股份有限公司 Method for recommending viewing directions in virtyal reality video and computer readable medium thereof

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108319440A (en) * 2017-12-21 2018-07-24 维沃移动通信有限公司 Audio-frequency inputting method and mobile terminal
CN108683798A (en) * 2018-04-19 2018-10-19 维沃移动通信有限公司 A kind of sound output control method and mobile terminal

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180048750A1 (en) * 2012-06-15 2018-02-15 Muzik, Llc Audio/video wearable computer system with integrated projector
US9219967B2 (en) * 2013-11-25 2015-12-22 EchoStar Technologies, L.L.C. Multiuser audiovisual control
CN106020510B (en) * 2016-05-17 2019-05-03 Oppo广东移动通信有限公司 The control method and device of terminal
CN106161804A (en) * 2016-08-31 2016-11-23 维沃移动通信有限公司 A kind of audio play control method and mobile terminal
CN106791191A (en) * 2017-02-07 2017-05-31 广东小天才科技有限公司 A kind of control method for playing back and device
CN107205187B (en) * 2017-06-21 2019-10-01 深圳市冠旭电子股份有限公司 A kind of method, apparatus and terminal device based on earphone control music
CN107633853B (en) * 2017-08-03 2020-07-03 广东小天才科技有限公司 Control method for playing audio and video files and user terminal
CN107371058A (en) * 2017-08-04 2017-11-21 深圳市创维软件有限公司 A kind of player method, smart machine and the storage medium of multimedia file sound intermediate frequency data
CN108235084B (en) * 2018-02-09 2021-01-08 维沃移动通信有限公司 Video playing method and mobile terminal
CN110401806A (en) * 2019-06-21 2019-11-01 努比亚技术有限公司 A kind of video call method of mobile terminal, mobile terminal and storage medium

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108319440A (en) * 2017-12-21 2018-07-24 维沃移动通信有限公司 Audio-frequency inputting method and mobile terminal
CN108683798A (en) * 2018-04-19 2018-10-19 维沃移动通信有限公司 A kind of sound output control method and mobile terminal


Similar Documents

Publication Publication Date Title
CN111510785B (en) Video playing control method, device, terminal and computer readable storage medium
CN111050250B (en) Noise reduction method, device, equipment and storage medium
CN107908929B (en) Method and device for playing audio data
CN107656718A (en) A kind of audio signal direction propagation method, apparatus, terminal and storage medium
CN111402913A (en) Noise reduction method, device, equipment and storage medium
WO2020249025A1 (en) Identity information determining method and apparatus, and storage medium
CN112866576B (en) Image preview method, storage medium and display device
CN109982129B (en) Short video playing control method and device and storage medium
US20240005695A1 (en) Fingerprint Recognition Method and Electronic Device
WO2022028083A1 (en) Noise reduction method and apparatus for electronic device, storage medium and electronic device
CN109102811B (en) Audio fingerprint generation method and device and storage medium
CN111459363A (en) Information display method, device, equipment and storage medium
CN111432245B (en) Multimedia information playing control method, device, equipment and storage medium
CN111416996B (en) Multimedia file detection method, multimedia file playing device, multimedia file equipment and storage medium
CN110708630A (en) Method, device and equipment for controlling earphone and storage medium
CN109889858B (en) Information processing method and device for virtual article and computer readable storage medium
US11037519B2 (en) Display device having display based on detection value, program, and method of controlling device
CN112133319A (en) Audio generation method, device, equipment and storage medium
CN111931712A (en) Face recognition method and device, snapshot machine and system
CN109360577B (en) Method, apparatus, and storage medium for processing audio
CN109144461B (en) Sound production control method and device, electronic device and computer readable medium
CN110660032A (en) Object shielding method, object shielding device and electronic equipment
CN111050211A (en) Video processing method, device and storage medium
CN111711841B (en) Image frame playing method, device, terminal and storage medium
CN109344284B (en) Song file playing method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant