CN108073272A - Playback control method and device for a smart device - Google Patents

Playback control method and device for a smart device

Info

Publication number
CN108073272A
Authority
CN
China
Prior art keywords
playing
multimedia data
information
user
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201611031295.4A
Other languages
Chinese (zh)
Inventor
郭瑞
郭祥
雷宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Intelligent Housekeeper Technology Co Ltd
Original Assignee
Beijing Intelligent Housekeeper Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Intelligent Housekeeper Technology Co Ltd filed Critical Beijing Intelligent Housekeeper Technology Co Ltd
Priority to CN201611031295.4A priority Critical patent/CN108073272A/en
Publication of CN108073272A publication Critical patent/CN108073272A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174 Facial expression recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Multimedia (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The invention discloses a playback control method and device for a smart device. The method includes: acquiring sound information and/or facial expression information of a user while the smart device plays multimedia data; determining corresponding user reaction information according to the sound information and/or facial expression information; and, if the multimedia data and/or the user reaction information meet a preset interruption condition, adjusting the manner of playing the multimedia data and/or playing a corresponding user interaction statement according to the interruption condition. In this embodiment, the smart device adjusts the manner of playing the multimedia data and/or plays a corresponding user interaction statement according to the user reaction information acquired during playback, so that the smart device can interact with the user appropriately in the different scenarios it encounters while playing multimedia data.

Description

Playback control method and device for an intelligent device
Technical Field
Embodiments of the invention relate to artificial intelligence, and in particular to a playback control method and device for an intelligent device.
Background
Interactive media refers to media through whose platform the audience can interact with the media itself or with other audience members; that is, it enables communication between people and machines as well as between people. Compared with non-interactive media, interactive media greatly enhances interactivity with the audience. Interactive media is generally built on technologies such as modern networking, digital technology, and computer technology. Common examples include computers, computer networks, multimedia teaching platforms, and various interactive systems developed on them, such as interactive teaching platforms and interactive video learning systems.
Most multimedia playback devices can only play multimedia and cannot interact with people. Even where interaction is possible, it is dull and limited to a single mode, and deciding when to interact and what kind of interaction to perform requires manual operation.
Disclosure of Invention
Embodiments of the invention provide a playback control method and device for an intelligent device that can interact with a user in a timely manner.
In a first aspect, an embodiment of the present invention provides a play control method for an intelligent device, including: acquiring sound information and/or facial expression information of a user when the intelligent equipment plays multimedia data; determining corresponding user reaction information according to the sound information and/or the facial expression information; and if the multimedia data and/or the user response information meet a preset interruption condition, adjusting a mode for playing the multimedia data and/or playing a corresponding user interactive statement according to the interruption condition.
In a second aspect, an embodiment of the present invention further provides a play control apparatus for an intelligent device, including: the information acquisition unit is used for acquiring sound information and/or facial expression information of a user when the intelligent equipment plays multimedia data; the information processing unit is connected with the information acquisition unit and used for determining corresponding user reaction information according to the sound information and/or the facial expression information; and the adjusting unit is connected with the information processing unit and is used for adjusting the mode of playing the multimedia data and/or playing the corresponding user interactive statement according to the interruption condition if the multimedia data and/or the user response information meet the preset interruption condition.
According to the embodiments of the invention, the intelligent device adjusts the manner of playing the multimedia data and/or plays a corresponding user interaction sentence according to the user response information acquired while the multimedia data is played, so that the intelligent device can interact with the user appropriately in the different scenarios encountered during playback.
Drawings
Other features, objects and advantages of the invention will become more apparent upon reading of the detailed description of non-limiting embodiments made with reference to the following drawings:
fig. 1 is a flowchart of a play control method for a smart device according to a first embodiment of the present invention;
fig. 2 is a flowchart of a play control method for a smart device according to a second embodiment of the present invention;
fig. 3 is a schematic structural diagram of a play control apparatus for a smart device according to a third embodiment of the present invention;
fig. 4 is a schematic structural diagram of a play control apparatus for an intelligent device according to a fourth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should also be noted that, for the convenience of description, only some but not all of the matters related to the present invention are shown in the drawings. It should be further noted that, for convenience of description, examples related to the present invention are shown in the following embodiments, which are used only for illustrating the principles of the embodiments of the present invention and are not meant to limit the embodiments of the present invention, and the specific values of the examples may vary according to different application environments and parameters of the apparatus or the components.
Example one
Fig. 1 is a flowchart of a play control method for an intelligent device according to a first embodiment of the present invention. This embodiment is applicable to the case where the intelligent device performs human-computer interaction while playing multimedia content. The method may be executed by a play control apparatus of the intelligent device; the apparatus may be implemented in software and/or hardware and may be integrated in any intelligent device providing human-computer interaction, typically a user terminal device such as a mobile phone or a tablet computer. The method comprises the following steps:
s101, sound information and/or facial expression information of a user when the intelligent device plays multimedia data are obtained.
The intelligent device starts playing the multimedia data after confirming that the user is ready. The user information comprises the user's requirements, emotion, and attention, which can be acquired through voice inquiry, speech recognition, image recognition, and the like. For example, if image recognition shows that the user is far away or is not looking at the smart device, the user is currently not paying attention. The intelligent device then performs the corresponding interactive operation according to the acquired user information.
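Purely as an illustration of step S101, one way to collect these signals could look like the sketch below; the microphone and camera interfaces (record, capture) are hypothetical placeholders, since the embodiment does not prescribe any concrete capture API.

```python
import time

class UserInfoAcquirer:
    """Collects the raw sound and facial-image samples used in step S101."""

    def __init__(self, microphone, camera):
        self.microphone = microphone  # assumed to expose record(seconds) -> audio data
        self.camera = camera          # assumed to expose capture() -> image frame

    def sample(self, audio_seconds=2):
        # Sound information: a short audio clip of the user.
        sound = self.microphone.record(audio_seconds)
        # Facial expression information: a single frame of the user's face.
        frame = self.camera.capture()
        return {"sound": sound, "frame": frame, "timestamp": time.time()}
```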
And S102, determining corresponding user reaction information according to the sound information and/or the facial expression information.
The user's sound information and/or facial expression information reflects emotions such as joy, anger, sorrow, happiness, love, hate, surprise, fear, or worry, and the user's appeal is implicit in the words the user speaks. The user's emotion or appeal is determined from the sound information and/or facial expression information. For example, if the user does not answer when asked, or speech recognition detects snoring, it can be determined that the user is not currently paying attention. The intelligent device then performs human-computer interaction according to the determined emotion or appeal.
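A minimal sketch of step S102 under stated assumptions follows: recognize_emotion, transcribe, and face_present are hypothetical stand-ins for whatever speech and image recognition the device actually uses, and the inattention check mirrors the examples above (the user not facing the device, or snoring being detected).

```python
def determine_reaction(sample, recognize_emotion, transcribe, face_present):
    """Derive user reaction information (emotion, appeal, attention) from one sample."""
    emotion = recognize_emotion(sample["frame"])   # e.g. "happy", "bored", "sad"
    utterance = transcribe(sample["sound"])        # the user's words may carry an appeal
    # Signs of inattention mentioned in the description: the user is not facing
    # the device, or speech recognition picks up snoring.
    attentive = face_present(sample["frame"]) and "snoring" not in utterance
    return {"emotion": emotion, "utterance": utterance, "attentive": attentive}
```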
S103, if the multimedia data and/or the user response information meet a preset interruption condition, adjusting a mode for playing the multimedia data and/or playing a corresponding user interaction statement according to the interruption condition.
If the multimedia data and/or the user information meet a preset interruption condition, the intelligent device recognizes a corresponding interaction opportunity and then performs the corresponding interactive operation. For example, the user's requirements may be identified through the user's voice or expression and interaction carried out accordingly; the requirements vary. If the user asks about an element in the content, the user is interested in that element, and the interaction relates to it; if the user appears bored, the device asks whether to stop or change the playing content; if the user appears happy, the device queries the user and interacts according to the user's feedback.
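Tying steps S101 to S103 together, the overall control loop could be sketched as follows; the player interface and the callbacks are assumptions rather than part of the disclosure (determine_reaction may be the function sketched above with its recognizers bound in), and the interrupt handling itself is detailed in the second embodiment.

```python
def playback_control_loop(player, acquirer, determine_reaction, meets_interrupt,
                          handle_interrupt):
    """Run steps S101 to S103 repeatedly while the multimedia data is playing."""
    while player.is_playing():
        sample = acquirer.sample()                     # S101: sound / expression info
        reaction = determine_reaction(sample)          # S102: user reaction info
        condition = meets_interrupt(player, reaction)  # S103: preset interrupt check
        if condition is not None:
            # Adjust the manner of playing and/or play a user interaction statement.
            handle_interrupt(player, condition, reaction)
```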
In this embodiment, the intelligent device adjusts the manner of playing the multimedia data and/or plays a corresponding user interaction sentence according to the user response information obtained during playback, so that the intelligent device can interact with the user appropriately in the different scenarios encountered while playing the multimedia data.
Example two
Fig. 2 is a flowchart of a playing control method for an intelligent device according to a second embodiment of the present invention. The second embodiment builds on the embodiments above and refines, for different situations, how the manner of playing the multimedia data and/or the corresponding user interaction statement is adjusted according to the interruption condition.
The play control method for the smart device provided by this embodiment includes steps S201, S202, and S203. Step S201 is the same as step S101 in the first embodiment, and step S202 is the same as step S102 in the first embodiment, so the details of these steps are not repeated.
S201, sound information and/or facial expression information of a user when the intelligent device plays multimedia data are obtained.
S202, determining corresponding user reaction information according to the sound information and/or the facial expression information.
S203, if the interruption condition is that the multimedia data has been played to a preset plot, a preset event, or a preset time, stopping playing the multimedia data and playing a preset interactive statement; or, if the interruption condition is that the user response information indicates that the user is not paying attention to the multimedia data, stopping playing the multimedia data; or, if the interruption condition is that the user response information indicates that the user is paying attention to the multimedia data, stopping playing the multimedia data and playing a preset interactive statement; or, if the interruption condition is that the user response information indicates a preset emotional state, stopping playing the multimedia data and playing a preset interactive statement, where different preset interactive statements correspond to different emotional states.
The intelligent device performs different interactions at different interaction opportunities. The multimedia data to be played is preprocessed in advance: it is segmented according to plots, events, times, and the like, manually or by technical means, and an interactive statement is set for each segmentation point. When the multimedia data is played to a preset plot, event, or time, playback is paused and the preset interactive statement is played; the interactive statement may take a question-and-answer form. If it is determined that the user is not paying attention to the multimedia data, playback is stopped, the volume is reduced, the user is queried, and so on. If it is determined that the user is paying attention to the multimedia data, interaction with the user can follow preset logic. Interaction logic corresponding to emotions such as joy, anger, sorrow, happiness, love, hate, surprise, fear, and worry is set in advance; if the user's emotional state is determined, playback of the multimedia data is paused and the configured interaction logic is used to interact with the user.
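Purely to illustrate the four branches described above, the dispatch could be sketched as below; the breakpoint statements, query wording, and emotion-specific statements are assumed to have been configured in advance as the description requires, and the player and tts interfaces are hypothetical.

```python
def handle_interrupt(player, condition, reaction, tts, emotion_statements):
    """Dispatch on the interruption conditions of step S203."""
    player.pause()                                   # stop playing the multimedia data
    if condition["type"] == "breakpoint":
        # A preset plot, event or time was reached; each segmentation point
        # carries its own preconfigured interactive statement.
        tts.say(condition["statement"])
    elif condition["type"] == "inattentive":
        # The description also allows lowering the volume or querying the user.
        tts.say("Shall I stop, or would you like something else?")
    elif condition["type"] == "attentive":
        # Interact with the user using preset logic, e.g. in question-answer form.
        tts.say("You seem interested in this part. Would you like to know more?")
    elif condition["type"] == "emotion":
        # A separate interactive statement is preset for each emotional state.
        tts.say(emotion_statements[reaction["emotion"]])
```

In this sketch, meets_interrupt from the earlier loop would be the piece that builds the condition dictionary, e.g. {"type": "breakpoint", "statement": ...} when a segmentation point is reached.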
Different interaction scenarios can be configured as needed. Playing the multimedia data to a preset plot, event, or time, the user not paying attention to the multimedia data, the user paying attention to the multimedia data, and the user being determined to be in a preset emotional state belong to different interaction scenarios. If an interaction scenario is triggered while the multimedia data is being played, playback of the multimedia data is paused and the interactive operation for that scenario is executed.
Further, when the playing of the multimedia data is stopped, information of an interruption point of the multimedia data is recorded.
After the interactive operation is executed, if the multimedia data has not finished playing, playback of the multimedia data continues from the interruption point recorded when playback was paused.
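A minimal sketch of recording the interruption point and resuming from it, assuming a hypothetical player exposing position(), seek(), finished(), and the usual play and pause controls:

```python
class ResumablePlayback:
    """Wraps a player so playback can continue from the recorded interruption point."""

    def __init__(self, player):
        self.player = player
        self.interruption_point = None

    def interrupt(self):
        # Record the information of the interruption point when playback stops.
        self.interruption_point = self.player.position()
        self.player.pause()

    def resume_if_unfinished(self):
        # After the interactive operation, continue from the recorded point
        # if the multimedia data has not finished playing.
        if self.interruption_point is not None and not self.player.finished():
            self.player.seek(self.interruption_point)
            self.player.play()
            self.interruption_point = None
```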
Further, the playing the preset interactive sentence comprises: playing a preset interactive sub-sentence, and acquiring feedback information of a user to the interactive sub-sentence; and determining the next interactive sub-sentence to be played according to the feedback information and playing the next interactive sub-sentence until the playing ending condition of the interactive sentence is reached.
When the interaction starts, the preset interactive sub-sentence is played, and the corresponding next sub-sentence is played according to the user's feedback information until the user's intention is determined; operations such as playing the corresponding multimedia data or stopping playback are then executed according to that intention.
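One possible sketch of this sub-sentence exchange is given below; the keyword-based transition table is an assumption, since the description only requires that the user's feedback select the next sub-sentence until an end condition is reached.

```python
def run_interactive_statement(start, transitions, tts, listen, max_turns=5):
    """Play sub-sentences until the user's intention is determined.

    transitions maps a sub-sentence to {feedback keyword: next sub-sentence}.
    """
    current = start
    for _ in range(max_turns):                 # end condition: intent found or turn limit
        tts.say(current)                       # play the preset interactive sub-sentence
        feedback = listen()                    # acquire the user's feedback information
        options = transitions.get(current, {})
        current = next((s for key, s in options.items() if key in feedback), None)
        if current is None:                    # no further sub-sentence: intent determined
            return feedback                    # caller acts on the user's intention
    return None
```

The returned feedback would then drive the follow-up action, for example resuming playback from the recorded interruption point or stopping altogether.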
This embodiment performs the corresponding interactive operation for different interaction scenarios and learns the user's intention through interactive statements.
EXAMPLE III
Fig. 3 is a schematic structural diagram of a playing control apparatus for an intelligent device according to a third embodiment of the present invention, which is used for executing the playing control method for the intelligent device in the foregoing embodiments. The device includes: an information acquisition unit 301, an information processing unit 302, and an adjustment unit 303.
An information obtaining unit 301, configured to obtain sound information and/or facial expression information of a user when the smart device plays multimedia data.
And an information processing unit 302, connected to the information acquiring unit 301, for determining corresponding user response information according to the sound information and/or facial expression information.
An adjusting unit 303, connected to the information processing unit 302, configured to adjust a manner of playing the multimedia data and/or play a corresponding user interactive statement according to a preset interrupt condition if the multimedia data and/or the user response information meet the preset interrupt condition.
In this embodiment, the intelligent device adjusts a manner of playing the multimedia data and/or plays a corresponding user interaction sentence according to the user information obtained when the multimedia data is played, so that the intelligent device can perform corresponding interaction with the user according to different encountered situations when the multimedia data is played.
Further, the adjusting unit is specifically configured to: if the interruption condition is that the multimedia data has been played to a preset plot, a preset event, or a preset time, stop playing the multimedia data and play a preset interactive statement; or, if the interruption condition is that the user response information indicates that the user is not paying attention to the multimedia data, stop playing the multimedia data; or, if the interruption condition is that the user response information indicates that the user is paying attention to the multimedia data, stop playing the multimedia data and play a preset interactive statement; or, if the interruption condition is that the user response information indicates a preset emotional state, stop playing the multimedia data and play a preset interactive statement, where different preset interactive statements correspond to different emotional states.
This embodiment performs the corresponding interactive operation for different interaction scenarios and learns the user's intention through interactive statements.
Further, the adjusting unit is further specifically configured to: playing a preset interactive sub-sentence, and acquiring feedback information of a user to the interactive sub-sentence; and determining the next interactive sub-sentence to be played according to the feedback information and playing the next interactive sub-sentence until the playing ending condition of the interactive sentence is reached.
Further, the adjusting unit is further specifically configured to: and when the multimedia data stops playing, recording the information of the interruption point of the multimedia data.
The play control device for the intelligent device provided by the embodiment of the invention can be used for executing the play control method for the intelligent device provided by any embodiment of the invention, and has corresponding functions and beneficial effects for executing the method.
Example four
Fig. 4 is a schematic structural diagram of a play control apparatus for an intelligent device according to a fourth embodiment of the present invention. In fig. 4, the smart device 400 includes a display 401, a speaker 402, a camera 403, and a microphone 404. The present embodiment is an application of the technical solution of the above embodiment, and is an optional implementation.
The interaction process between the user and the intelligent device in this embodiment is as follows: while playing a lullaby to the user, the intelligent device captures a facial image of the user through the camera every ten minutes, compares the captured image with images prestored in an expression library, and, if the user is determined to be asleep, performs operations such as reducing the volume or stopping the lullaby.
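As a rough sketch of this embodiment only, the check could be structured as follows; looks_asleep stands in for the comparison against the prestored expression library, and the player and camera interfaces are assumptions.

```python
import time

def lullaby_loop(player, camera, looks_asleep, check_interval=600):
    """Check every ten minutes whether the user has fallen asleep to the lullaby."""
    while player.is_playing():
        time.sleep(check_interval)                     # ten minutes by default
        frame = camera.capture()
        if looks_asleep(frame):                        # compare with the expression library
            player.set_volume(player.volume() * 0.3)   # or player.stop()
            break
```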
While playing multimedia content, the intelligent device can automatically control playback according to the user's needs by acquiring user information and automatically executing the corresponding operations.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. A play control method for an intelligent device, comprising:
acquiring sound information and/or facial expression information of a user when the intelligent equipment plays multimedia data;
determining corresponding user reaction information according to the sound information and/or the facial expression information;
and if the multimedia data and/or the user response information meet a preset interruption condition, adjusting a mode for playing the multimedia data and/or playing a corresponding user interactive statement according to the interruption condition.
2. The playback control method for an intelligent device according to claim 1, wherein the adjusting the manner of playing the multimedia data and/or the playing of the corresponding user interaction sentence according to the interruption condition comprises:
and if the interruption condition is that the multimedia data is played to a preset plot, a preset event or a preset time, stopping playing the multimedia data and playing a preset interactive statement.
3. The playback control method for an intelligent device according to claim 1, wherein the adjusting the manner of playing the multimedia data and/or the playing of the corresponding user interaction sentence according to the interruption condition comprises:
if the interruption condition is that the user response information indicates that the user is not paying attention to the multimedia data, stopping playing the multimedia data; or,
if the interruption condition is that the user response information indicates that the user is paying attention to the multimedia data, stopping playing the multimedia data and playing a preset interactive statement; or,
and if the interruption condition is that the user response information indicates a preset emotional state, stopping playing the multimedia data and playing a preset interactive statement, wherein different preset interactive statements correspond to different emotional states.
4. The playback control method for a smart device according to claim 2 or 3, wherein the playing back the preset interactive sentence includes:
playing a preset interactive sub-sentence, and acquiring feedback information of a user to the interactive sub-sentence;
and determining the next interactive sub-sentence to be played according to the feedback information and playing the next interactive sub-sentence until the playing ending condition of the interactive sentence is reached.
5. The playback control method for an intelligent device according to claim 2 or 3, wherein the adjusting the manner of playing the multimedia data and/or the playing the corresponding user interaction sentence according to the interruption condition further comprises:
and when the multimedia data stops playing, recording the information of the interruption point of the multimedia data.
6. A playback control apparatus for an intelligent device, comprising:
the information acquisition unit is used for acquiring sound information and/or facial expression information of a user when the intelligent equipment plays multimedia data;
the information processing unit is connected with the information acquisition unit and used for determining corresponding user reaction information according to the sound information and/or the facial expression information;
and the adjusting unit is connected with the information processing unit and is used for adjusting the mode of playing the multimedia data and/or playing the corresponding user interactive statement according to the interruption condition if the multimedia data and/or the user response information meet the preset interruption condition.
7. The playback control apparatus for smart devices as claimed in claim 6, wherein the adjusting unit is specifically configured to:
and if the interruption condition is that the multimedia data is played to a preset plot, a preset event or a preset time, stopping playing the multimedia data and playing a preset interactive statement.
8. The playback control apparatus for smart devices as claimed in claim 6, wherein the adjusting unit is further configured to:
if the interruption condition is that the user response information indicates that the user is not paying attention to the multimedia data, stopping playing the multimedia data; or,
if the interruption condition is that the user response information indicates that the user is paying attention to the multimedia data, stopping playing the multimedia data and playing a preset interactive statement; or,
and if the interruption condition is that the user response information indicates a preset emotional state, stopping playing the multimedia data and playing a preset interactive statement, wherein different preset interactive statements correspond to different emotional states.
9. The playback control apparatus for an intelligent device according to claim 7 or 8, wherein the adjusting unit is further specifically configured to:
playing a preset interactive sub-sentence, and acquiring feedback information of a user to the interactive sub-sentence;
and determining the next interactive sub-sentence to be played according to the feedback information and playing the next interactive sub-sentence until the playing ending condition of the interactive sentence is reached.
10. The playback control apparatus for an intelligent device according to claim 7 or 8, wherein the adjusting unit is further specifically configured to:
and when the multimedia data stops playing, recording the information of the interruption point of the multimedia data.
CN201611031295.4A 2016-11-18 2016-11-18 Playback control method and device for a smart device Pending CN108073272A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611031295.4A CN108073272A (en) 2016-11-18 2016-11-18 Playback control method and device for a smart device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611031295.4A CN108073272A (en) 2016-11-18 2016-11-18 Playback control method and device for a smart device

Publications (1)

Publication Number Publication Date
CN108073272A true CN108073272A (en) 2018-05-25

Family

ID=62160829

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611031295.4A Pending CN108073272A (en) 2016-11-18 2016-11-18 A kind of control method for playing back and device for smart machine

Country Status (1)

Country Link
CN (1) CN108073272A (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101836219A (en) * 2007-11-01 2010-09-15 索尼爱立信移动通讯有限公司 Generating music playlist based on facial expression
CN102013176A (en) * 2010-12-01 2011-04-13 曹乃承 Online learning system
CN102355527A (en) * 2011-07-22 2012-02-15 深圳市无线开锋科技有限公司 Mood induction apparatus of mobile phone and method thereof
CN104582409A (en) * 2013-10-18 2015-04-29 常州市沿江通信器材厂 Heat dissipation device for network communication equipment
CN104298722A (en) * 2014-09-24 2015-01-21 张鸿勋 Multimedia interaction system and method
CN104869453A (en) * 2015-05-04 2015-08-26 小米科技有限责任公司 Video playing equipment control method and device
CN105096977A (en) * 2015-08-24 2015-11-25 来安县新元机电设备设计有限公司 Control method for multimedia playing and mobile terminal
CN105119582A (en) * 2015-09-02 2015-12-02 广东小天才科技有限公司 Method and device for automatically adjusting terminal sound
CN105872763A (en) * 2015-12-09 2016-08-17 乐视网信息技术(北京)股份有限公司 Device control method and device

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108845786A (en) * 2018-05-31 2018-11-20 北京智能管家科技有限公司 Intelligent reading partner method, apparatus, equipment and storage medium
CN108881978A (en) * 2018-06-29 2018-11-23 百度在线网络技术(北京)有限公司 Resource playing method and device for smart machine
CN108881978B (en) * 2018-06-29 2020-03-20 百度在线网络技术(北京)有限公司 Resource playing method and device for intelligent equipment
CN110297940A (en) * 2019-06-13 2019-10-01 百度在线网络技术(北京)有限公司 Play handling method, device, equipment and storage medium
CN110750161A (en) * 2019-10-25 2020-02-04 郑子龙 Interactive system, method, mobile device and computer readable medium

Similar Documents

Publication Publication Date Title
CN110634483B (en) Man-machine interaction method and device, electronic equipment and storage medium
JP6811758B2 (en) Voice interaction methods, devices, devices and storage media
CN107464555B (en) Method, computing device and medium for enhancing audio data including speech
CN110460872B (en) Information display method, device and equipment for live video and storage medium
US20210280181A1 (en) Information processing apparatus, information processing method, and program
US20130211826A1 (en) Audio Signals as Buffered Streams of Audio Signals and Metadata
US8504373B2 (en) Processing verbal feedback and updating digital video recorder (DVR) recording patterns
US10645464B2 (en) Eyes free entertainment
CN111050201B (en) Data processing method and device, electronic equipment and storage medium
CN108073272A (en) A kind of control method for playing back and device for smart machine
CN107403011B (en) Virtual reality environment language learning implementation method and automatic recording control method
WO2017084185A1 (en) Intelligent terminal control method and system based on semantic analysis, and intelligent terminal
KR102629552B1 (en) Automatically subtitle audio portions of content on computing devices
US11862153B1 (en) System for recognizing and responding to environmental noises
CN109994106A (en) A kind of method of speech processing and equipment
US11849181B2 (en) Systems and methods for applying behavioral-based parental controls for media assets
WO2021077528A1 (en) Method for interrupting human-machine conversation
CN113891150A (en) Video processing method, device and medium
Fernandes et al. A review of voice user interfaces for interactive TV
WO2019228140A1 (en) Instruction execution method and apparatus, storage medium, and electronic device
CN109977411A (en) A kind of data processing method, device and electronic equipment
CN110196900A (en) Exchange method and device for terminal
CN115460357A (en) Method, device and equipment for intelligently assisting blind person to browse video and storage medium
JP7053693B2 (en) How to end voice skills, devices, devices and storage media
JP7248615B2 (en) Output device, output method and output program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20180525