CN111261158A - Function menu customization method, voice shortcut control method and robot - Google Patents


Info

Publication number: CN111261158A
Application number: CN202010043201.5A
Authority: CN (China)
Other languages: Chinese (zh)
Inventors: 王勇斌, 杜莉莉
Current assignee: Shanghai Siyixuan Robot Technology Co., Ltd.
Original assignee: Shanghai Siyixuan Robot Technology Co., Ltd.
Priority date / Filing date: 2020-01-15
Publication date: 2020-06-09
Prior art keywords: robot, voice, function menu, function, menu interface
Legal status: Pending (assumed; not a legal conclusion)
Application filed by Shanghai Siyixuan Robot Technology Co., Ltd.


Classifications

    • G: PHYSICS
    • G10: MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L: SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L 15/00: Speech recognition
    • G10L 15/22: Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G10L 2015/223: Execution procedure of a spoken command

Landscapes

  • Engineering & Computer Science (AREA)
  • Computational Linguistics (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a function menu customization method, a voice shortcut control method and a robot, which are intended to meet the personalized voice shortcut control needs of different customers. The intelligent household robot has a microphone and a display screen, and the function menu customization method comprises the following steps: when a menu editing request sent by a user is received, displaying the function menu interface currently stored in the intelligent household robot on the display screen; setting a user-defined voice shortcut for each function in the function menu interface according to user operation; and saving the modified function menu interface.

Description

Function menu customization method, voice shortcut control method and robot
Technical Field
The invention relates to the technical field of artificial intelligence, and in particular to a function menu customization method, a voice shortcut control method and a robot.
Background
A smart home uses the home as a platform and integrates home-related facilities by means of integrated wiring, network communication, security, automatic control, and audio/video technologies, building an efficient management system for household facilities and family affairs. This improves the safety, convenience, comfort and aesthetics of the home and provides an environmentally friendly, energy-saving living environment.
The intelligent household robot is a control system within the smart home: the robot and a home gateway are linked through the Internet, and a user issues commands to the robot by voice, which in turn controls the various smart home devices in the household through this linkage.
However, existing smart home robots generally execute functions according to voice shortcuts (i.e., voice commands) uniformly set by the manufacturer. Since the language habits and actual needs of different customers may differ, such fixed shortcuts make it difficult to meet the personalized needs of different customers.
Disclosure of Invention
In view of the above, the invention provides a function menu customization method, a voice shortcut control method and a robot, so as to meet personalized voice shortcut control requirements of different customers.
A function menu customization method is applied to an intelligent household robot that has a microphone and a display screen, and the function menu customization method comprises the following steps:
displaying a function menu interface currently stored in the intelligent household robot on the display screen when a menu editing request sent by a user is received;
setting a user-defined voice shortcut for each function in the function menu interface according to user operation;
and saving the modified function menu interface.
Optionally, in the modified function menu interface, the same voice shortcut corresponds to one or more functions in the function menu interface.
Optionally, in the modified function menu interface, each function corresponds to one or more voice shortcuts.
A voice shortcut control method is applied to an intelligent household robot in which a function menu interface generated according to any one of the disclosed function menu customization methods is stored; the voice shortcut control method comprises the following steps:
capturing a voice shortcut recorded by a user by using a microphone;
judging whether the currently recorded voice shortcut is one of the voice shortcuts of the intelligent household robot;
if yes, starting a function corresponding to the currently recorded voice control instruction.
Optionally, the intelligent household robot further comprises a camera;
the robot body of the intelligent household robot comprises a head, a neck and a movable chassis, and the camera is mounted on the head;
when the voice shortcut is the first preset content, starting a function corresponding to the currently recorded voice control instruction, including: the shooting angle aligned with the camera is adjusted by controlling the movement of the movable chassis and the nodding and shaking of the head by controlling the neck, and the shot content is output and displayed by the display.
An intelligent household robot comprises a robot body and a microcontroller;
the robot body is provided with a microphone and a display screen;
the microcontroller comprises:
the calling unit is used for displaying a function menu interface currently stored in the intelligent household robot on the display screen when a menu editing request sent by a user is received;
the human-computer interaction unit is used for setting a user-defined voice shortcut for each function in the function menu interface according to user operation;
and the storage unit is used for storing the modified function menu interface.
Optionally, in the modified function menu interface stored in the storage unit, the same voice shortcut corresponds to one or more functions in the function menu interface.
Optionally, in the modified function menu interface stored in the storage unit, each function corresponds to one or more voice shortcuts.
Optionally, the microcontroller further includes:
the voice recording unit is used for capturing a voice shortcut recorded by a user with the microphone;
the judging unit is used for judging whether the currently recorded voice shortcut is one of the voice shortcuts of the intelligent home robot;
and the execution unit is used for starting a function corresponding to the currently recorded voice control instruction when the currently recorded voice shortcut is judged to be one of the voice shortcuts of the intelligent home robot.
Optionally, the robot body further has a camera;
the robot body comprises a head, a neck and a movable chassis, and the camera is mounted on the head;
when the voice shortcut is the first preset content, the execution unit is specifically used for adjusting the shooting angle aligned with the camera by controlling the movement of the mobile chassis and controlling the neck to realize the nodding and shaking of the head when judging that the currently recorded voice shortcut is one of the voice shortcuts of the intelligent home robot, and outputting and displaying the shot content through the display.
According to the technical scheme, a program is built into the intelligent home robot in advance, so that when a menu editing request sent by a user is received, the function menu interface currently stored in the robot can be displayed on the display screen, and a user-defined voice shortcut can be set for each function in the function menu interface according to user operation. Different customers can therefore define voice shortcuts that match their own language habits and actual needs, which meets the personalized voice shortcut control requirements of different customers and makes human-computer interaction more intelligent.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and that those skilled in the art can obtain other drawings from these drawings without creative effort.
FIG. 1 is a flow chart of a method for customizing a function menu according to an embodiment of the present invention;
fig. 2 is a schematic view of an application scenario of an intelligent home disclosed in an embodiment of the present invention;
FIG. 3 is a flowchart of a voice shortcut control method disclosed in the embodiment of the present invention;
fig. 4 is a schematic structural diagram of a microcontroller in an intelligent home robot disclosed in an embodiment of the present invention;
fig. 5 is a schematic structural diagram of a microcontroller in another smart home robot disclosed in the embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, an embodiment of the present invention discloses a method for customizing a function menu, which is applied to an intelligent home robot, wherein the intelligent home robot has a microphone and a display screen, and the method for customizing the function menu includes:
step S01: and judging whether a menu editing request sent by a user is received, if so, entering the step S02, otherwise, returning to the step S01.
Step S02: and displaying a function menu interface currently stored in the intelligent household robot on the display screen.
Step S03: and setting a user-defined voice shortcut for each function in the function menu interface according to user operation.
Step S04: and storing the modified function menu interface, wherein various functions and voice shortcuts corresponding to the various functions are recorded.
Specifically, each function of the smart home robot is listed on the function menu interface stored in the robot, for example: dialing a video call to dad, turning on the hall lamp, turning on the air conditioner, closing the living-room curtain, turning on the television, and so on. Each function listed on the function menu interface has a corresponding voice shortcut. When the smart home robot is started for the first time, the voice shortcuts corresponding to the functions are those uniformly set by the manufacturer; however, the language habits and actual needs of different customers may differ, so, in order to meet the individual requirements of different customers, the smart home robot allows the user to define the voice shortcut of each function.
When a user wants to customize the voice shortcuts of the functions in the function menu interface currently stored in the smart home robot, the user sends a menu editing request to the robot. The smart home robot responds to the menu editing request and enters a menu editing mode; in this mode it displays the currently stored function menu interface on the display screen, sets a user-defined voice shortcut for each function in the function menu interface according to user operation, and then saves the modified function menu interface. For example, the original voice shortcut for the function of turning on the hall lamp is 'turn on the lamp'; the user can redefine it as 'I get home', after which the user only needs to say 'I get home' to the smart home robot and the robot will automatically turn on the hall lamp at home.
The user may send the menu editing request to the smart home robot by voice control, for example by saying 'edit function menu' into the microphone to make the robot enter the menu editing mode; or by remote control, for example by pressing the 'menu' key on a remote controller; or by touch-screen control, for example by tapping the 'menu' button on the display screen.
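Purely for illustration, the flow of steps S01 to S04 above can be sketched as follows; the class, file name and example phrases are hypothetical and are not part of the original disclosure.

```python
import json

class FunctionMenu:
    """Hypothetical in-memory model of the function menu interface."""

    def __init__(self, path="function_menu.json"):
        self.path = path
        # Each function name maps to the list of voice shortcuts that trigger it.
        self.menu = {
            "turn on the hall lamp": ["turn on the lamp"],
            "dial a video call to dad": ["call dad"],
            "turn on the television": ["turn on the TV"],
        }

    def display(self):
        """Step S02: show the currently stored function menu interface."""
        for function, shortcuts in self.menu.items():
            print(f"{function}: {', '.join(shortcuts)}")

    def set_shortcut(self, function, shortcut):
        """Step S03: add a user-defined voice shortcut for a function."""
        self.menu.setdefault(function, []).append(shortcut)

    def save(self):
        """Step S04: persist the modified function menu interface."""
        with open(self.path, "w", encoding="utf-8") as f:
            json.dump(self.menu, f, ensure_ascii=False, indent=2)


def handle_edit_request(menu, edits):
    """Steps S01/S02: on a menu editing request, display the menu, apply edits, save."""
    menu.display()
    for function, shortcut in edits:
        menu.set_shortcut(function, shortcut)
    menu.save()


if __name__ == "__main__":
    handle_edit_request(FunctionMenu(), [("turn on the hall lamp", "I get home")])
```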
Optionally, in the modified function menu interface, the same voice shortcut corresponds to one or more functions in the function menu interface.
Specifically, when the same voice shortcut corresponds to multiple functions in the function menu interface, the user can trigger all of those functions at the same time simply by speaking that one phrase to the smart home robot, which suits scenarios where multiple smart home devices need to be controlled to act simultaneously. For example, the user may set the voice shortcut corresponding to the functions 'turn on the hall lamp', 'close the living-room curtain' and 'turn on the television' to 'I go home'; the user then only needs to say 'I go home' into the microphone 3 of the smart home robot, and the robot will automatically turn on the hall lamp 32 in the home, close the curtain in the living room 31, and turn on the television 33, as shown in fig. 2.
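As an informal sketch (all names hypothetical, not taken from the disclosure), mapping one voice shortcut to several functions and dispatching them together could look like this:

```python
# One voice shortcut triggering several functions at once (hypothetical sketch).
shortcut_to_functions = {
    "I go home": [
        "turn on the hall lamp",
        "close the living-room curtain",
        "turn on the television",
    ],
}

def run(function_name):
    # Placeholder for the real command sent to the smart home device.
    print(f"executing: {function_name}")

def on_shortcut(phrase):
    for function_name in shortcut_to_functions.get(phrase, []):
        run(function_name)

on_shortcut("I go home")  # turns on the lamp, closes the curtain, turns on the TV
```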
Optionally, based on any of the embodiments disclosed above, in the modified function menu interface, each function corresponds to one or more voice shortcuts.
Specifically, when one function corresponds to multiple voice shortcuts, the user can open that function by speaking any one of the several different phrases to the smart home robot. For example, in a family of three, dad, mom and the child may each prefer a different phrase for the function of turning on the television, so the voice shortcut corresponding to turning on the television can be set simultaneously to dad's habitual phrase 'I get home', mom's habitual phrase 'TV play' and the child's habitual phrase 'animation'; the user then only needs to say 'I get home', 'TV play' or 'animation' into the microphone of the smart home robot, and the robot will automatically turn on the television at home.
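Conversely, several shortcuts for one function reduce to a simple reverse lookup; the sketch below is illustrative only and its names are hypothetical.

```python
# Several voice shortcuts (one per family member) mapped to a single function.
function_to_shortcuts = {
    "turn on the television": ["I get home", "TV play", "animation"],
}

# Reverse index consulted at recognition time: phrase -> function.
shortcut_to_function = {
    phrase: function
    for function, phrases in function_to_shortcuts.items()
    for phrase in phrases
}

print(shortcut_to_function.get("animation"))  # -> turn on the television
```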
Referring to fig. 3, an embodiment of the present invention further discloses a voice shortcut control method, which is applied to an intelligent home robot, where a function menu interface generated according to any one of the disclosed function menu customization methods is stored in the intelligent home robot, and the voice shortcut control method includes:
step S101: capturing a voice shortcut recorded by a user by using a microphone;
step S102: judging whether the currently recorded voice shortcut is one of the voice shortcuts of the intelligent household robot; if yes, the process proceeds to step S103, and if no, the process returns to step S101.
Step S103: start the function corresponding to the currently recorded voice control instruction.
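A minimal sketch of the control loop in steps S101 to S103 is shown below; speech capture and recognition are abstracted behind a stub that returns the recognized phrase as text, and every name in the sketch is hypothetical.

```python
# Hypothetical sketch of steps S101-S103; the microphone capture and speech
# recognition are reduced to a stub that returns the recognized phrase as text.

shortcut_to_function = {
    "I get home": "turn on the television",
    "turn on the lamp": "turn on the hall lamp",
}

def recognize_from_microphone():
    # Stand-in for capturing a voice shortcut with the microphone (step S101).
    return input("say something: ")

def start_function(function_name):
    # Stand-in for actually starting the corresponding function (step S103).
    print(f"starting: {function_name}")

def control_loop():
    while True:
        phrase = recognize_from_microphone()        # step S101
        function_name = shortcut_to_function.get(phrase)
        if function_name is None:                   # step S102: not a stored shortcut
            continue                                # return to step S101
        start_function(function_name)               # step S103

if __name__ == "__main__":
    control_loop()
```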
Optionally, in any of the embodiments disclosed above, for example, as shown in fig. 2, the smart home robot further includes a camera 101;
the robot body of the intelligent household robot comprises a head 100, a neck 200 and a movable chassis 300, wherein a camera 101 is installed on the head 100;
when the voice shortcut is a first preset content (such as 'shooting'), the starting of the function corresponding to the currently recorded voice control instruction comprises the following steps: the shooting angle of the alignment of the camera 3 is adjusted by controlling the movement of the moving chassis 11 and the nodding and shaking of the head 100 through controlling the neck 200, the camera 3 is controlled to start shooting when the camera 3 is aligned with a shooting target, and the shooting content is output and displayed through the display.
For example, when a mother is accompanying her child in the living room, she can say 'camera shooting' to the smart home robot, and the robot adjusts the camera 101 to track and film the person according to the direction and speed of the person's movement, so that a precious parent-child moment is recorded and can be looked back on later.
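For concreteness only, a control skeleton for this shooting shortcut might look as follows; the hardware interface is a stand-in, since the disclosure does not specify any API.

```python
# Hypothetical control skeleton for the "shooting" voice shortcut: aim the
# camera 101 by moving the chassis 300 and driving the neck 200, then record.

class RobotBody:
    def move_chassis(self, dx, dy):
        print(f"mobile chassis 300: move by ({dx}, {dy}) m")

    def nod_head(self, pitch_deg):
        print(f"neck 200: pitch head 100 to {pitch_deg} deg")

    def shake_head(self, yaw_deg):
        print(f"neck 200: yaw head 100 to {yaw_deg} deg")

    def start_recording(self):
        print("camera 101: recording; frames shown on the display screen")


def track_and_shoot(body, dx, dy, pitch_deg, yaw_deg):
    """Aim the camera at the shooting target, then start recording."""
    body.move_chassis(dx, dy)
    body.nod_head(pitch_deg)
    body.shake_head(yaw_deg)
    body.start_recording()


track_and_shoot(RobotBody(), dx=0.5, dy=0.0, pitch_deg=-10, yaw_deg=15)
```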
To ensure that the robot body can move without constraint within the patrol and inspection area, the mobile chassis 300 may be a wheeled drive chassis. More specifically, the wheeled drive chassis is differentially driven by two driving wheels and supported by two universal wheels. The differential drive of the two driving wheels gives the robot body sufficient flexibility when moving forward and turning, so that it can move at any angle; both universal wheels are fitted with damping springs, which balance the weight distribution of the robot body and provide good cushioning while the robot body is moving, ensuring that it moves smoothly and safely. Alternatively, the mobile chassis 300 may adopt a biped walking mechanism, the working principle of which is not described here.
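The differential drive mentioned above follows standard two-wheel kinematics; the short calculation below is illustrative and is not taken from the disclosure.

```python
# Standard differential-drive kinematics: wheel speeds for a desired forward
# speed v (m/s) and turn rate omega (rad/s), given the distance between the
# two driving wheels (m).

def wheel_speeds(v, omega, wheel_base):
    v_left = v - omega * wheel_base / 2.0
    v_right = v + omega * wheel_base / 2.0
    return v_left, v_right

print(wheel_speeds(0.3, 0.0, 0.4))  # straight ahead: both wheels at 0.3 m/s
print(wheel_speeds(0.0, 1.0, 0.4))  # turn in place: (-0.2, 0.2), so any heading is reachable
```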
Corresponding to the embodiment of the method, the embodiment of the invention also discloses an intelligent household robot, which comprises a robot body and a microcontroller;
the robot body is provided with a microphone and a display screen;
as shown in fig. 4, the microcontroller includes:
the invoking unit 91 is configured to display a function menu interface currently stored in the smart home robot on the display screen when receiving a menu editing request sent by a user;
a human-computer interaction unit 92, configured to set a user-defined voice shortcut for each function in the function menu interface according to a user operation;
and a storage unit 93 for saving the modified function menu interface.
Optionally, in the modified function menu interface stored in the storage unit 93, the same voice shortcut corresponds to one or more functions in the function menu interface.
Optionally, in any of the embodiments of the smart home robot disclosed above, in the modified function menu interface stored in the storage unit 93, each function corresponds to one or more voice shortcuts.
Optionally, as shown in fig. 5, in any of the embodiments of the smart home robot disclosed above, the microcontroller further includes:
a voice recording unit 94 for capturing a voice shortcut recorded by the user with a microphone;
a judging unit 95, configured to judge whether a currently recorded voice shortcut is one of the voice shortcuts of the smart home robot;
and the execution unit 96 is used for starting a function corresponding to the currently recorded voice control instruction when the currently recorded voice shortcut is judged to be one of the voice shortcuts of the intelligent home robot.
Optionally, the robot body further has a camera;
the robot body comprises a head, a neck and a movable chassis, and the camera is mounted on the head;
when the voice shortcut is the first preset content, the execution unit 96 is specifically configured to adjust a shooting angle aligned with the camera by controlling the movement of the mobile chassis and controlling the neck to realize nodding and shaking of the head when the currently recorded voice shortcut is judged to be one of the voice shortcuts of the smart home robot, and output and display the shooting content through the display.
In summary, the invention builds a program into the intelligent home robot in advance, so that when a menu editing request sent by a user is received, the function menu interface currently stored in the robot can be displayed on the display screen, and a user-defined voice shortcut can be set for each function in the function menu interface according to user operation. Different customers can therefore define voice shortcuts that match their own language habits and actual needs, which meets the personalized voice shortcut control requirements of different customers and makes human-computer interaction more intelligent.
The embodiments in the present description are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. For the microcontroller disclosed in the embodiment, since it corresponds to the method disclosed in the embodiment, the description is simple, and the relevant points can be referred to the description of the method part.
In this document, relational terms such as first and second may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the wording "comprises a" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises that element.
Those of skill in the art would understand that information, messages, and signals may be represented using any of a variety of different technologies and techniques. For example, the messages and information mentioned in the above description can be represented by voltages, currents, electromagnetic waves, magnetic fields or particles, optical fields or any combination thereof.
Those of skill would further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative components and steps have been described above generally in terms of their functionality in order to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in Random Access Memory (RAM), memory, Read Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
For the system embodiment, since it basically corresponds to the method embodiment, the description is relatively simple, and for the relevant points, reference may be made to the partial description of the method embodiment. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the embodiments. Thus, the present embodiments are not intended to be limited to the embodiments shown herein but are to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A function menu customization method, applied to an intelligent household robot having a microphone and a display screen, the function menu customization method comprising the following steps:
displaying a function menu interface currently stored in the intelligent household robot on the display screen when a menu editing request sent by a user is received;
setting a user-defined voice shortcut for each function in the function menu interface according to user operation;
and saving the modified function menu interface.
2. The method of claim 1, wherein in the modified function menu interface, a same voice shortcut corresponds to one or more functions in the function menu interface.
3. The method of claim 1 or 2, wherein each function corresponds to one or more voice shortcuts in the modified function menu interface.
4. A voice shortcut control method, applied to an intelligent home robot in which a function menu interface generated according to the function menu customization method of claim 1, 2 or 3 is stored, the voice shortcut control method comprising the following steps:
capturing a voice shortcut recorded by a user by using a microphone;
judging whether the currently recorded voice shortcut is one of the voice shortcuts of the intelligent household robot;
if yes, starting a function corresponding to the currently recorded voice control instruction.
5. The voice shortcut control method according to claim 4, wherein the smart home robot further has a camera;
the robot body of the intelligent household robot comprises a head, a neck and a movable chassis, and the camera is mounted on the head;
when the voice shortcut is the first preset content, starting a function corresponding to the currently recorded voice control instruction, including: the shooting angle aligned with the camera is adjusted by controlling the movement of the movable chassis and the nodding and shaking of the head by controlling the neck, and the shot content is output and displayed by the display.
6. A smart home robot, characterized in that it comprises a robot body and a microcontroller, wherein:
the robot body is provided with a microphone and a display screen;
the microcontroller comprises:
the calling unit is used for displaying a function menu interface currently stored in the intelligent household robot on the display screen when a menu editing request sent by a user is received;
the human-computer interaction unit is used for setting a user-defined voice shortcut for each function in the function menu interface according to user operation;
and the storage unit is used for storing the modified function menu interface.
7. The smart home robot of claim 6, wherein the same voice shortcut corresponds to one or more functions in the modified function menu interface stored in the storage unit.
8. The smart home robot of claim 6 or 7, wherein each function corresponds to one or more voice shortcuts in the modified function menu interface stored in the storage unit.
9. The smart home robot of claim 6, wherein the microcontroller further comprises:
the voice recording unit is used for capturing a voice shortcut recorded by a user with the microphone;
the judging unit is used for judging whether the currently recorded voice shortcut is one of the voice shortcuts of the intelligent home robot;
and the execution unit is used for starting a function corresponding to the currently recorded voice control instruction when the currently recorded voice shortcut is judged to be one of the voice shortcuts of the intelligent home robot.
10. The smart home robot of claim 9, wherein the robot body further has a camera;
the robot body comprises a head, a neck and a movable chassis;
the camera is mounted at the head of the robot body;
when the voice shortcut is the first preset content, the execution unit is specifically used for adjusting the shooting angle aligned with the camera by controlling the movement of the mobile chassis and controlling the neck to realize the nodding and shaking of the head when judging that the currently recorded voice shortcut is one of the voice shortcuts of the intelligent home robot, and outputting and displaying the shot content through the display.
CN202010043201.5A (priority date 2020-01-15, filing date 2020-01-15): Function menu customization method, voice shortcut control method and robot. Status: Pending. Published as CN111261158A (en).

Priority Applications (1)

Application number: CN202010043201.5A; priority date: 2020-01-15; filing date: 2020-01-15; title: Function menu customization method, voice shortcut control method and robot

Applications Claiming Priority (1)

Application number: CN202010043201.5A; priority date: 2020-01-15; filing date: 2020-01-15; title: Function menu customization method, voice shortcut control method and robot

Publications (1)

Publication number: CN111261158A (zh); publication date: 2020-06-09

Family

ID=70948923

Family Applications (1)

Application number: CN202010043201.5A; status: Pending; publication: CN111261158A (en)

Country Status (1)

Country Link
CN (1) CN111261158A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115580676A (en) * 2022-08-26 2023-01-06 珠海格力电器股份有限公司 Equipment control method and device, storage medium and intelligent terminal

Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1983160A (en) * 2005-12-13 2007-06-20 台达电子工业股份有限公司 Module and its method for self-setting acoustically-controlled fast mode of user
CN103632669A (en) * 2012-08-20 2014-03-12 上海闻通信息科技有限公司 A method for a voice control remote controller and a voice remote controller
CN104575504A (en) * 2014-12-24 2015-04-29 上海师范大学 Method for personalized television voice wake-up by voiceprint and voice identification
CN105488032A (en) * 2015-12-31 2016-04-13 杭州智蚁科技有限公司 Speech recognition input control method and system
CN106550132A (en) * 2016-10-25 2017-03-29 努比亚技术有限公司 A kind of mobile terminal and its control method
CN106847281A (en) * 2017-02-26 2017-06-13 上海新柏石智能科技股份有限公司 Intelligent household voice control system and method based on voice fuzzy identification technology
CN206833230U (en) * 2017-04-18 2018-01-02 青岛有屋科技有限公司 A kind of Intelligent household voice control system of achievable man-machine interaction
CN108521355A (en) * 2018-02-27 2018-09-11 青岛海尔科技有限公司 Method, intelligent terminal, household appliance and the device of self-defined voice control device
CN108597510A (en) * 2018-04-11 2018-09-28 上海思依暄机器人科技股份有限公司 a kind of data processing method and device
CN108683574A (en) * 2018-04-13 2018-10-19 青岛海信智慧家居系统股份有限公司 A kind of apparatus control method, server and intelligent domestic system
CN108768800A (en) * 2018-05-03 2018-11-06 上海思依暄机器人科技股份有限公司 A kind of intelligent gateway
CN108831469A (en) * 2018-08-06 2018-11-16 珠海格力电器股份有限公司 Voice command customizing method, device and equipment and computer storage medium
CN109584875A (en) * 2018-12-24 2019-04-05 珠海格力电器股份有限公司 Voice equipment control method and device, storage medium and voice equipment
CN109683771A (en) * 2018-12-19 2019-04-26 努比亚技术有限公司 Three-dimensional touch menu configuration method, mobile terminal and computer readable storage medium
CN109754804A (en) * 2019-02-21 2019-05-14 珠海格力电器股份有限公司 Voice control method and device, storage medium and intelligent home system
CN109976616A (en) * 2019-03-29 2019-07-05 维沃移动通信有限公司 A kind of shortcut generation method and mobile terminal
CN110086996A (en) * 2019-05-17 2019-08-02 深圳创维-Rgb电子有限公司 A kind of automatic photographing method based on TV, TV and storage medium
CN110136704A (en) * 2019-04-03 2019-08-16 北京石头世纪科技股份有限公司 Robot voice control method and device, robot and medium
CN110462647A (en) * 2017-03-27 2019-11-15 三星电子株式会社 The method of the function of electronic equipment and execution electronic equipment



Legal Events

Code: PB01; Title: Publication
Code: SE01; Title: Entry into force of request for substantive examination
Code: RJ01; Title: Rejection of invention patent application after publication; Description: Application publication date: 2020-06-09