CN112890681A - Voice control method, Bluetooth headset, cleaning robot and control system

Voice control method, Bluetooth headset, cleaning robot and control system

Info

Publication number: CN112890681A
Application number: CN201911137367.7A
Authority: CN (China)
Prior art keywords: cleaning robot, navigation message, bluetooth headset, navigation, address
Legal status: Pending
Other languages: Chinese (zh)
Inventor: 许登科
Current Assignee: Zhuhai Amicro Semiconductor Co Ltd
Original Assignee: Zhuhai Amicro Semiconductor Co Ltd
Application filed by Zhuhai Amicro Semiconductor Co Ltd
Priority to CN201911137367.7A
Publication of CN112890681A

Classifications

    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24 Floor-sweeping machines, motor-driven
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011 Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00 Speech recognition
    • G10L15/22 Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G10L2015/223 Execution procedure of a spoken command
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04B TRANSMISSION
    • H04B5/00 Near-field transmission systems, e.g. inductive or capacitive transmission systems
    • H04B5/70 Near-field transmission systems specially adapted for specific purposes
    • H04B5/72 Near-field transmission systems specially adapted for specific purposes for local intradevice communication
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/80 Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
    • H04W8/00 Network data management
    • H04W8/005 Discovery of network devices, e.g. terminals

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Computational Linguistics (AREA)
  • Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Multimedia (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Electric Vacuum Cleaner (AREA)

Abstract

The invention discloses a voice control method, a Bluetooth headset, a cleaning robot and a control system. The voice control method comprises the following steps: controlling the cleaning robot to search for the ip address of the Bluetooth headset, and then controlling the cleaning robot to receive navigation messages at time intervals, each navigation message being the result of the Bluetooth headset parsing and packaging a voice command issued by the user; each time the cleaning robot receives a navigation message, setting a sub-target position according to the signal strength of the navigation message and the coordinates matched with the ip address of the navigation message, and calculating the displacement vector between the cleaning robot and the sub-target position; and controlling the cleaning robot to advance along the working path corresponding to the displacement vector, and executing a planned cleaning operation after the cleaning robot determines, from the signal strength of the navigation message and the coordinates matched with the ip address of the navigation message, that it has entered the user's current position. The method improves the efficiency and the degree of intelligence with which the user issues voice commands to the cleaning robot.

Description

Voice control method, Bluetooth headset, cleaning robot and control system
Technical Field
The invention belongs to the field of intelligent household appliances, and particularly relates to a voice control method based on Bluetooth interconnection with a cleaning robot, a Bluetooth headset, a cleaning robot, and a control system based on Bluetooth interconnection with a cleaning robot.
Background
Cleaning robots have gradually become household appliances frequently used in ordinary homes. In prior-art applications, a planning-type cleaning robot generally performs planned cleaning, but the user sometimes feels that the area around his or her current position needs cleaning and hopes that the cleaning robot can work at that target position immediately, so the prior art does not fully meet the user's requirement for an intelligent cleaning robot.
Disclosure of Invention
The present invention is proposed in view of the intelligence requirements not met by the prior art, and the technical scheme is as follows:
a voice control method based on Bluetooth cleaning robot interconnection is characterized in that a user establishes wireless interconnection with a cleaning robot through a Bluetooth headset worn by the user, and the voice control method comprises the following steps: controlling the cleaning robot to search an ip address of the Bluetooth headset, and then controlling the cleaning robot to receive navigation messages at intervals, wherein the navigation messages are the results of analyzing and packaging a voice command sent by a user by the Bluetooth headset; setting sub-target positions according to the signal strength of the navigation message and the coordinates matched with the ip address of the navigation message after the cleaning robot receives the navigation message once, and calculating a displacement vector between the cleaning robot and the sub-target positions; each sub-target position is a position in a circle with the preset distance as the radius and the current position of a user wearing the Bluetooth headset as the center; and controlling the cleaning robot to advance along the working path corresponding to the displacement vector, and executing the planning type cleaning operation after the cleaning robot judges that the cleaning robot enters the current position of the user according to the signal intensity of the navigation message and the coordinate matched with the physical address of the navigation message. According to the technical scheme, command communication contact of the user and the cleaning robot in batches is established in the shared network established by the Bluetooth headset, the intelligent degree of the user for issuing voice instructions to the cleaning robot is improved, the cleaning robot can walk to the user more stably and gradually, and therefore the user voice command cleaning operation experience is met.
Furthermore, the ip address contained in each navigation message continuously sent by the Bluetooth headset and the ip address of the Bluetooth headset belong to the same local-area-network segment, and these ip addresses are all pre-configured within that segment and matched with position coordinates inside the effective broadcast area of the Bluetooth headset. The cleaning robot sends a response signal to the Bluetooth headset each time it receives a navigation message, so that the Bluetooth headset either adjusts the ip address of the navigation message broadcast after the next time interval, or leaves the navigation message broadcast after the next time interval unchanged while the message contains different ip addresses. The response signal carries the signal strength information of the navigation message, and this information is associated with the ip address the cleaning robot extracts from the navigation message each time. In this technical scheme, the Bluetooth local area network is established first, so that the communication connection between the Bluetooth headset and the cleaning robot is set up within it and an effective ip address for the cleaning robot's navigation is allocated, ensuring that after every received navigation message the cleaning robot can plan toward the target position matched with the effective ip address and the signal strength. This is intelligent and convenient.
Further, the navigation coordinate system in the cleaning robot's currently constructed map and the coordinate system in which the sub-target positions lie keep the same coordinate-axis directions, and when the cleaning robot turns, the coordinate system of the sub-target positions is transformed along with the cleaning robot's navigation coordinate system. By maintaining this real-time coordinate-system transformation, the technical scheme reduces the risk of losing orientation within the same local-area-network area and also makes the displacement vector easier to calculate.
Furthermore, the navigation message comprises a wake-up instruction and a task instruction parsed from a single voice command issued by the user. The wake-up instruction is used for controlling the cleaning robot to enter the working state from the standby state, or to suspend execution of the current cleaning task, and then to extract the signal strength of the navigation message and the coordinates matched with the ip address of the navigation message; the task instruction is used for setting the sub-target position according to the signal strength of the navigation message and the coordinates matched with the ip address of the navigation message, calculating the distances of the displacement vector along the X-axis and Y-axis directions, and then controlling the cleaning robot to advance along the working path corresponding to the displacement vector. In this technical scheme, the cleaning robot is first woken up and then issued commands, thereby achieving guided control of the cleaning robot.
Further, after the displacement vector between the cleaning robot and the sub-target position is calculated, the steering movement of the cleaning robot and the coordinate transformation of the coordinate system in which the sub-target position lies are controlled according to the deviation angle, contained in the displacement vector, of the sub-target position relative to the cleaning robot's current orientation; after the cleaning robot turns, the coordinate system of the sub-target positions is transformed along with the cleaning robot's navigation coordinate system, so that the navigation coordinate system in the real-time map constructed by the cleaning robot and the coordinate system of the sub-target positions share the same coordinate-axis directions. This simplifies the intermediate coordinate-conversion work during navigation, improves the efficiency of the robot's position determination, and thus improves the responsiveness of robot navigation.
Further, when the cleaning robot meets a wall or an obstacle while traveling along the working path corresponding to the displacement vector, the cleaning robot is controlled to walk along its edge until the latest navigation message is received; if the cleaning robot is trapped by other obstacles while walking along the edge of the obstacle, the user is prompted by a voice broadcast. This allows the cleaning robot to free itself from obstacles effectively and helps it clean efficiently.
A Bluetooth headset comprises a first microphone, a command packaging module and a broadcasting module arranged under the headset housing. The first microphone is used for acquiring the voice information uttered by the user and parsing it into a voice instruction; the command packaging module is used for packaging the voice instruction parsed by the first microphone into a navigation message; the broadcasting module is used for broadcasting, at a time interval, the navigation message packaged by the command packaging module for the voice command issued by the user. The Bluetooth headset stores a database of voice instructions and the configured time interval at which the navigation message is broadcast each time. This technical scheme allows the user's voice information to be broadcast, so that a mobile terminal within effective range that receives the navigation message is controlled by the user's voice instruction, improving the user experience.
Further, the Bluetooth headset further comprises a second microphone, which is closer to the user's ear than the first microphone and is used for generating sound waves that cancel the noise collected by the first microphone, improving the accuracy of the messages broadcast by the Bluetooth headset.
Furthermore, the ip address contained in each navigation message continuously sent by the Bluetooth headset and the ip address of the Bluetooth headset belong to the same local-area-network segment, and these ip addresses are all pre-configured within that segment and matched with position coordinates inside the effective broadcast area of the Bluetooth headset. After receiving a navigation message each time, an external cleaning robot sends a response signal to the Bluetooth headset, so that the Bluetooth headset either adjusts the ip address of the navigation message broadcast after the next time interval, or leaves the navigation message broadcast after the next time interval unchanged while the message contains different ip addresses. The response signal carries the signal strength information of the navigation message, and this information is associated with the ip address the cleaning robot extracts from the navigation message each time. In this technical scheme, the user's voice information is converted into a local-area-network broadcast message with navigation-guiding significance, so that the voice information broadcast by the Bluetooth headset gives a navigation capability to the mobile terminal that receives the navigation message within the local area network.
A cleaning robot based on Bluetooth interconnection comprises a message receiving module, a matching-positioning module and a navigation module. The message receiving module is used for controlling the cleaning robot to search for the ip address of the Bluetooth headset and then controlling the cleaning robot to receive, at time intervals, the navigation messages broadcast by the broadcasting module. The matching-positioning module is used, each time the message receiving module receives a navigation message, for setting a sub-target position according to the signal strength of the navigation message and the coordinates matched with the ip address of the navigation message and calculating the displacement vector between the cleaning robot and the sub-target position; each sub-target position lies within a circle centered on the current position of the user wearing the Bluetooth headset with the preset distance as its radius. The navigation module is used for controlling the cleaning robot to advance along the working path corresponding to the displacement vector calculated by the matching-positioning module, and for controlling the cleaning robot to execute a planned cleaning operation after it is determined, from the signal strength of the navigation message and the coordinates matched with the physical address of the navigation message, that the cleaning robot has entered the user's current position. In this technical scheme, the cleaning robot uses the Bluetooth headset to establish a progressive command-communication link from the user to the cleaning robot, which improves the efficiency and the degree of intelligence with which the user issues voice instructions to the cleaning robot, makes the cleaning robot's path planning more stable, and improves the user experience.
Further, after the displacement vector between the cleaning robot and the sub-target position is calculated, the navigation module is configured to control the steering movement of the cleaning robot and the coordinate transformation of the coordinate system in which the sub-target position lies according to the deviation angle, contained in the displacement vector, of the sub-target position relative to the cleaning robot's current orientation; after the cleaning robot turns, the coordinate system of the sub-target positions is transformed along with the cleaning robot's navigation coordinate system, so that the navigation coordinate system in the real-time map constructed by the cleaning robot and the coordinate system of the sub-target positions share the same coordinate-axis directions. This simplifies the intermediate coordinate-conversion work during navigation, improves the efficiency of the robot's position determination, and thus improves the responsiveness of robot navigation.
Furthermore, the navigation message comprises a wake-up instruction and a task instruction parsed from a single voice command issued by the user. The wake-up instruction is used for controlling the cleaning robot to enter the working state from the standby state, or to suspend execution of the current cleaning task, and then to extract the signal strength of the navigation message and the coordinates matched with the ip address of the navigation message; the task instruction is used for setting the sub-target position according to the signal strength of the navigation message and the coordinates matched with the ip address of the navigation message, calculating the distances of the displacement vector along the X-axis and Y-axis directions, and then controlling the cleaning robot to advance along the working path corresponding to the displacement vector. The cleaning robot sends a response signal to the Bluetooth headset each time it receives a navigation message, so that the Bluetooth headset either adjusts the ip address of the navigation message broadcast after the next time interval, or leaves the navigation message broadcast after the next time interval unchanged while the message contains different ip addresses; the response signal carries the signal strength information of the navigation message, and this information is associated with the ip address the cleaning robot extracts from the navigation message each time. In this technical scheme, the cleaning robot is first woken up and then issued commands, and guided control of the cleaning robot is achieved step by step according to the ip address and signal strength characteristics of the navigation messages, which suits the user's voice-control habits.
A control system based on Bluetooth interconnection with a cleaning robot comprises the above Bluetooth headset and cleaning robot. The system is intelligent and convenient, and brings more convenience to home life.
Drawings
Fig. 1 is a basic flowchart of a bluetooth-based voice control method for interconnection of cleaning robots according to an embodiment of the present invention.
Fig. 2 is a virtual module framework diagram of a control system based on interconnection of bluetooth cleaning robots according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
In order to meet the user's requirement for intelligent control of the cleaning robot and to achieve a summoning voice-control effect, an embodiment of the invention provides a voice control method based on Bluetooth interconnection with the cleaning robot, in which the user establishes a wireless interconnection with the cleaning robot through a Bluetooth headset worn by the user. As shown in fig. 1, the voice control method includes: step S1, controlling the cleaning robot to search for the ip address of the Bluetooth headset and controlling the cleaning robot to receive navigation messages at time intervals, and then proceeding to step S2. Each navigation message is the result of the Bluetooth headset parsing and packaging a voice command issued by the user.
Specifically, step S1 comprises the following. Step S11, after the cleaning robot is powered on and started, its built-in wifi module automatically searches for the ip address broadcast externally by the Bluetooth headset, and then step S12 is entered. Step S12, when the user utters a voice requesting that the machine clean the current area once, the Bluetooth headset parses the user's voice into a voice instruction, and then step S13 is entered. Step S13, the Bluetooth headset packages the voice instruction parsed in step S12 into a navigation message, and then step S14 is entered. Step S14, the Bluetooth headset is controlled to broadcast the pre-packaged navigation message, and then step S15 is entered. Step S15, the cleaning robot is controlled to keep searching for the ip addresses contained in the navigation messages broadcast externally by the Bluetooth headset; after receiving a navigation message, the cleaning robot sends the Bluetooth headset a response signal based on the navigation message under the current ip address, and then step S16 is entered. Step S16, the Bluetooth headset adjusts the ip address of the navigation message broadcast after the next time interval, or keeps the navigation message broadcast after the next time interval unchanged while the message contains different ip addresses; the method then returns to step S14, so that the cleaning robot is controlled to receive navigation messages at time intervals.
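The S14 to S16 exchange can be pictured as the simulation-style loop below. It assumes the ip addresses come from a pre-configured pool ordered from the edge of the effective broadcast area toward the headset; the interval, the helper callables (send, wait_for_response) and the adjustment policy are illustrative assumptions, not details fixed by the embodiment.

```python
import time

INTERVAL_S = 1.0   # assumed broadcast time interval (not specified by the embodiment)

def headset_broadcast_loop(ip_pool, send, wait_for_response, rounds=10):
    """Illustrative loop for steps S14-S16.

    ip_pool is assumed to be ordered from the edge of the effective broadcast
    area toward the headset, so advancing in the list moves the matched
    sub-target position closer to the user.
    """
    index, last_rssi = 0, None
    for _ in range(rounds):
        send({"ip": ip_pool[index], "payload": "navigation message"})   # step S14: broadcast
        response = wait_for_response()                                  # step S15: robot's ack with RSSI
        if last_rssi is None or response["rssi"] >= last_rssi:
            index = min(index + 1, len(ip_pool) - 1)                    # step S16: move the ip toward the user
        last_rssi = response["rssi"]
        time.sleep(INTERVAL_S)
```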
Step S2: each time the cleaning robot receives a navigation message, a sub-target position is set according to the signal strength of the navigation message and the coordinates matched with the ip address of the navigation message, the displacement vector between the cleaning robot and the sub-target position is calculated, and step S3 is then entered. Each sub-target position lies within a circle centered on the current position of the user wearing the Bluetooth headset with the preset distance as its radius; all sub-target positions lie inside the effective broadcast area of the Bluetooth headset and are associated with the received message signal strength. Preferably, the ip address of the Bluetooth headset is pre-allocated within the same local area network, the ip address contained in each navigation message continuously sent by the Bluetooth headset and the ip address of the Bluetooth headset belong to the same local-area-network segment, and these ip addresses are all pre-configured within that segment and matched with position coordinates inside the effective broadcast area of the Bluetooth headset. The matched position coordinates are also marked in the map constructed by the cleaning robot in real time and correspond to the coordinates of the sub-target positions. The local area network is a personal shared network built by the Bluetooth headset and has a certain effective broadcast area; in this embodiment, the starting position of the cleaning robot needs to lie within the effective broadcast area.
Building on steps S1 and S2, the cleaning robot sends a response signal to the Bluetooth headset each time it receives a navigation message, so that the Bluetooth headset either adjusts the ip address of the navigation message broadcast after the next time interval, or leaves the navigation message broadcast after the next time interval unchanged while the message contains different ip addresses. The response signal carries the signal strength information of the navigation message, and this information is associated with the ip address the cleaning robot extracts from the navigation message each time, whether the message contains only a single ip address or a matched ip address is extracted from among several in one message. In this embodiment, the Bluetooth local area network is established first, so that the communication connection between the Bluetooth headset and the cleaning robot is set up within it and an effective ip address for the cleaning robot's navigation is allocated; this guarantees that after every received navigation message the cleaning robot can plan toward the target position matched with the effective ip address and the message signal strength, making the cleaning robot's moving path regular and more stable.
When the Bluetooth headset needs to adjust the ip address of the navigation message broadcast after the next time interval, the adjusted ip address is one that is selected according to the response signal and matched with a position coordinate in the effective broadcast area; the adjusted ip address is also associated with the position coordinate of the cleaning robot's next sub-target. Every sub-target position the cleaning robot plans to move to lies within the effective broadcast area, each sub-target position has a one-to-one mapping relationship with the adjusted ip address, the coordinate value of the sub-target position comes from a position coordinate in the effective broadcast area, and the position coordinates in the effective broadcast area have a one-to-one mapping relationship with the ip addresses of the navigation messages. The cleaning robot is at a different sub-target position each time it receives a navigation message, and the signal strength of the navigation message received at each sub-target position differs, so the received message signal strength information must be added to the response signal, which is equivalent to the cleaning robot feeding relative-distance information back to the Bluetooth headset. This control method helps direct the sub-target position of the cleaning robot's next move by adjusting the ip address of the broadcast navigation message, so that the Bluetooth headset shifts the sub-target position matched with the broadcast ip address toward the direction in which the cleaning robot receives the navigation message with increasing signal strength.
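The one-to-one mapping described above can be represented by an ordinary lookup table, as in the sketch below; the concrete addresses and coordinates are invented purely for illustration and are not values from the patent.

```python
# Hypothetical ip-address <-> position-coordinate table for the effective broadcast
# area; the addresses and coordinates are made-up examples, not values from the patent.
IP_TO_COORD = {
    "192.168.3.11": (0.0, 0.0),
    "192.168.3.12": (0.5, 0.0),
    "192.168.3.13": (1.0, 0.5),
    "192.168.3.14": (1.5, 1.0),   # closest to the user wearing the headset
}

def sub_target_for(ip_address):
    # One-to-one mapping: each pre-configured ip in the local segment matches
    # exactly one position coordinate inside the effective broadcast area.
    return IP_TO_COORD[ip_address]

def sub_target_sequence(ips_in_order_of_rising_rssi):
    # Because the headset steers the broadcast ip toward stronger reported signal
    # strength, the matched sub-targets form a path that approaches the user.
    return [sub_target_for(ip) for ip in ips_in_order_of_rising_rssi]
```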
When the Bluetooth headset does not need to adjust the ip address of the navigation message broadcast after the next time interval, the Bluetooth headset repeatedly broadcasts the same navigation message at the time interval, and that navigation message contains different ip addresses. Each time the cleaning robot receives the navigation message, it extracts a matched ip address from the message according to the signal strength it measures, and uses that ip address as the basis for planning the sub-target position of its next walk. In this embodiment, the ip address extracted by the cleaning robot each time is an ip address matched with a position coordinate in the effective broadcast area, every planned sub-target position lies within the effective broadcast area, and the sub-target positions have a one-to-one mapping relationship with the ip addresses extracted each time. The signal strength of the navigation message received at each sub-target position differs, and the sub-target position matched with the ip address extracted by the cleaning robot tends toward the position at which the signal strength of the navigation message increases. Each time the cleaning robot sends a response signal to the Bluetooth headset, the received message signal strength information is added, which is equivalent to the cleaning robot feeding relative-distance information back to the Bluetooth headset.
For step S2, after the displacement vector between the cleaning robot and the sub-target position is calculated, when the cleaning robot uses the displacement vector to perform a steering operation, the coordinate system of the sub-target position set in the previous step undergoes a translation and rotation transformation following the cleaning robot's navigation coordinate system, ensuring that the coordinate axes of the navigation coordinate system in the cleaning robot's currently constructed map and of the coordinate system of the sub-target position point in the same directions. Specifically, according to the deviation angle, contained in the displacement vector, of the sub-target position relative to the cleaning robot's current orientation, this embodiment controls the steering movement of the cleaning robot so that it pairs better with the Bluetooth headset, substitutes the deviation angle into a conventional coordinate-system change-of-frame matrix, and, combined with the cleaning robot's positioning coordinate in the navigation coordinate system of the real-time constructed map, calculates the coordinate value of the sub-target position in the transformed coordinate system. This makes it convenient for the robot to perform positioning and map-marking while moving along the displacement vector and improves the responsiveness of robot navigation. In this embodiment, by executing step S2, the coordinate system of the continuously updated sub-target positions is translated and rotated into the navigation coordinate system of the cleaning robot's currently constructed map without complex comparison and analysis, which greatly reduces the robot's computing load, improves its computation efficiency and decision speed, and also reduces the risk of losing orientation when navigating within the same local-area-network area.
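Written out, the translation-plus-rotation referred to here is the standard rigid-body change of frame. With the deviation angle denoted θ, the robot's positioning coordinate in the navigation frame denoted (x_r, y_r), and a sub-target coordinate denoted (x_s, y_s), the sub-target can be expressed in the navigation frame as (x', y') below; this is a generic formulation, not a formula quoted from the patent.

```latex
\begin{pmatrix} x' \\ y' \end{pmatrix}
=
\begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}
\begin{pmatrix} x_s \\ y_s \end{pmatrix}
+
\begin{pmatrix} x_r \\ y_r \end{pmatrix}
```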
Step S3: the cleaning robot is controlled to advance along the working path corresponding to the displacement vector calculated in step S2, and it receives one navigation message each time it reaches a sub-target position. While this step is executed, if step S1 has received the navigation message several times and obtained the coordinate values of sub-target positions matched with several ip addresses, i.e. the time the cleaning robot walks along the displacement vector is longer than the time interval, the method returns to step S2 to set a new sub-target position and compute a new displacement vector. Alternatively, if when the cleaning robot reaches a sub-target position step S1 has received the navigation message only once and obtained only the coordinate value of the sub-target position matched with that ip address, i.e. the walking time along the displacement vector equals the time interval, the method returns to step S1 to continue receiving one new navigation message.
In step S3, while the cleaning robot advances along the working path corresponding to the displacement vector, it is determined in real time whether the cleaning robot has entered the user's current position, according to the signal strength of the navigation message and the coordinates matched with the ip address of the navigation message; that is, whether the signal strength of the navigation message received by the cleaning robot at its current position lies within the preset strength range and the coordinate matched with the ip address is at a sub-target position. If both conditions are satisfied, the cleaning robot determines that it has entered the user's current position and is then controlled to start the planned cleaning operation; otherwise, it continues to advance along the working path corresponding to the displacement vector toward the newly set sub-target position.
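The real-time arrival check in step S3 amounts to a two-condition predicate. A minimal sketch follows; the RSSI range and the coordinate tolerance are assumed parameters introduced here for illustration, not values given in the patent.

```python
import math

def has_reached_user(rssi_dbm, robot_xy, sub_target_xy,
                     rssi_min=-45.0, rssi_max=0.0, position_tol_m=0.2):
    """Arrival test for step S3 (threshold values are illustrative assumptions).

    The robot is considered to have entered the user's current position only when
    (1) the navigation message's signal strength lies inside the preset range, and
    (2) the coordinate matched to the message's ip address has been reached.
    """
    strong_enough = rssi_min <= rssi_dbm <= rssi_max
    at_sub_target = math.dist(robot_xy, sub_target_xy) <= position_tol_m
    return strong_enough and at_sub_target
```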
In the foregoing steps, the navigation message carries a timestamp, signal strength information, and the serial number of the sending device. The navigation message also comprises a wake-up instruction and a task instruction parsed from a single voice command issued by the user. The wake-up instruction is used for controlling the cleaning robot to switch from the standby state to the working state, or to suspend execution of the current cleaning task, and then to extract the signal strength of the navigation message and the coordinates matched with the ip address of the navigation message. The task instruction is used for setting the sub-target position according to the signal strength of the navigation message and the coordinates matched with the ip address of the navigation message, and for calculating the displacement vector between the cleaning robot's currently marked position in the map and the set sub-target position, specifically the distance of the displacement vector along the X-axis direction, its distance along the Y-axis direction, and the vector angle information; the cleaning robot is then controlled to advance along the working path corresponding to the displacement vector. For example, when the user wearing the Bluetooth headset utters the voice "one machine", the Bluetooth headset generates the wake-up instruction accordingly, and when the user then utters "cleaning at this place", the Bluetooth headset generates the task instruction accordingly and broadcasts these instructions to the cleaning robot named "one machine". In this way, the cleaning robot is first woken up by the user's voice command relayed through the Bluetooth headset and then issued commands, achieving guided control of the cleaning robot.
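Laid out as a record, the fields this paragraph attributes to a navigation message might look like the dataclass below; the field names and types are guesses for illustration, since the patent does not define a wire format.

```python
from dataclasses import dataclass

@dataclass
class NavigationMessage:
    """Illustrative layout of one navigation message (field names are assumed)."""
    timestamp: float          # time at which the message was generated
    rssi_dbm: float           # signal strength information
    sender_serial: str        # serial number of the sending device (the headset)
    ip_address: str           # pre-configured ip matched to a position coordinate
    wake_instruction: str     # parsed from the wake-up phrase, e.g. "one machine"
    task_instruction: str     # parsed from the task phrase, e.g. "cleaning at this place"
```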
Preferably, when the cleaning robot meets a wall or an obstacle while traveling along the working path corresponding to the displacement vector, it is controlled to walk along the edge, that is, around the obstacle, until it reaches one of the sub-target positions and receives the latest navigation message. If, while walking around the obstacle along its edge, the cleaning robot is trapped by other obstacles and its drive motor stalls so that it cannot operate, the cleaning robot prompts the user through a voice broadcast; after hearing the prompt through the Bluetooth headset, the user goes to the corresponding coordinate position to free the trapped cleaning robot. This allows the cleaning robot to free itself from obstacles effectively and helps it clean efficiently.
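The obstacle-handling rule above can be sketched as a small loop; the robot interface and the voice-prompt callback are hypothetical stand-ins, not components defined by the embodiment.

```python
def follow_edge_until_next_message(robot, announce):
    """Sketch of the edge-following fallback (robot/announce are hypothetical interfaces).

    While blocked by a wall or obstacle, keep walking along its edge until the
    latest navigation message arrives; if the drive motor stalls because the
    robot is trapped by other obstacles, prompt the user by voice broadcast.
    """
    while not robot.has_new_navigation_message():
        if robot.drive_motor_stalled():
            announce("Cleaning robot is stuck, please come to free it.")
            return False          # wait for the user to rescue the robot
        robot.step_along_edge()
    return True                   # resume normal navigation toward the sub-target
```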
In summary, this embodiment establishes a staged command-communication link between the user and the cleaning robot within the shared network built by the Bluetooth headset, which raises the degree of intelligence with which the user issues voice commands to the cleaning robot and lets the cleaning robot walk toward the user more steadily and progressively, thereby satisfying the user's experience of commanding the cleaning operation by voice.
To implement the above embodiment, the invention further provides a Bluetooth headset which, as shown in fig. 2, includes a first microphone, a command packaging module and a broadcasting module arranged under the headset housing. The first microphone is used for acquiring the voice information uttered by the user and parsing it into a voice instruction, where parsing means converting the analog voice signal into a discrete digital command through an analog-to-digital converter. The command packaging module is used for packaging the voice instruction, parsed by the first microphone from a single user utterance, into a navigation message. The broadcasting module is used for broadcasting, at a time interval, the navigation message packaged by the command packaging module for that single voice command issued by the user. The memory inside the Bluetooth headset stores a database of voice instructions and the configured time interval at which the navigation message is broadcast each time, and the database of voice instructions includes a matching table between user utterances and command instructions. This embodiment allows the user's voice information to be broadcast, so that a mobile terminal within effective range that receives the navigation message is controlled by the user's voice instruction, improving the user experience.
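A minimal sketch of the matching table and the packaging step described here follows; the dictionary lookup, the JSON packaging and all names are assumptions made for illustration, as the embodiment does not specify a data format.

```python
import json
import time

# Hypothetical matching table between user utterances and command instructions.
VOICE_COMMAND_DB = {
    "one machine": "WAKE_UP",
    "cleaning at this place": "CLEAN_HERE",
}

def pack_navigation_message(utterance, ip_address, sender_serial="BT-HEADSET-01"):
    """Pack one parsed voice instruction into a navigation message (illustrative format)."""
    instruction = VOICE_COMMAND_DB.get(utterance)
    if instruction is None:
        return None                         # utterance not found in the stored database
    return json.dumps({
        "timestamp": time.time(),
        "sender_serial": sender_serial,
        "ip": ip_address,
        "instruction": instruction,
    })
```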
Preferably, the Bluetooth headset further comprises a second microphone, which is closer to the user's ear than the first microphone and is used for generating sound waves that cancel the noise picked up by the first microphone. The second microphone enhances the sound coming from the user's direction by adjusting the time-delay compensation weights of the beamformer and attenuates sound from other directions. In particular, noise arriving from the outside is received by the second microphone and then passed to the acoustic measurement unit inside the Bluetooth headset for noise-spectrum analysis, so that sound waves with the opposite phase and the same amplitude as the noise are generated to cancel it, improving the accuracy of the messages broadcast by the Bluetooth headset.
Preferably, the ip address of the Bluetooth headset is pre-allocated within the same local area network, and the ip address contained in each navigation message continuously sent by the Bluetooth headset and the ip address of the Bluetooth headset belong to the same local-area-network segment; these ip addresses are all pre-configured within that segment and matched with position coordinates inside the effective broadcast area of the Bluetooth headset, and a matching table of ip addresses and position coordinates is stored in the cleaning robot. The cleaning robot sends a response signal to the Bluetooth headset each time it receives a navigation message, so that the Bluetooth headset either adjusts the ip address of the navigation message broadcast after the next time interval, or leaves the navigation message broadcast after the next time interval unchanged while the message contains different ip addresses. The response signal carries the signal strength information of the navigation message, and this information is associated with the ip address the cleaning robot extracts from the navigation message each time. In this embodiment, the user's voice information is converted into a local-area-network broadcast message with navigation-guiding significance, so that the voice information broadcast by the Bluetooth headset gives a navigation capability to the mobile terminal that receives the navigation message within the local area network; in particular, a new ip address for navigation is planned according to the fed-back message signal strength information.
In order to implement the above embodiments, the present invention further provides a cleaning robot based on bluetooth interconnection, and as shown in fig. 2, the cleaning robot includes a message receiving module, a matching positioning module, and a navigation module.
The message receiving module is used for controlling the cleaning robot to search for the ip address of the Bluetooth headset and then controlling the cleaning robot to receive, at time intervals, the navigation messages broadcast by the broadcasting module; each navigation message is the result of the Bluetooth headset parsing and packaging a voice command issued by the user. After receiving a navigation message each time, the cleaning robot sends a response signal to the Bluetooth headset through the message receiving module, so that the Bluetooth headset either adjusts the ip address of the navigation message broadcast after the next time interval, or leaves the navigation message broadcast after the next time interval unchanged while the message contains different ip addresses.
The matching-positioning module is used, each time the message receiving module receives a navigation message, for setting a sub-target position according to the signal strength of the navigation message and the coordinates matched with the ip address of the navigation message and for calculating the displacement vector between the cleaning robot and the sub-target position. Each sub-target position lies within a circle centered on the current position of the user wearing the Bluetooth headset with the preset distance as its radius. Every sub-target position the cleaning robot plans to move to lies within the effective broadcast area, each sub-target position has a one-to-one mapping relationship with the ip address adjusted each time, the coordinate values of the sub-target positions come from position coordinates in the effective broadcast area, and the position coordinates in the effective broadcast area have a one-to-one mapping relationship with the ip addresses of the navigation messages; the sub-target position matched with the planned ip address identified by the cleaning robot tends toward the direction in which the signal strength of the navigation message increases.
The navigation module is used for controlling the cleaning robot to advance along the working path corresponding to the displacement vector calculated by the matching-positioning module, and for controlling the cleaning robot to execute a planned cleaning operation after it is determined, from the signal strength of the navigation message and the coordinates matched with the physical address of the navigation message, that the cleaning robot has entered the user's current position. While the cleaning robot advances along the working path corresponding to the displacement vector, the navigation module determines in real time whether the cleaning robot has entered the user's current position according to the signal strength of the navigation message and the coordinates matched with the ip address of the navigation message, that is, whether the signal strength of the navigation message received at the current position reaches the preset strength range and whether the coordinate matched with the ip address is at a sub-target position. If so, the cleaning robot confirms that it has entered the user's current position and is then controlled to start the planned cleaning operation; if not, it continues to advance along the working path corresponding to the displacement vector output by the matching-positioning module.
When the cleaning robot's walking time along the displacement vector is longer than the time interval, the message receiving module receives the navigation message several times and obtains the coordinate values of sub-target positions matched with several ip addresses, and the navigation module keeps seeking new sub-target positions using the displacement vectors calculated by the matching-positioning module. When the walking time along the displacement vector equals the time interval, the message receiving module receives the navigation message only once and obtains only the coordinate value of the sub-target position matched with that ip address, and the message receiving module is controlled to continue receiving one new navigation message.
The cleaning robot formed by the message receiving module, the matching positioning module and the navigation module establishes progressive communication command contact of a user to the cleaning robot by means of the Bluetooth headset, improves the efficiency and the intelligent degree of the user for issuing voice instructions to the cleaning robot, enables the path planning of the cleaning robot to be more stable, and improves the user experience.
Preferably, while calculating the displacement vector between the cleaning robot and the sub-target position, the navigation module is configured to control the steering movement of the cleaning robot, so that it pairs better with the Bluetooth headset, according to the deviation angle, contained in the displacement vector, of the sub-target position relative to the cleaning robot's current orientation. The deviation angle is substituted into a conventional coordinate-system change-of-frame matrix and, combined with the cleaning robot's positioning coordinate in the navigation coordinate system of the real-time constructed map, the coordinate system to which the newly set sub-target position belongs is translated and rotated so that the navigation coordinate system of the cleaning robot's current map and the coordinate system of the sub-target position keep the same coordinate-axis directions. The coordinate values of the sub-target position are transformed by the corresponding translation amount and rotation angle, giving the coordinate values of the sub-target position in the navigation coordinate system of the currently constructed map; the specific translation and rotation can be computed with the existing coordinate translation and rotation formulas and are not repeated here. The coordinate values of the sub-target position in the transformed coordinate system, the distance of the displacement vector along the X-axis direction and its distance along the Y-axis direction are then calculated. This improves the navigation module's computation efficiency and decision speed, further improves the robot's responsiveness, and reduces the risk of losing orientation when the robot navigates within the same local-area-network area.
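As a numeric companion to the change-of-frame formula sketched earlier, the translation-plus-rotation step described in this paragraph can be computed as below; the function is a generic 2-D rigid transform written for illustration, not code from the patent.

```python
import math

def sub_target_in_nav_frame(sub_target_xy, robot_xy, deviation_angle_rad):
    """Express a sub-target coordinate in the robot's navigation coordinate system.

    Generic 2-D translation-plus-rotation sketch: rotate by the deviation angle of
    the sub-target relative to the robot's current orientation, then translate by
    the robot's positioning coordinate in the real-time map's navigation frame.
    """
    x, y = sub_target_xy
    xr, yr = robot_xy
    cos_t, sin_t = math.cos(deviation_angle_rad), math.sin(deviation_angle_rad)
    x_nav = cos_t * x - sin_t * y + xr
    y_nav = sin_t * x + cos_t * y + yr
    return x_nav, y_nav

def displacement_components(robot_xy, sub_target_nav_xy):
    # X-axis distance, Y-axis distance and vector angle of the displacement vector.
    dx = sub_target_nav_xy[0] - robot_xy[0]
    dy = sub_target_nav_xy[1] - robot_xy[1]
    return dx, dy, math.atan2(dy, dx)
```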
Preferably, the navigation message includes a wake-up instruction and a task instruction parsed from a single voice command issued by the user. The wake-up instruction is used for controlling the cleaning robot to enter the working state from the standby state, or to suspend execution of the current cleaning task, and then to extract the signal strength of the navigation message and the coordinates matched with the ip address of the navigation message; the task instruction is used for setting the sub-target position according to the signal strength of the navigation message and the coordinates matched with the ip address of the navigation message, calculating the distances of the displacement vector along the X-axis and Y-axis directions, and then controlling the cleaning robot to advance along the working path corresponding to the displacement vector. For example, when the user wearing the Bluetooth headset utters the voice "one machine", the Bluetooth headset generates the wake-up instruction accordingly, and when the user then utters "cleaning at this place", the Bluetooth headset generates the task instruction accordingly and broadcasts these instructions to the cleaning robot named "one machine". In this way, the cleaning robot is first woken up and then issued commands, and guided control of the cleaning robot is achieved step by step, suiting the user's voice-control habits.
It is worth mentioning that the cleaning robot sends a response signal to the bluetooth headset after receiving the navigation message each time, so that the bluetooth headset adjusts the ip address of the navigation message broadcasted after one time interval, or the bluetooth headset does not change the navigation message broadcasted after one time interval but the navigation message includes different ip addresses; the response signal carries signal strength information of the navigation message, and the signal strength information is associated with an ip address extracted from the navigation message by the cleaning robot each time. And the cleaning robot executes the guide control of the Bluetooth headset according to the ip address and the signal intensity characteristic of the navigation message.
To implement the above embodiments, the invention further provides a voice control system for a cleaning robot based on Bluetooth interconnection, which comprises the above Bluetooth headset and cleaning robot; the system is intelligent and convenient, and brings more convenience to home life. The functional effects of the system have been disclosed in the virtual-module embodiments above and are not repeated here.
For the above embodiments, for simplicity of description, the embodiments are described as a series of acts, but it should be understood by those skilled in the art that the embodiments are not limited by the order of acts described, as some steps may be performed in other orders or simultaneously according to the embodiments of the invention. Furthermore, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.

Claims (13)

1. A voice control method based on Bluetooth cleaning robot interconnection is characterized by comprising the following steps:
controlling the cleaning robot to search an ip address of the Bluetooth headset, and then controlling the cleaning robot to receive navigation messages at intervals, wherein the navigation messages are the results of the Bluetooth headset analyzing and packaging a voice command sent by a user;
setting sub-target positions according to the signal strength of the navigation message and the coordinates matched with the ip address of the navigation message after the cleaning robot receives the navigation message once, and calculating a displacement vector between the cleaning robot and the sub-target positions;
and controlling the cleaning robot to advance along the working path corresponding to the displacement vector, and executing the planning type cleaning operation after the cleaning robot judges that the cleaning robot enters the current position of the user according to the signal intensity of the navigation message and the coordinate matched with the ip address of the navigation message.
2. The voice control method according to claim 1, wherein an ip address included in the navigation message continuously sent by the bluetooth headset and an ip address of the bluetooth headset belong to a same lan segment, and the ip addresses are all pre-configured in the same lan segment and are matched with position coordinates in an effective broadcast area of the bluetooth headset;
the cleaning robot sends a response signal to the Bluetooth headset after receiving the navigation message every time, so that the Bluetooth headset adjusts the ip address of the navigation message broadcast and sent after one time interval, or the Bluetooth headset does not change the navigation message broadcast and sent after one time interval but the navigation message comprises different ip addresses;
the response signal carries signal strength information of the navigation message, and the signal strength information is associated with an ip address extracted from the navigation message by the cleaning robot each time.
3. The voice control method according to claim 2, wherein the directions of the coordinate axes of the navigation coordinate system in the current constructed map of the cleaning robot and the coordinate system in which the sub-target positions are located are kept the same, and when the cleaning robot turns, the coordinate system in which the sub-target positions are located is transformed along with the navigation coordinate system of the cleaning robot.
4. The voice control method according to claim 3, wherein the navigation message includes a wake-up command and a task command analyzed according to a primary voice command sent by a user;
wherein the wake-up instruction is used for controlling the cleaning robot to enter the working state from the standby state or to suspend execution of the current cleaning task, and then to extract the signal strength of the navigation message and the coordinates matched with the ip address of the navigation message;
and the task instruction is used for setting sub-target positions according to the signal intensity of the navigation message and the coordinates matched with the ip address of the navigation message, calculating the displacement vector and controlling the cleaning robot to advance along the working path corresponding to the displacement vector.
5. The voice control method according to claim 4, wherein after the displacement vector between the cleaning robot and the sub-target position is calculated, the turning movement of the cleaning robot and the coordinate transformation of the coordinate system in which the sub-target position is located are controlled according to the deviation angle, contained in the displacement vector, of the sub-target position relative to the current orientation of the cleaning robot;
after the cleaning robot turns, the coordinate system in which the sub-target position is located is transformed together with the navigation coordinate system of the cleaning robot, so that the navigation coordinate system in the real-time map constructed by the cleaning robot and the coordinate system in which the sub-target position is located keep the same coordinate axis directions.
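The deviation angle and frame transformation of claims 3 and 5 could be realised roughly as follows, under the assumption of a planar robot with a heading angle; none of the function names come from the patent.

```python
import math

def deviation_angle(displacement, heading_rad):
    """Angle of the sub-target relative to the robot's current orientation,
    wrapped to (-pi, pi]; this is the turn the robot should perform."""
    bearing = math.atan2(displacement[1], displacement[0])
    diff = bearing - heading_rad
    return math.atan2(math.sin(diff), math.cos(diff))

def rotate_point(point_xy, turn_rad):
    """Express a point of the sub-target frame in the robot's navigation frame
    after the robot has turned by `turn_rad`, so that both frames keep the
    same coordinate axis directions."""
    c, s = math.cos(-turn_rad), math.sin(-turn_rad)
    x, y = point_xy
    return (c * x - s * y, s * x + c * y)

if __name__ == "__main__":
    turn = deviation_angle((1.0, 1.0), heading_rad=0.0)   # 45 degrees to the left
    print(round(math.degrees(turn), 1), rotate_point((1.0, 1.0), turn))
```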
6. The voice control method according to claim 4, wherein when the cleaning robot encounters a wall or an obstacle while traveling along the working path corresponding to the displacement vector, the cleaning robot is controlled to follow the edge until the latest navigation message is received, and if the cleaning robot becomes stuck by other obstacles while following the edge of the obstacle, the user is prompted by voice broadcast.
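The fallback behaviour of claim 6 might look like the sketch below. `robot` is a hypothetical driver exposing poll_navigation_message(), edge_follow_step() and speak() hooks; the patent specifies only the behaviour, not this interface, and the stuck-detection rule is an assumption.

```python
def follow_edge_until_message(robot, stuck_limit: int = 20):
    """Follow the wall/obstacle edge until the latest navigation message
    arrives; if the robot stops making progress, prompt the user by voice."""
    stalled = 0
    while True:
        message = robot.poll_navigation_message()   # latest broadcast, or None
        if message is not None:
            return message                          # resume normal navigation
        moved = robot.edge_follow_step()            # one wall-following step
        stalled = 0 if moved else stalled + 1
        if stalled >= stuck_limit:                  # trapped by other obstacles
            robot.speak("I am stuck, please help.")  # voice prompt to the user
            stalled = 0
```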
7. A Bluetooth headset, characterized by comprising a first microphone, a command packaging module and a broadcasting module arranged beneath the headset shell; the first microphone is used to pick up the voice information uttered by a user and parse it into a voice command; the command packaging module is used to package the voice command parsed by the first microphone into a navigation message; the broadcasting module is used to broadcast, at time intervals, the navigation message packaged by the command packaging module for the voice command issued by the user;
the Bluetooth headset stores a database of voice commands and the time interval at which the navigation message is broadcast each time.
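The headset-side packaging and interval broadcast of claim 7 could be sketched as below. The command database, the interval value, the pre-configured IP list and the JSON payload are all illustrative assumptions; `send` merely stands in for the Bluetooth advertising call.

```python
import json
import time
from itertools import cycle

# Assumed headset-side configuration (not taken from the patent).
COMMAND_DATABASE = {"come here", "start cleaning", "pause cleaning"}
BROADCAST_INTERVAL_S = 1.0
PRECONFIGURED_IPS = cycle(["192.168.1.10", "192.168.1.11", "192.168.1.12"])

def package_command(voice_command: str) -> bytes:
    """Package a recognised voice command into a navigation message payload."""
    if voice_command not in COMMAND_DATABASE:
        raise ValueError(f"unrecognised voice command: {voice_command!r}")
    return json.dumps({
        "command": voice_command,
        "ip": next(PRECONFIGURED_IPS),   # IP embedded in this broadcast
        "sent_at": time.time(),
    }).encode()

def broadcast(send, voice_command: str, repeats: int = 3) -> None:
    """Broadcast the packaged navigation message once per time interval."""
    for _ in range(repeats):
        send(package_command(voice_command))
        time.sleep(BROADCAST_INTERVAL_S)

if __name__ == "__main__":
    broadcast(lambda payload: print(payload.decode()), "come here", repeats=2)
```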
8. The bluetooth headset of claim 7, further comprising a second microphone positioned closer to the user's ear than the first microphone for generating sound waves that cancel noise picked up by the first microphone.
9. The Bluetooth headset of claim 7, wherein the IP addresses contained in the navigation messages continuously sent by the Bluetooth headset and the IP address of the Bluetooth headset belong to the same LAN segment, and these IP addresses are all pre-configured within that segment and each matched with a position coordinate in the effective broadcast area of the Bluetooth headset;
each time an external cleaning robot receives a navigation message, it sends a response signal to the Bluetooth headset, so that the Bluetooth headset either adjusts the IP address of the navigation message broadcast after the next time interval, or keeps broadcasting the same navigation message after the next time interval while embedding a different IP address in it;
the response signal carries signal strength information of the navigation message, and the signal strength information is associated with the IP address extracted from that navigation message by the cleaning robot.
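On the headset side, the acknowledgement of claim 9 could drive the choice of the next advertised IP as sketched here. The selection rule (advertise every pre-configured IP once, then repeat the strongest) is illustrative only; the claim merely requires that the embedded IP can change after each response.

```python
from typing import Dict, List, Optional

# Pre-configured IPs of the headset's LAN segment; the values are illustrative.
PRECONFIGURED_IPS: List[str] = ["192.168.1.10", "192.168.1.11", "192.168.1.12"]

def choose_next_ip(rssi_by_ip: Dict[str, float]) -> Optional[str]:
    """Pick the IP to embed in the next broadcast: first advertise every
    pre-configured IP once, then repeat the one with the strongest report."""
    for ip in PRECONFIGURED_IPS:
        if ip not in rssi_by_ip:
            return ip
    return max(rssi_by_ip, key=rssi_by_ip.get)

def on_acknowledgement(ack: dict, rssi_by_ip: Dict[str, float]) -> Optional[str]:
    """Associate the reported signal strength with the IP the robot extracted,
    then decide which IP the next navigation message should carry."""
    rssi_by_ip[ack["ip"]] = ack["rssi_dbm"]
    return choose_next_ip(rssi_by_ip)

if __name__ == "__main__":
    history: Dict[str, float] = {}
    print(on_acknowledgement({"ip": "192.168.1.10", "rssi_dbm": -70.0}, history))
    print(on_acknowledgement({"ip": "192.168.1.11", "rssi_dbm": -55.0}, history))
```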
10. A cleaning robot based on Bluetooth interconnection, characterized by comprising a message receiving module, a matching and positioning module, and a navigation module; the message receiving module is used to control the cleaning robot to search for the IP address of the Bluetooth headset of any one of claims 7 to 9, and then to control the cleaning robot to receive, at time intervals, the navigation messages broadcast by the broadcasting module, wherein each navigation message is the result of the Bluetooth headset parsing and packaging a voice command issued by a user;
the matching and positioning module is used to set a sub-target position and calculate a displacement vector between the cleaning robot and the sub-target position, according to the signal strength of the navigation message and the coordinate matched with the IP address of the navigation message, each time the message receiving module receives a navigation message;
and the navigation module is used to control the cleaning robot to advance along the working path corresponding to the displacement vector calculated by the matching and positioning module, and to control the cleaning robot to execute a planned cleaning operation after determining, according to the signal strength of the navigation message and the coordinate matched with the IP address of the navigation message, that the cleaning robot has reached the user's current position.
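Claim 10's split into three modules could be mirrored by a structure like the one below; this is a sketch only, and the class names, the RSSI arrival threshold and the return values are assumptions rather than anything defined by the patent.

```python
from dataclasses import dataclass
from typing import Dict, Tuple

Coordinate = Tuple[float, float]

@dataclass
class MatchingPositioningModule:
    """Matches the IP in a received message to a coordinate and derives the
    sub-target position and the displacement vector."""
    ip_to_coord: Dict[str, Coordinate]

    def sub_target_and_vector(self, robot_xy: Coordinate, ip: str):
        target = self.ip_to_coord[ip]
        return target, (target[0] - robot_xy[0], target[1] - robot_xy[1])

@dataclass
class NavigationModule:
    """Drives the robot along the working path and decides, from the signal
    strength, whether the user's current position has been reached."""
    arrival_rssi_dbm: float = -45.0

    def arrived(self, rssi_dbm: float) -> bool:
        return rssi_dbm >= self.arrival_rssi_dbm

@dataclass
class MessageReceivingModule:
    """Hands each received navigation message to the other two modules."""
    positioning: MatchingPositioningModule
    navigation: NavigationModule

    def on_message(self, robot_xy: Coordinate, ip: str, rssi_dbm: float):
        target, vector = self.positioning.sub_target_and_vector(robot_xy, ip)
        return vector, self.navigation.arrived(rssi_dbm)

if __name__ == "__main__":
    receiver = MessageReceivingModule(
        MatchingPositioningModule({"192.168.1.11": (0.5, 0.5)}),
        NavigationModule())
    print(receiver.on_message((2.0, 1.0), "192.168.1.11", rssi_dbm=-60.0))
```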
11. The cleaning robot of claim 10, wherein after the displacement vector between the cleaning robot and the sub-target position is calculated, the navigation module is configured to control the turning movement of the cleaning robot and the coordinate transformation of the coordinate system in which the sub-target position is located according to the deviation angle, contained in the displacement vector, of the sub-target position relative to the current orientation of the cleaning robot;
after the cleaning robot turns, the coordinate system in which the sub-target position is located is transformed together with the navigation coordinate system of the cleaning robot, so that the navigation coordinate system in the real-time map constructed by the cleaning robot and the coordinate system in which the sub-target position is located keep the same coordinate axis directions.
12. The cleaning robot of claim 10, wherein the navigation message comprises a wake-up instruction and a task instruction parsed from a voice command issued by the user;
the wake-up instruction is used to control the cleaning robot to enter a working state from a standby state, or to suspend execution of the current cleaning task, and then to extract the signal strength of the navigation message and the coordinate matched with the IP address of the navigation message;
the task instruction is used to set the sub-target position according to the signal strength of the navigation message and the coordinate matched with the IP address of the navigation message, calculate the displacement vector, and then control the cleaning robot to advance along the working path corresponding to the displacement vector;
each time the cleaning robot receives a navigation message, it sends a response signal to the Bluetooth headset, so that the Bluetooth headset either adjusts the IP address of the navigation message broadcast after the next time interval, or keeps broadcasting the same navigation message after the next time interval while embedding a different IP address in it;
the response signal carries signal strength information of the navigation message, and the signal strength information is associated with the IP address extracted from that navigation message by the cleaning robot.
13. A control system based on Bluetooth interconnection with a cleaning robot, characterized by comprising the Bluetooth headset of any one of claims 7 to 9 and the cleaning robot of any one of claims 10 to 12.
CN201911137367.7A 2019-11-19 2019-11-19 Voice control method, Bluetooth headset, cleaning robot and control system Pending CN112890681A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911137367.7A CN112890681A (en) 2019-11-19 2019-11-19 Voice control method, Bluetooth headset, cleaning robot and control system

Publications (1)

Publication Number Publication Date
CN112890681A (en) 2021-06-04

Family ID: 76103799

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911137367.7A Pending CN112890681A (en) 2019-11-19 2019-11-19 Voice control method, Bluetooth headset, cleaning robot and control system

Country Status (1)

Country Link
CN (1) CN112890681A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104535074A (en) * 2014-12-05 2015-04-22 惠州Tcl移动通信有限公司 Bluetooth earphone-based voice navigation method, system and terminal
CN108231069A (en) * 2017-08-30 2018-06-29 深圳乐动机器人有限公司 Sound control method, Cloud Server, clean robot and its storage medium of clean robot
CN108742332A (en) * 2018-05-30 2018-11-06 上海与德通讯技术有限公司 A kind of cleaning method and robot
CN108814449A (en) * 2018-07-30 2018-11-16 马鞍山问鼎网络科技有限公司 A kind of artificial intelligence sweeping robot control method based on phonetic order
CN108968828A (en) * 2018-09-07 2018-12-11 马鞍山问鼎网络科技有限公司 A kind of artificial intelligence sweeping robot control system
CN109272998A (en) * 2018-09-07 2019-01-25 马鞍山问鼎网络科技有限公司 A kind of artificial intelligent voice detection and control method

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113796778A (en) * 2021-08-03 2021-12-17 上海高仙自动化科技发展有限公司 Remote operation and maintenance method, device, system, robot, chip and storage medium
CN113796778B (en) * 2021-08-03 2022-12-20 上海高仙自动化科技发展有限公司 Remote operation and maintenance method, device, system, robot, chip and storage medium
CN114391780A (en) * 2022-01-19 2022-04-26 深圳市无限动力发展有限公司 Sweeping control method and device of sweeper, computer equipment and storage medium
CN114391780B (en) * 2022-01-19 2023-03-21 深圳市无限动力发展有限公司 Sweeping control method and device of sweeper, computer equipment and storage medium
CN115019799A (en) * 2022-08-04 2022-09-06 广东工业大学 Man-machine interaction method and system based on long voice recognition

Similar Documents

Publication Publication Date Title
CN112890681A (en) Voice control method, Bluetooth headset, cleaning robot and control system
EP3460614B1 (en) Combined robot and cruising path generation method therefor
CN106909156B (en) Air purification method and device
JP2019525780A (en) Method, apparatus and readable storage medium for performing cleaning operation of cleaning device
WO2018053942A1 (en) Mobile robot and navigation method therefor
JP4839487B2 (en) Robot and task execution system
CN108628314B (en) Multi-machine cooperation lawn trimming robot system and method
EP3549430B1 (en) Emotion improvement device and emotion improvement method
US20070219667A1 (en) Home network system and method for an autonomous mobile robot to travel shortest path
EP1406139A3 (en) Portable programming terminal for robots or similar automatic apparatuses
US20160291596A1 (en) System and method for establishing virtual boundaries for robotic devices
JP2012137909A (en) Movable body remote control system and control program for the same
JP2010231470A (en) Information providing system
JP4608472B2 (en) Mobile robot and mobile robot controller
JP2012139798A (en) Mobile robot, learning system for the same, and method of learning action of the same
KR100696133B1 (en) Robot system for a cleaning
CN106527439A (en) Motion control method and apparatus
JP2011000656A (en) Guide robot
CN110677811B (en) Ad hoc network method of multiple mobile robots and method for determining respective working areas
JP5732633B2 (en) Communication robot
JP2006201829A (en) Self-propelled device and program therefor
JP4580226B2 (en) Robot control system
CN113359716B (en) Remote control method and device of Internet of things equipment, robot and storage medium
CN114527764A (en) Path walking method, system and terminal equipment
CN114243868A (en) Robot charging system and recharging control method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 519000 2706, No. 3000, Huandao East Road, Hengqin new area, Zhuhai, Guangdong

Applicant after: Zhuhai Yiwei Semiconductor Co.,Ltd.

Address before: 519000 room 105-514, No.6 Baohua Road, Hengqin New District, Zhongshan City, Guangdong Province

Applicant before: AMICRO SEMICONDUCTOR Co.,Ltd.

RJ01 Rejection of invention patent application after publication

Application publication date: 20210604