CN111390918A - Active control system of household intelligent robot - Google Patents
- Publication number
- CN111390918A CN111390918A CN202010389089.0A CN202010389089A CN111390918A CN 111390918 A CN111390918 A CN 111390918A CN 202010389089 A CN202010389089 A CN 202010389089A CN 111390918 A CN111390918 A CN 111390918A
- Authority
- CN
- China
- Prior art keywords
- module
- image
- intelligent robot
- indoor
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
- B25J9/1666—Avoiding collision or forbidden zones
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
Abstract
The invention discloses an active control system for a household intelligent robot, comprising a command input module, a command recognition module, a sensor module, a video acquisition module, an ultrasonic positioning module, a control module, a motion module, an information transmission module and a storage module. Because the distance and range of ultrasonic detection are far greater than the range of video acquisition, the system first probes the environment around the intelligent robot with the ultrasonic positioning module, compares the readings with the positions of obstacles in an indoor plane image acquired in advance, finds the positions of moved obstacles in time, and obtains real-time indoor environment information.
Description
Technical Field
The invention belongs to the technical field of intelligent robots, and particularly relates to an active control system of a household intelligent robot.
Background
The rapid development of the internet drives social progress and rising living standards, and household intelligent robots, including educational robots, floor-sweeping robots and carrying robots, are ever more widely used. However, because urban homes are small, every such robot faces the same problem in actual use: the home environment is complex and changeable, and the positions of obstacles change frequently. In the prior art, a robot that localizes itself by ultrasonic positioning or similar techniques must adjust its direction of travel in real time according to the obstacle positions; but because the intelligent robot cannot understand the changed obstacle distribution, it easily loses its bearings and is difficult to localize, may follow meaningless travel routes for long periods, and its working fluency is reduced. To solve the above problems, the present invention provides the following technical solutions.
Disclosure of Invention
The invention aims to provide an active control system of a household intelligent robot.
The technical problems to be solved by the invention are as follows:
the home environment is complex and changeable, and the positions of obstacles change frequently; a prior-art robot that localizes itself by ultrasonic positioning or similar techniques must adjust its direction of travel in real time according to the obstacle positions, but because the robot does not grasp the changed obstacle distribution, it easily loses its bearings and becomes difficult to localize, may follow meaningless travel routes for long periods, and its working fluency is reduced.
The purpose of the invention can be realized by the following technical scheme:
an active control system of a household robot comprises a command input module, a command recognition module, a sensor module, a video acquisition module, an ultrasonic positioning module, a control module, a motion module, an information transmission module and a storage module;
the instruction input module is used for collecting voice instructions, converting the collected voice instructions into electric signals and transmitting the electric signals to the instruction identification module;
the voice command recognition module is used for receiving the voice command information transmitted by the command input module, analyzing the voice command information to obtain corresponding voiceprint characteristic information, and comparing the obtained voiceprint characteristic information with the voiceprint characteristic information in the storage module, wherein the voice command information and the voiceprint characteristic information in the storage module are recorded and stored through the command input module and the command recognition module;
the sensor module comprises a sensor and an alarm device, wherein the sensor comprises a smoke sensor, a temperature sensor, a humidity sensor and a gas sensor, the smoke sensor is used for detecting indoor smoke concentration, the temperature sensor is used for detecting indoor temperature, the humidity sensor is used for detecting indoor humidity, the gas sensor is used for detecting indoor gas concentration, and when a detection value of the sensor reaches a preset threshold value, the alarm device gives an alarm;
the video acquisition module is used for acquiring video image information of the surrounding environment of the intelligent robot, transmitting the acquired video image information to the control module, transmitting the acquired video image information to the storage module for storage, and transmitting the video image information to the terminal equipment through the information transmission module;
the ultrasonic positioning module is used for acquiring object position information of the environment around the intelligent robot and transmitting the acquired object position information to the control module;
the control module is used for analyzing the video image information uploaded by the video acquisition module and the object position information uploaded by the ultrasonic positioning module and controlling the motion module to move.
As a further aspect of the present invention, a method for controlling a motion module to drive an intelligent robot to move by a control module includes:
s1, the video acquisition module acquires a plurality of indoor images in the motion process of the intelligent robot, a polygonal area is taken from one image as a reference point, the reference point is searched on other images, and the images are spliced to form an indoor plane image through the superposition of the reference points;
s2, marking the position of the obstacle through the indoor plane image, reading the position of the obstacle around the intelligent robot through the ultrasonic positioning module at intervals of preset time T in the working process of the intelligent robot, simultaneously acquiring the plane image of the position where the intelligent robot is located through the video acquisition module, comparing the acquired plane image with the indoor plane image formed by splicing, confirming the position where the intelligent robot is located, comparing the distribution of the obstacle around the ultrasonic reading with the distribution of the position where the obstacle is marked on the indoor plane image, finding the change position, driving the intelligent robot to go to the corresponding position by the motion module to acquire the plane image of the change position of the obstacle, and replacing the image of the corresponding area in the indoor plane image;
and S3, destination information is input into the control module through the instruction input module, the control module designs a feasible shortest path according to the position of the intelligent robot, the destination position and the condition of the obstacles between the intelligent robot and the destination position, and the intelligent robot is driven to move according to the designed path through the movement module.
As a further scheme of the invention, the indoor plane image acquired by the video acquisition module is an image shot from top to bottom; and when the video acquisition module is used for acquiring indoor images, the height and the angle of video acquisition are fixed.
As a further aspect of the present invention, in step S1, before the images are stitched into the indoor plane image, the image of the intelligent robot itself is peeled off from each picture; the area within n mm of the edge of the resulting blank region is taken as a study region and its color distribution is read. When the study region contains a single color, the blank region is filled with that color; when it contains two colors, the two colors are joined at their two junctions in the study region, and the blank region is divided into two parts that are filled with the corresponding colors.
As a further scheme of the present invention, the plane image in step S2 is likewise captured from top to bottom by the video acquisition module, with the acquisition height and angle fixed and equal to those used for the indoor plane image in step S1.
The invention has the beneficial effects that:
because the distance and range of ultrasonic detection are far greater than the range of video acquisition, the invention first probes the environment around the intelligent robot with the ultrasonic positioning module, then compares the readings with the positions of the obstacles in the indoor plane image acquired in advance, finds the positions of moved obstacles in time, and obtains real-time indoor environment information.
Drawings
The invention is described in further detail below with reference to the figures and specific embodiments.
Fig. 1 is a schematic system structure diagram of an active control system of a home robot according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
An active control system of a household robot is shown in figure 1 and comprises a command input module, a command recognition module, a sensor module, a video acquisition module, an ultrasonic positioning module, a control module, a motion module, an information transmission module and a storage module;
the instruction input module is used for collecting voice instructions, converting the collected voice instructions into electric signals and transmitting the electric signals to the instruction identification module;
the voice command recognition module is used for receiving the voice command information transmitted by the command input module, analyzing the voice command information to obtain corresponding voiceprint characteristic information, and comparing the obtained voiceprint characteristic information with the voiceprint characteristic information in the storage module, wherein the voice command information and the voiceprint characteristic information in the storage module are recorded and stored through the command input module and the command recognition module;
therefore, the situation that people except family members control the action of the robot can be avoided, and the safety of the household intelligent robot is improved;
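As a rough illustration of the comparison step, assuming voiceprints are stored as fixed-length feature vectors (the patent does not specify a representation, and the similarity threshold here is a hypothetical value), the match could be sketched as:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length, non-zero feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def is_family_member(features, stored_voiceprints, threshold=0.9):
    """Accept the command only if the extracted voiceprint features match
    at least one voiceprint previously recorded in the storage module."""
    return any(cosine_similarity(features, vp) >= threshold
               for vp in stored_voiceprints)
```

In practice a robot would extract such vectors from audio (e.g. spectral embeddings), but the thresholded comparison against enrolled family voiceprints would follow this shape.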
the sensor module comprises a sensor and an alarm device, wherein the sensor comprises a smoke sensor, a temperature sensor, a humidity sensor and a gas sensor, the smoke sensor is used for detecting indoor smoke concentration, the temperature sensor is used for detecting indoor temperature, the humidity sensor is used for detecting indoor humidity, the gas sensor is used for detecting indoor gas concentration, and when a detection value of the sensor reaches a preset threshold value, the alarm device gives an alarm;
the video acquisition module is used for acquiring video image information of the surrounding environment of the intelligent robot, transmitting the acquired video image information to the control module, transmitting the acquired video image information to the storage module for storage, and transmitting the video image information to the terminal equipment through the information transmission module;
the ultrasonic positioning module is used for acquiring object position information of the environment around the intelligent robot and transmitting the acquired object position information to the control module;
the control module is used for analyzing the video image information uploaded by the video acquisition module and the object position information uploaded by the ultrasonic positioning module and controlling the motion module to move;
the method for controlling the motion module to drive the intelligent robot to move by the control module comprises the following steps:
s1, the video acquisition module acquires a plurality of indoor images in the motion process of the intelligent robot, a polygonal area is taken from one image as a reference point, the reference point is searched on other images, and the images are spliced to form an indoor plane image through the superposition of the reference points;
the indoor plane image collected by the video collection module is an image shot from top to bottom; when the video acquisition module acquires an indoor image, the height and the angle of video acquisition are fixed; before a plurality of images are spliced to form an indoor plane image in step 1, the image of the intelligent robot on each image is peeled off, an area within n millimeters of the edge of the peeled blank area in the image is used as a research area, color distribution in the research area is read, when the color in the research area is a single color, the peeled blank area is filled with the color in the research area, when the color in the research area is two colors, the two colors are connected at two junctions in the research area, the peeled blank area in the image is divided into two parts, and the colors are correspondingly filled, and as the home decoration color is single and clear, the actual life needs can be met;
the method can improve the regularity of the indoor plane image, and the indoor plane image conforms to the actual indoor plane condition as much as possible;
s2, marking the position of the obstacle through the indoor plane image, reading the position of the obstacle around the intelligent robot through the ultrasonic positioning module at intervals of preset time T in the working process of the intelligent robot, simultaneously acquiring the plane image of the position where the intelligent robot is located through the video acquisition module, comparing the acquired plane image with the indoor plane image formed by splicing, confirming the position where the intelligent robot is located, comparing the distribution of the obstacle around the ultrasonic reading with the distribution of the position where the obstacle is marked on the indoor plane image, finding the change position, driving the intelligent robot to go to the corresponding position by the motion module to acquire the plane image of the change position of the obstacle, and replacing the image of the corresponding area in the indoor plane image;
the method for acquiring the planar image in the step S2 is that the video acquisition module captures images from top to bottom, and when the video acquisition module acquires an indoor image, the height and angle of the video acquisition are fixed, and both the height and angle are equal to those of the indoor planar image acquired in the step S1.
And S3, destination information is input into the control module through the instruction input module, the control module designs a feasible shortest path according to the position of the intelligent robot, the destination position and the condition of the obstacles between the intelligent robot and the destination position, and the intelligent robot is driven to move according to the designed path through the movement module.
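One common way to realize the feasible-shortest-path planning of step S3 is breadth-first search on an occupancy grid derived from the indoor plane image; the patent does not name a planning algorithm, so this is only an illustrative sketch:

```python
from collections import deque

def shortest_path(grid, start, goal):
    """Breadth-first search on an occupancy grid (1 = obstacle, 0 = free);
    returns the list of cells from start to goal, or None if unreachable."""
    h, w = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        if (r, c) == goal:
            # walk the predecessor chain back to the start
            path, node = [], goal
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and grid[nr][nc] == 0 \
                    and (nr, nc) not in prev:
                prev[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    return None
```

BFS guarantees a shortest path in steps on a uniform grid; a production system might prefer A* with a distance heuristic, but the planned cell sequence would serve the same role as the "designed path" followed by the motion module.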
Because the distance and range of ultrasonic detection are far greater than the range of video acquisition, the invention first probes the environment around the intelligent robot with the ultrasonic positioning module, then compares the readings with the positions of the obstacles in the indoor plane image acquired in advance, finds the positions of moved obstacles in time, and obtains real-time indoor environment information.
The foregoing is merely exemplary and illustrative of the present invention and various modifications, additions and substitutions may be made by those skilled in the art to the specific embodiments described without departing from the scope of the invention as defined in the following claims.
Claims (5)
1. An active control system of a household robot is characterized by comprising a command input module, a command recognition module, a sensor module, a video acquisition module, an ultrasonic positioning module, a control module, a motion module, an information transmission module and a storage module;
the instruction input module is used for collecting voice instructions, converting the collected voice instructions into electric signals and transmitting the electric signals to the instruction identification module;
the voice command recognition module is used for receiving the voice command information transmitted by the command input module, analyzing the voice command information to obtain corresponding voiceprint characteristic information, and comparing the obtained voiceprint characteristic information with the voiceprint characteristic information in the storage module, wherein the voice command information and the voiceprint characteristic information in the storage module are recorded and stored through the command input module and the command recognition module;
the sensor module comprises a sensor and an alarm device, wherein the sensor comprises a smoke sensor, a temperature sensor, a humidity sensor and a gas sensor, the smoke sensor is used for detecting indoor smoke concentration, the temperature sensor is used for detecting indoor temperature, the humidity sensor is used for detecting indoor humidity, the gas sensor is used for detecting indoor gas concentration, and when a detection value of the sensor reaches a preset threshold value, the alarm device gives an alarm;
the video acquisition module is used for acquiring video image information of the surrounding environment of the intelligent robot, transmitting the acquired video image information to the control module, transmitting the acquired video image information to the storage module for storage, and transmitting the video image information to the terminal equipment through the information transmission module;
the ultrasonic positioning module is used for acquiring object position information of the environment around the intelligent robot and transmitting the acquired object position information to the control module;
the control module is used for analyzing the video image information uploaded by the video acquisition module and the object position information uploaded by the ultrasonic positioning module and controlling the motion module to move.
2. The active control system for the household robot as claimed in claim 1, wherein the method for the control module to control the motion module to drive the intelligent robot to move is as follows:
s1, the video acquisition module acquires a plurality of indoor images in the motion process of the intelligent robot, a polygonal area is taken from one image as a reference point, the reference point is searched on other images, and the images are spliced to form an indoor plane image through the superposition of the reference points;
s2, marking the position of the obstacle through the indoor plane image, reading the position of the obstacle around the intelligent robot through the ultrasonic positioning module at intervals of preset time T in the working process of the intelligent robot, simultaneously acquiring the plane image of the position where the intelligent robot is located through the video acquisition module, comparing the acquired plane image with the indoor plane image formed by splicing, confirming the position where the intelligent robot is located, comparing the distribution of the obstacle around the ultrasonic reading with the distribution of the position where the obstacle is marked on the indoor plane image, finding the change position, driving the intelligent robot to go to the corresponding position by the motion module to acquire the plane image of the change position of the obstacle, and replacing the image of the corresponding area in the indoor plane image;
and S3, destination information is input into the control module through the instruction input module, the control module designs a feasible shortest path according to the position of the intelligent robot, the destination position and the condition of the obstacles between the intelligent robot and the destination position, and the intelligent robot is driven to move according to the designed path through the movement module.
3. The active control system of claim 2, wherein the indoor plane image captured by the video capture module is an image taken from top to bottom, and the height and angle of the video capture module are fixed when capturing the indoor image.
4. The active control system of a home robot as claimed in claim 2, wherein in step S1, before the images are combined to form the indoor plane image, the image of the smart robot on each image is peeled off, and the area within n mm of the edge of the peeled blank area in the image is used as the research area, the color distribution in the research area is read, when the color in the research area is a single color, the peeled blank area is filled with the color in the research area, when the color in the research area is two colors, the two colors are connected at two junctions in the research area, the peeled blank area in the image is divided into two parts and filled with the colors correspondingly.
5. The active control system of claim 2, wherein the planar image is captured from top to bottom by the video capture module in step S2, and the height and angle of the video capture module are fixed when capturing the indoor image, and both are equal to the height and angle of the indoor planar image captured in step S1.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010389089.0A CN111390918B (en) | 2020-05-09 | 2020-05-09 | Active control system of household intelligent robot |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111390918A true CN111390918A (en) | 2020-07-10 |
CN111390918B CN111390918B (en) | 2021-10-08 |
Family
ID=71426211
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010389089.0A Active CN111390918B (en) | 2020-05-09 | 2020-05-09 | Active control system of household intelligent robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111390918B (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102914303A (en) * | 2012-10-11 | 2013-02-06 | 江苏科技大学 | Navigation information acquisition method and intelligent space system with multiple mobile robots |
CN105334858A (en) * | 2015-11-26 | 2016-02-17 | 江苏美的清洁电器股份有限公司 | Floor sweeping robot and indoor map establishing method and device thereof |
CN107085422A (en) * | 2017-01-04 | 2017-08-22 | 北京航空航天大学 | A kind of tele-control system of the multi-functional Hexapod Robot based on Xtion equipment |
CN108335458A (en) * | 2018-03-05 | 2018-07-27 | 李孟星 | It is a kind of to see that the domestic intelligent of people sees guard system and its keeps an eye on method |
US10611028B1 (en) * | 2018-11-30 | 2020-04-07 | NextVPU (Shanghai) Co., Ltd. | Map building and positioning of robot |
- 2020-05-09: CN application CN202010389089.0A filed; granted as patent CN111390918B (status: Active)
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117226843A (en) * | 2023-09-27 | 2023-12-15 | 盐城工学院 | Robot movement track control method and system based on visual servo |
CN117226843B (en) * | 2023-09-27 | 2024-02-27 | 盐城工学院 | Robot movement track control method and system based on visual servo |
Also Published As
Publication number | Publication date |
---|---|
CN111390918B (en) | 2021-10-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Xiong et al. | An autonomous strawberry‐harvesting robot: Design, development, integration, and field evaluation | |
CN109514582B (en) | Pet teasing control device for robot and mobile robot | |
CN107139182B (en) | A kind of citrus picking robot system and its control method | |
AU2022201137B2 (en) | System and method for monitoring a property using drone beacons | |
CN100360204C (en) | Control system of intelligent perform robot based on multi-processor cooperation | |
CN112152129B (en) | Intelligent safety management and control method and system for transformer substation | |
CN112414457B (en) | Automatic intelligent inspection method based on transformer substation work | |
CN106020201A (en) | Mobile robot 3D navigation and positioning system and navigation and positioning method | |
CN102323817A (en) | Service robot control platform system and multimode intelligent interaction and intelligent behavior realizing method thereof | |
CN102902271A (en) | Binocular vision-based robot target identifying and gripping system and method | |
CN113128473A (en) | Underground comprehensive pipe gallery-oriented inspection system, method, equipment and storage medium | |
CN103064416A (en) | Indoor and outdoor autonomous navigation system for inspection robot | |
CN105578058A (en) | Shooting control method and device for intelligent robot and robot | |
Tian et al. | RGB-D based cognitive map building and navigation | |
CN111390918B (en) | Active control system of household intelligent robot | |
CN107234625A (en) | The method that visual servo is positioned and captured | |
CN111243120A (en) | Environment inspection system based on big data | |
WO2023024499A1 (en) | Robot control method, control apparatus, robot, and readable storage medium | |
CN106774324B (en) | Two cameras three-dimensional identification patrol robot | |
CN108748165A (en) | A kind of artificial intelligence robot of autonomous classification anticollision | |
CN114237225A (en) | Quadruped robot and intelligent inspection management platform thereof | |
CN107309883A (en) | Intelligent robot | |
CN112412242A (en) | Automatic door control and anti-pinch system based on binocular stereoscopic vision and method thereof | |
CN203070098U (en) | Indoor and outdoor autonomous navigation system for patrol robot | |
CN113084776B (en) | Intelligent epidemic prevention robot and system based on vision and multi-sensor fusion |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||