CN116919247A - Welt identification method, device, computer equipment and medium - Google Patents
Welt identification method, device, computer equipment and medium
- Publication number
- CN116919247A (application number CN202310724714.6A)
- Authority
- CN
- China
- Prior art keywords
- welt
- line laser
- distance
- data
- longitudinal
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L11/00—Machines for cleaning floors, carpets, furniture, walls, or wall coverings
- A47L11/28—Floor-scrubbing machines, motor-driven
- A47L11/282—Floor-scrubbing machines, motor-driven having rotary tools
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L11/00—Machines for cleaning floors, carpets, furniture, walls, or wall coverings
- A47L11/40—Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
- A47L11/4011—Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L11/00—Machines for cleaning floors, carpets, furniture, walls, or wall coverings
- A47L11/40—Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
- A47L11/4061—Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L2201/00—Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47L—DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
- A47L2201/00—Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
- A47L2201/04—Automatic control of the travelling movement; Automatic obstacle detection
Abstract
The application relates to the field of intelligent robots, in particular to a welt identification method, a welt identification device, computer equipment and a storage medium, which are applied to a commercial robot, wherein the method comprises the following steps: scanning the front area of the commercial robot according to a transverse line laser radar to obtain transverse scanning data; scanning a lateral area of the commercial robot according to a longitudinal line laser radar to obtain longitudinal scanning data; determining whether to execute obstacle avoidance operation according to the transverse scanning data; if not, acquiring the welt distance of the commercial robot according to the longitudinal scanning data; and comparing the welt distance with a preset welt distance, and controlling the commercial robot to continue to keep the welt running when the welt distance is not greater than the preset welt distance. By the method, the cleaning coverage rate of the commercial cleaning robot is improved, and intelligent welt cleaning is realized.
Description
Technical Field
The present application relates to the field of intelligent robots, and in particular to a welt (edge-following) identification method and apparatus, a computer device, and a storage medium.
Background
At present, most technical schemes that enable corner cleaning in commercial robots rely on a side brush extending outward from the machine body: the side brush sweeps the dust and debris in the corner toward the cleaning assembly in the machine chassis, thereby achieving corner cleaning and increasing cleaning coverage. However, during cleaning the outward-extending side brush assembly is prone to collisions; moreover, a commercial robot is large and heavy, and using an outward-extending side brush assembly greatly reduces the cleaning coverage. How to improve the cleaning coverage of a commercial cleaning robot and achieve welt (edge-following) cleaning more intelligently is therefore a problem that needs to be solved.
Disclosure of Invention
The main purpose of the present application is to provide a welt identification method and device, a computer device, and a storage medium, with the aim of improving the cleaning coverage of a commercial cleaning robot and achieving welt cleaning more intelligently.
In order to achieve the above object, the present application provides a welt identification method applied to a commercial robot, the method comprising:
scanning the front area of the commercial robot according to a transverse line laser radar to obtain transverse scanning data;
Scanning a lateral area of the commercial robot according to a longitudinal line laser radar to obtain longitudinal scanning data;
determining whether to execute obstacle avoidance operation according to the transverse scanning data;
if not, acquiring the welt distance of the commercial robot according to the longitudinal scanning data;
and comparing the welt distance with a preset welt distance, and controlling the commercial robot to continue to keep the welt running when the welt distance is not greater than the preset welt distance.
Further, the scanning the front area of the commercial robot according to the transverse line laser radar to obtain transverse scanning data includes:
transmitting first transverse line laser to a first triangular area in front based on a first transverse line laser radar to obtain first triangular line laser scanning data;
transmitting second transverse line laser to a second triangular region on the basis of a second transverse line laser radar to obtain second triangular line laser scanning data, wherein an intersection region exists between the first triangular region and the second triangular region;
and obtaining transverse scanning data according to the first triangular line laser scanning data and the second triangular line laser scanning data.
Further, the scanning the lateral area of the commercial robot according to the longitudinal line laser radar to obtain longitudinal scanning data includes:
transmitting third longitudinal line laser to the left side area at a first angle according to the third longitudinal line laser radar to obtain third longitudinal line laser data;
transmitting fourth longitudinal line laser to the right side area at a second angle according to the fourth longitudinal line laser radar to obtain fourth longitudinal line laser data;
and obtaining longitudinal scanning data according to the third longitudinal line laser data and the fourth longitudinal line laser data.
Further, if not, acquiring the welt distance of the commercial robot according to the longitudinal scan data includes:
acquiring a welt path of the commercial robot;
judging whether the direction of the welt path is anticlockwise or not;
when the welt path direction is anticlockwise, obtaining the welt distance of the commercial robot according to fourth longitudinal line laser data;
and when the welt path direction is clockwise, obtaining the welt distance of the commercial robot according to the third longitudinal line laser data.
Further, when the welt path direction is counterclockwise, obtaining the welt distance of the commercial robot according to a fourth longitudinal line laser includes:
Acquiring line laser point cloud data according to the fourth longitudinal line laser;
performing straight line fitting on the line laser point cloud data to obtain a plurality of fitted straight line segments;
and respectively acquiring first distances between the commercial robot and the fitting straight line segments, selecting the smallest first distance among all the first distances, and taking the smallest first distance as the welting distance.
Further, the controlling the commercial robot to continue the welt running further includes:
acquiring the running environment information of the commercial robot in real time according to a plane laser radar or a camera module;
acquiring side area information and front area information of the commercial robot according to the environment information;
and sending corresponding control instructions according to the side area information and the front area information to control the commercial robot to perform welt running according to a preset movement mode.
Further, the real-time acquisition of the robot driving environment information according to the planar laser radar or the camera module includes:
controlling the commercial robot to emit plane laser radar waves to obtain plane laser point cloud data;
performing data preprocessing on the point cloud data, wherein the preprocessing comprises noise filtering and abnormal value filtering;
Performing least square fitting on the preprocessed point cloud data to obtain a linear relation of the point cloud data;
and performing dimension reduction projection according to the linear relation of the point cloud data to obtain environment information.
A welt identification device, the device comprising:
the transverse scanning data module is used for scanning the front area of the commercial robot according to the transverse line laser radar to obtain transverse scanning data;
the longitudinal scanning data module is used for scanning the side area of the commercial robot according to the longitudinal line laser radar to obtain longitudinal scanning data;
the obstacle avoidance module is used for determining whether to execute an obstacle avoidance operation according to the transverse scanning data;
the welt distance module is used for acquiring the welt distance of the commercial robot according to the longitudinal scanning data if the welt distance module is not executed;
and the welt running module is used for comparing the minimum welt distance with a preset welt distance, and controlling the commercial robot to continue to keep the welt running when the welt distance is not larger than the preset welt distance.
The application also provides a computer device comprising a memory and a processor, the memory storing a computer program, the processor implementing the steps of any of the above-mentioned welt identification methods when executing the computer program.
The present application also provides a computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of any of the above-described welt identification methods.
The present application relates to the field of intelligent robots, and in particular to a welt identification method and device, a computer device, and a storage medium, applied to a commercial robot. The method comprises: scanning the front area of the commercial robot with a transverse line laser radar to obtain transverse scanning data; scanning a lateral area of the commercial robot with a longitudinal line laser radar to obtain longitudinal scanning data; detecting, from the transverse line laser radar data, whether an obstacle exists in the area around the position of the commercial robot, and if so, triggering an obstacle avoidance strategy; if not, acquiring the welt distance of the commercial robot from the longitudinal scanning data, i.e., obtaining a longitudinal fit of the side obstacle from the longitudinal scanning data and thereby the welt distance between the commercial robot and the side obstacle; and comparing the welt distance with a preset welt distance and, when the welt distance is not greater than the preset welt distance, controlling the commercial robot to continue to keep the welt running, so that the welt distance of the commercial robot is controlled. By the above method, the cleaning coverage of the commercial cleaning robot is improved and intelligent welt cleaning is realized.
Drawings
FIG. 1 is a flowchart of a welt recognition method according to an embodiment of the present application;
FIG. 2 is a flowchart illustrating a welt recognition method according to an embodiment of the present application;
FIG. 3 is a flowchart illustrating a welt recognition method according to an embodiment of the present application;
FIG. 4 is a flowchart illustrating a welt identification method according to an embodiment of the present application;
FIG. 5 is a flowchart illustrating a welt identification method according to an embodiment of the present application;
FIG. 6 is a flowchart illustrating a welt identification method according to an embodiment of the present application;
FIG. 7 is a flowchart illustrating a welt identification method according to an embodiment of the present application;
FIG. 8 is a flowchart illustrating a welt identification method according to an embodiment of the present application;
FIG. 9 is a schematic diagram illustrating the structure of a welt identification device according to an embodiment of the present application;
FIG. 10 is a block diagram schematically illustrating the structure of a computer device according to an embodiment of the present application.
The achievement of the objects, functional features and advantages of the present application will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
Referring to fig. 1, an embodiment of the present application provides a welt identification method, including steps S10 to S50, and the following details of each step of the welt identification method are set forth below.
S10, scanning a front area of the commercial robot according to a transverse line laser radar to obtain transverse scanning data;
In this embodiment, scanning the front area of the commercial robot with the transverse line laser radar to obtain transverse scanning data includes: emitting first transverse line laser into a first triangular area in front based on a first transverse line laser radar to obtain first triangular line laser scanning data; emitting second transverse line laser into a second triangular area based on a second transverse line laser radar to obtain second triangular line laser scanning data, wherein an intersection area exists between the first triangular area and the second triangular area; and obtaining the transverse scanning data from the first and second triangular line laser scanning data. Specifically, in an embodiment, a first transverse line laser radar and a second transverse line laser radar are installed at the front of the body of the commercial robot. While driving, the robot scans the area obliquely in front of it and the area obliquely downward toward the ground with these two radars; because the two scanning regions intersect when scanning obliquely forward, scanning data for the area in front of the robot are obtained without missed coverage directly ahead or to the oblique left and right, and the transverse scanning data are obtained in this way. Because line laser is also emitted toward the ground, obstacles lower than the installation position of the transverse line laser radars on the device can be detected accurately.
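For illustration only (not part of the disclosed embodiment), the following is a minimal Python sketch of combining the two overlapping transverse scans into a single set of front-area scan data; the point format, robot-frame coordinates and deduplication radius are assumptions.

```python
import math

def merge_transverse_scans(first_scan, second_scan, dedup_radius=0.02):
    """Combine (x, y) points from the two front line lasers in the robot frame.

    Points inside the intersection region are reported by both sensors;
    near-duplicates are collapsed so the same return is not counted twice.
    """
    merged = list(first_scan)
    for p in second_scan:
        if all(math.hypot(p[0] - q[0], p[1] - q[1]) > dedup_radius for q in merged):
            merged.append(p)
    return merged

# Example: two overlapping triangular scan regions in front of the robot.
first = [(0.5, -0.1), (0.6, 0.0), (0.7, 0.1)]
second = [(0.6, 0.0), (0.7, 0.2), (0.8, 0.3)]  # (0.6, 0.0) lies in the overlap
print(merge_transverse_scans(first, second))   # the shared point appears only once
```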
S20, scanning a lateral area of the commercial robot according to a longitudinal line laser radar to obtain longitudinal scanning data;
In this embodiment, scanning the lateral area of the commercial robot with the longitudinal line laser radar to obtain longitudinal scanning data includes: emitting third longitudinal line laser toward the left-side area at a first angle according to the third longitudinal line laser radar to obtain third longitudinal line laser data; emitting fourth longitudinal line laser toward the right-side area at a second angle according to the fourth longitudinal line laser radar to obtain fourth longitudinal line laser data; and obtaining the longitudinal scanning data from the third and fourth longitudinal line laser data. Specifically, in one embodiment, a solid-state line laser sensor providing four laser lines is installed on the robot at a height relatively close to the ground. The first and second line laser radars are inclined toward the ground by a certain angle and perform transverse detection of low obstacles on the ground; for example, when an obstacle is detected within the detection range, an obstacle avoidance strategy is triggered. The third and fourth line laser radars perform longitudinal detection toward the ground and the side of the machine body, which allows a longitudinal fit of the side obstacles. The angles of the transverse detection and the longitudinal detection can be set as required, and the longitudinal scanning data are obtained in this way.
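As a hedged illustration of the sensor layout described above, the sketch below records one possible configuration of the four laser lines; the transverse tilt angles are assumptions, and only the 45-degree longitudinal tilt is taken from a later embodiment.

```python
from dataclasses import dataclass

@dataclass
class LineLaser:
    name: str
    orientation: str  # "transverse" (front, tilted toward the ground) or "longitudinal" (side)
    side: str         # mounting side on the machine body
    tilt_deg: float   # downward tilt from horizontal (assumed / configurable)

SENSOR_LAYOUT = [
    LineLaser("first_transverse",    "transverse",   "front-left",  20.0),  # tilt assumed
    LineLaser("second_transverse",   "transverse",   "front-right", 20.0),  # tilt assumed
    LineLaser("third_longitudinal",  "longitudinal", "left",        45.0),  # 45 deg per a later embodiment
    LineLaser("fourth_longitudinal", "longitudinal", "right",       45.0),
]

for s in SENSOR_LAYOUT:
    print(f"{s.name}: {s.orientation} scan on {s.side}, tilt {s.tilt_deg} deg")
```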
S30, determining whether to execute obstacle avoidance operation according to the transverse scanning data;
In this embodiment, determining whether to perform the obstacle avoidance operation according to the transverse scan data includes: determining whether an obstacle exists according to the transverse scan data; if so, performing the obstacle avoidance operation; if not, acquiring the welt distance of the commercial robot according to the longitudinal scan data. Specifically, in an embodiment, the first-direction scan data are the robot's transverse detection data and the second-direction scan data are the robot's longitudinal detection data. The second-direction scan data are fitted with the RANSAC algorithm to obtain fitted data, i.e., fits of obstacles at different distances from the commercial robot; the distances between the commercial robot and these obstacles are then evaluated from the fitted data to obtain the welt distance between the commercial robot and the nearest obstacle. The basic assumption of RANSAC is that the sample contains both correct data (inliers that can be described by a model) and abnormal data (outliers that deviate far from the normal range and cannot be fitted by the mathematical model), i.e., the data set contains noise; such anomalous data may arise from erroneous measurements, erroneous assumptions, erroneous calculations, and so on. In this way, the state of the environment around the terminal is determined.
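A minimal RANSAC line-fitting sketch, included only to illustrate the idea of separating inlier wall points from noisy or abnormal returns before estimating the welt distance; the iteration count, inlier tolerance and example points are assumptions, not values from the application.

```python
import math
import random

def ransac_line(points, iters=200, inlier_tol=0.02):
    """Fit a 2D line a*x + b*y + c = 0 (with a^2 + b^2 = 1) by RANSAC."""
    best_line, best_inliers = None, []
    for _ in range(iters):
        (x1, y1), (x2, y2) = random.sample(points, 2)
        a, b = y1 - y2, x2 - x1
        norm = math.hypot(a, b)
        if norm < 1e-9:
            continue                      # degenerate sample, try again
        a, b = a / norm, b / norm
        c = -(a * x1 + b * y1)
        inliers = [p for p in points if abs(a * p[0] + b * p[1] + c) < inlier_tol]
        if len(inliers) > len(best_inliers):
            best_line, best_inliers = (a, b, c), inliers
    return best_line, best_inliers

# Noisy "wall" roughly along x = 0.3 m in the robot frame, plus spurious returns.
wall = [(0.3 + random.gauss(0, 0.005), 0.1 * i) for i in range(20)]
spurious = [(0.9, 0.5), (1.2, 0.2)]
(a, b, c), inliers = ransac_line(wall + spurious)
print("inliers:", len(inliers), "robot-to-wall distance ~", round(abs(c), 3))
```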
S40, if not, acquiring the welt distance of the commercial robot according to the longitudinal scanning data;
In this embodiment, if the obstacle avoidance operation is not performed, the two solid-state line lasers installed at the sides of the machine body perform longitudinal detection toward the ground and the side of the machine body, and control of the welt distance is achieved through a longitudinal fit of the side obstacle. Before the step of acquiring the welt distance of the commercial robot according to the longitudinal scan data, the method includes: determining whether an obstacle exists according to the longitudinal scan data. The object is detected by the longitudinal line laser wave emitted by the robot; the echo reflected from the object is then received, and the distance between the detected object and the robot is calculated from the round-trip time of the laser. The main working process of range detection is as follows: the line laser radar first emits several laser beams; after hitting an obstacle, the beams are reflected back and received and processed by the laser receiving system, which records the time between emission and reception, i.e., the time of flight of the laser. From the time of flight, the distance to the obstacle can be calculated. Laser ranging methods can be divided into two types according to the form of the emitted laser signal: pulse ranging and phase ranging. (1) Pulse ranging: after the laser radar emits a laser beam, part of the laser is reflected back by the obstacle and received by the receiver of the laser radar; the time interval between transmission and reception is recorded in the laser radar, and the measured distance is calculated from the speed of light. (2) Phase ranging: the laser transmitter emits a continuous, intensity-modulated laser signal; after illuminating the obstacle, the signal is reflected back, and the measuring beam accumulates a phase change over the round trip. By calculating the phase difference between the laser signal inside the radar and the signal that has traveled to the obstacle and back, the distance to the obstacle is obtained. In the above manner, the welt distance of the commercial robot is obtained.
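A short worked illustration of the two ranging principles just described; only the formulas d = c·t/2 (pulse method) and d = (Δφ/2π)·(c/f)/2 (phase method) follow the text, and the numeric inputs are made up for the example.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def pulse_range(time_of_flight_s):
    """Pulse method: the beam travels to the obstacle and back, so halve the path."""
    return C * time_of_flight_s / 2.0

def phase_range(delta_phi_rad, mod_freq_hz):
    """Phase method: round-trip phase shift of an intensity-modulated beam."""
    wavelength = C / mod_freq_hz
    return (delta_phi_rad / (2.0 * math.pi)) * wavelength / 2.0

print(round(pulse_range(2.0e-9), 3))               # ~0.3 m for a 2 ns round trip (example value)
print(round(phase_range(0.4 * math.pi, 10e6), 2))  # ~3.0 m at 10 MHz modulation (example values)
```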
S50, comparing the welt distance with a preset welt distance, and controlling the commercial robot to continue to keep the welt running when the welt distance is not larger than the preset welt distance.
In this embodiment, controlling the commercial robot to continue to keep the welt running further includes: acquiring the driving environment information of the commercial robot in real time with a planar laser radar or a camera module; acquiring the side-area information and the front-area information of the commercial robot from the environment information; and sending corresponding control instructions according to the side-area information and the front-area information to control the commercial robot to perform welt driving in a preset movement mode. Specifically, in an embodiment, the welt distance of the commercial robot is obtained from the longitudinal scan data; in this embodiment the measured welt distance is 30 cm and the preset welt distance is 2 cm, and the two are compared. Here the commercial robot detects that the welt distance is far greater than the 2 cm preset, so the navigation terminal of the commercial robot controls the machine to perform welt navigation driving until the robot detects that the welt distance is less than or equal to the preset welt distance. When the welt distance is not greater than the preset welt distance, the commercial robot is controlled to continue to keep the welt running; otherwise, the robot plans a welt path with the planning terminal, issues the welt path to the navigation control terminal, and the navigation control terminal controls the machine to perform welt driving so that the welt distance is guided toward the preset distance. The robot is controlled to perform welt driving in the above manner.
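A minimal sketch of the decision in step S50, using the example values of this embodiment (measured welt distance 30 cm, preset welt distance 2 cm); the action strings are placeholders, since the navigation and planning terminals are not described at code level.

```python
def welt_control(welt_distance_m, preset_welt_distance_m=0.02):
    """Return the next action for the navigation terminal (placeholder actions)."""
    if welt_distance_m <= preset_welt_distance_m:
        return "keep-welt"        # already at the edge: keep edge-following travel
    return "navigate-to-welt"     # too far: plan a welt path and drive toward the edge

print(welt_control(0.30))   # 30 cm measured vs 2 cm preset -> "navigate-to-welt"
print(welt_control(0.015))  # within the preset distance -> "keep-welt"
```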
In one embodiment, the scanning the front area of the commercial robot according to the transverse line laser radar to obtain transverse scanning data includes:
transmitting first transverse line laser to a first triangular area in front based on a first transverse line laser radar to obtain first triangular line laser scanning data;
transmitting second transverse line laser to a second triangular region on the basis of a second transverse line laser radar to obtain second triangular line laser scanning data, wherein an intersection region exists between the first triangular region and the second triangular region;
and obtaining transverse scanning data according to the first triangular line laser scanning data and the second triangular line laser scanning data.
In this embodiment, two transverse line laser radars are installed on the two sides at the front of the body of the commercial robot, denoted the first transverse line laser radar and the second transverse line laser radar, and the transverse scanning data for the area in front of the robot are obtained from these two radars. Specifically, in an embodiment, a solid-state line laser sensor is installed on the robot inclined toward the ground at a certain angle, and transverse detection of low obstacles on the ground is performed with this solid-state line laser sensor.
In one embodiment, the scanning the lateral area of the commercial robot according to the longitudinal line laser radar to obtain longitudinal scanning data includes:
transmitting third longitudinal line laser to the left side area at a first angle according to the third longitudinal line laser radar to obtain third longitudinal line laser data;
transmitting fourth longitudinal line laser to the right side area at a second angle according to the fourth longitudinal line laser radar to obtain fourth longitudinal line laser data;
and obtaining longitudinal scanning data according to the third longitudinal line laser data and the fourth longitudinal line laser data.
In this embodiment, the areas on both sides of the commercial robot are scanned with the longitudinal line laser radars. The third longitudinal line laser radar emits the third longitudinal line laser toward the left-side area at a first angle to obtain the third longitudinal line laser data; the first angle can be adjusted in the system according to the user's needs. The third and fourth longitudinal line laser data each include data such as the distance between the commercial robot and an obstacle and map information. Specifically, in one embodiment, the commercial robot is located at a first position, and the third and fourth longitudinal line laser radars are installed on the sides of the body of the commercial robot, inclined toward the ground at an angle of 45 degrees. In another embodiment, the area obliquely ahead is also scanned while the commercial robot travels forward, so that side collisions during travel are prevented and the commercial robot is prevented from colliding with people.
In one embodiment, the acquiring the welt distance of the commercial robot according to the longitudinal scan data if not performed includes:
acquiring a welt path of the commercial robot;
judging whether the direction of the welt path is anticlockwise or not;
when the welt path direction is anticlockwise, obtaining the welt distance of the commercial robot according to fourth longitudinal line laser data;
and when the welt path direction is clockwise, obtaining the welt distance of the commercial robot according to the third longitudinal line laser data.
In this embodiment, the welt distance of the commercial robot is obtained according to the direction of the welt path. Specifically, in an embodiment, two line laser radars are installed on the left and right sides of the commercial robot, denoted the third longitudinal line laser and the fourth longitudinal line laser, wherein the third longitudinal line laser acquires obstacle information for the area on the left of the commercial robot and the fourth longitudinal line laser acquires map information for the area on its right. It is judged whether the direction of the welt path is counterclockwise: when the welt path direction is counterclockwise, the welt distance of the commercial robot is obtained from the fourth longitudinal line laser data; when the welt path direction is clockwise, the welt distance is obtained from the third longitudinal line laser data (a minimal sketch of this selection rule is given after the navigation overview below). Finally the robot performs welt driving through its navigation system, and the welt distance of the commercial robot is obtained from the longitudinal line laser. Specifically, in an embodiment, solid-state line lasers are installed on the left and right sides of the commercial robot: during driving, the laser emitted to the left detects whether an obstacle exists in the area to the left of the machine body, and the laser emitted to the right detects whether an obstacle exists in the area to the right. When the commercial robot performs welt driving at the first position and the welt path direction is counterclockwise, the fourth longitudinal line laser performs the welt distance detection, which is used to control the welt distance of the commercial robot, acquire the welt distance in real time, and prevent the commercial robot from drifting away from the welt. At the same time, the third longitudinal line laser detects whether an obstacle, such as a pedestrian, exists in the area to the left of the machine body, so that a collision between the pedestrian and the commercial robot is avoided; when the commercial robot detects an obstacle, its navigation system performs the obstacle avoidance operation and drives around the obstacle. The welt distance of the commercial robot is obtained according to the above technical scheme. The navigation systems used by such robots include: 1. Inertial navigation: cleaning is carried out along a default route and the direction is changed on collision, without any global view; in a slightly complex environment the robot easily keeps turning in one place, cannot generate a whole-house map, and cannot set a virtual wall. A whole-house map can only be generated by adding conventional ultrasonic or infrared detection, which still cannot compare with laser navigation. 2. Visual navigation: its drawback is inaccurate scanning in places with weak light and with obstacles that light cannot detect (such as glass); if an AI recognition function is added, its level of intelligence increases, and because this technique differs widely between brands, products with different degrees of intelligence appear. 3. Laser navigation: 1) LDS laser navigation: the surrounding environment is detected by laser without being affected by the light source, the scanning range is larger, and the generated map is more accurate; 2) dToF laser navigation: initially applied in aerospace technology; compared with LDS laser navigation, the scanning range is larger and the accuracy of object detection is improved about four-fold, making it a technology configured in high-end products.
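The sensor-selection rule referred to above can be sketched as follows; the function and sensor names are illustrative only and are not taken from the application.

```python
def select_welt_sensor(welt_path_direction):
    """Pick which side line laser supplies the welt distance (names are illustrative)."""
    if welt_path_direction == "counterclockwise":
        return "fourth_longitudinal"  # right-side line laser: wall is on the robot's right
    if welt_path_direction == "clockwise":
        return "third_longitudinal"   # left-side line laser: wall is on the robot's left
    raise ValueError(f"unknown welt path direction: {welt_path_direction!r}")

print(select_welt_sensor("counterclockwise"))
print(select_welt_sensor("clockwise"))
```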
In one embodiment, when the welt path direction is counterclockwise, obtaining the welt distance of the commercial robot according to a fourth longitudinal line laser includes:
acquiring line laser point cloud data according to the fourth longitudinal line laser;
performing straight line fitting on the line laser point cloud data to obtain a plurality of fitted straight line segments;
and respectively acquiring first distances between the commercial robot and the fitting straight line segments, selecting the smallest first distance among all the first distances, and taking the smallest first distance as the welting distance.
In this embodiment, straight-line fitting is performed on the point cloud data of the fourth line laser radar, and the minimum distance between the robot and the obstacle is obtained from the fitted lines; the moving speed of the robot is then adjusted according to this minimum distance. Specifically, in an embodiment, the commercial robot emits solid-state line laser, the fourth longitudinal line laser collects point cloud data and returns them to the navigation control end, and straight line segments are fitted to the point cloud data using techniques such as least squares, weighted least squares, or RANSAC to obtain the type of obstacle. In this embodiment, referring to FIG. 8, the fourth longitudinal line laser scans the area on the right of the main body of the commercial robot and detects the obstacle on the right side. The fourth longitudinal line laser collects the obstacle point cloud data, and four straight line segments, i.e., walls or obstacles, are fitted to the point cloud. In the figure, F is the laser beam emitted by the third longitudinal line laser radar, G is the laser beam emitted by the fourth longitudinal line laser radar, and E is an obstacle or a wall. It is then judged whether any of the fitted straight line segments A, B, C and D are transverse segments, and if so, the transverse segments are filtered out; in this embodiment, segments A and C are detected to be transverse straight line segments. The distances between the side of the body of the commercial robot and the remaining fitted straight line segments, i.e., the distances from the side of the body to B and to D, are obtained, the minimum distance value is selected, and this minimum first distance is taken as the welt distance. The navigation control end adjusts the speed feedback of the two driving wheels by monitoring the distance to the nearest obstacle in real time at high frequency; the moving speed of the robot is adjusted in this manner.
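For illustration, a minimal sketch of the filtering and distance-selection step described with reference to FIG. 8, assuming the straight line segments A–D have already been fitted to the point cloud; the segment endpoints, travel axis and angle threshold are assumptions chosen for the example.

```python
import math

def point_to_segment_distance(p, a, b):
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def welt_distance(robot_xy, segments, travel_axis_deg=90.0, angle_tol_deg=30.0):
    """Drop segments roughly perpendicular to the travel direction, return the minimum distance."""
    kept = []
    for a, b in segments:
        ang = math.degrees(math.atan2(b[1] - a[1], b[0] - a[0])) % 180.0
        if abs(ang - travel_axis_deg) <= angle_tol_deg:  # filter out transverse segments
            kept.append((a, b))
    return min(point_to_segment_distance(robot_xy, a, b) for a, b in kept)

# Fitted segments: A and C are transverse and get filtered; B and D run along the wall.
segments = {
    "A": ((0.2, 0.0), (0.6, 0.0)),
    "B": ((0.3, 0.0), (0.3, 1.0)),
    "C": ((0.2, 1.0), (0.6, 1.0)),
    "D": ((0.5, 0.2), (0.5, 0.8)),
}
print(welt_distance((0.0, 0.5), list(segments.values())))  # -> 0.3 (segment B is nearest)
```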
In one embodiment, the controlling the commercial robot to continue the welt driving further includes:
Acquiring the running environment information of the commercial robot in real time according to a plane laser radar or a camera module;
acquiring side area information and front area information of the commercial robot according to the environment information;
and sending corresponding control instructions according to the side area information and the front area information to control the commercial robot to perform welt running according to a preset movement mode.
In this embodiment, the driving environment information of the commercial robot includes the distance values from the front of the robot to a wall or other obstacles, and the distance values from the left and right sides of the commercial robot to the wall or nearest obstacle, acquired with a planar laser radar or a camera module. Specifically, in an embodiment, a planar laser radar or a camera module is installed on the commercial robot fifty centimeters above the ground, and the driving environment information of the commercial robot is acquired through the radar waves emitted by the planar laser radar and through the camera module, which consists of a sensor, a lens and component interfaces supporting various data interfaces. The driving environment information includes the side-area information and the front-area information of the commercial robot. Finally, corresponding control instructions are sent automatically according to the identified area information so that the robot performs welt movement in the set movement mode. The robot in this embodiment therefore does not need a manually built map or prior data collected in advance for its welt movement, which greatly reduces early manual intervention, improves the stability of the system, and lowers labor intensity and operational risk. The system adapts well to the environment and is well suited to autonomous operation in dangerous environments such as infectious disease hospitals, since no operator needs to enter the dangerous area to build a map or collect prior data before operation; it has low cost, high safety, and high value for commercial promotion and market application.
In one embodiment, the collecting the robot driving environment information in real time according to the planar laser radar or the camera module includes:
controlling the commercial robot to emit plane laser radar waves to obtain plane laser point cloud data;
performing data preprocessing on the point cloud data, wherein the preprocessing comprises noise filtering and abnormal value filtering;
performing least square fitting on the preprocessed point cloud data to obtain a linear relation of the point cloud data;
and performing dimension reduction projection according to the linear relation of the point cloud data to obtain environment information.
In this embodiment, the step of performing least squares fitting on the preprocessed point cloud data to obtain the linear relationship of the point cloud data includes: classifying the preprocessed point cloud data with a clustering algorithm so that point clouds with a high degree of correlation are fitted together. Specifically, in an embodiment, performing the dimension-reduction projection according to the linear relationship of the point cloud data to obtain the environment information includes: matching the dimension-reduced projected point cloud data with preset regional point cloud templates, and finally identifying the type of the spatial region in front of the robot. Smoothing, which helps the subsequent linear fitting, is performed on the clustered point cloud data; the point cloud data are then fitted with the least squares method to obtain their linear relationship; dimension-reduction projection is performed on the fitted point cloud data; and the front point cloud data after the dimension-reduction projection are matched with the preset regional point cloud templates to finally identify the type of the spatial region in front of the robot. The regional point cloud templates include a flat area point cloud template, a right-turning flat area point cloud template, a left-turning flat area point cloud template, a T-shaped channel point cloud template, a closed channel point cloud template and an open area point cloud template, and the spatial region types include a flat area, a right-turning flat area, and so on.
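A sketch of the fitting and projection steps only: a plain least-squares line fit followed by projecting each point onto the fitted line (the "dimension-reduction projection"). Clustering, smoothing and template matching are omitted, and the sample points are invented for the example; this is not the full pipeline of the application.

```python
import math

def least_squares_line(points):
    """Ordinary least squares fit y = k*x + b over 2D points."""
    n = len(points)
    sx = sum(p[0] for p in points); sy = sum(p[1] for p in points)
    sxx = sum(p[0] * p[0] for p in points); sxy = sum(p[0] * p[1] for p in points)
    k = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - k * sx) / n
    return k, b

def project_onto_line(points, k, b):
    """Dimension-reduction projection: reduce each 2D point to its 1D coordinate along the line."""
    norm = math.hypot(1.0, k)
    ux, uy = 1.0 / norm, k / norm  # unit direction of the fitted line
    return [p[0] * ux + (p[1] - b) * uy for p in points]

pts = [(0.0, 1.01), (0.5, 1.26), (1.0, 1.49), (1.5, 1.76)]
k, b = least_squares_line(pts)
print(round(k, 2), round(b, 2))                              # roughly y = 0.5*x + 1.0
print([round(v, 2) for v in project_onto_line(pts, k, b)])   # 1D coordinates used for template matching
```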
Referring to fig. 9, the present application provides a welt recognition apparatus, the apparatus comprising:
the transverse scanning data module 10 is used for scanning the front area of the commercial robot according to a transverse line laser radar to obtain transverse scanning data;
the longitudinal scanning data module 20 is configured to scan a lateral area of the commercial robot according to a longitudinal line laser radar to obtain longitudinal scanning data;
an obstacle avoidance module 30, configured to determine whether to perform an obstacle avoidance operation according to the lateral scan data;
a welt distance module 40, configured to obtain a welt distance of the commercial robot according to the longitudinal scan data if not executed;
and the welt running module 50 is configured to compare the minimum welt distance with a preset welt distance, and control the commercial robot to continue to keep the welt running when it is determined that the welt distance is not greater than the preset welt distance.
As described above, it will be understood that the components of the welt identification device set forth in the present application may implement the functions of any one of the above-described welt identification methods.
In one embodiment, the lateral scan data module 10 further comprises performing:
The scanning of the front area of the commercial robot according to the transverse line laser radar to obtain transverse scanning data comprises the following steps:
transmitting first transverse line laser to a first triangular area in front based on a first transverse line laser radar to obtain first triangular line laser scanning data;
transmitting second transverse line laser to a second triangular region on the basis of a second transverse line laser radar to obtain second triangular line laser scanning data, wherein an intersection region exists between the first triangular region and the second triangular region;
and obtaining transverse scanning data according to the first triangular line laser scanning data and the second triangular line laser scanning data.
In one embodiment, the longitudinal scan data module 20 further comprises performing:
the scanning of the lateral area of the commercial robot according to the longitudinal line laser radar to obtain longitudinal scanning data comprises the following steps:
transmitting third longitudinal line laser to the left side area at a first angle according to the third longitudinal line laser radar to obtain third longitudinal line laser data;
transmitting fourth longitudinal line laser to the right side area at a second angle according to the fourth longitudinal line laser radar to obtain fourth longitudinal line laser data;
And obtaining longitudinal scanning data according to the third longitudinal line laser data and the fourth longitudinal line laser data.
In one embodiment, the longitudinal scan data module 20 further comprises performing:
acquiring a welt path of the commercial robot;
judging whether the direction of the welt path is anticlockwise or not;
when the welt path direction is anticlockwise, obtaining the welt distance of the commercial robot according to fourth longitudinal line laser data;
and when the welt path direction is clockwise, obtaining the welt distance of the commercial robot according to the third longitudinal line laser data.
In one embodiment, the longitudinal scan data module 20 further comprises performing:
when the welt path direction is counterclockwise, obtaining the welt distance of the commercial robot according to a fourth longitudinal line laser, including:
acquiring line laser point cloud data according to the fourth longitudinal line laser;
performing straight line fitting on the line laser point cloud data to obtain a plurality of fitted straight line segments;
and respectively acquiring first distances between the commercial robot and the fitting straight line segments, selecting the smallest first distance among all the first distances, and taking the smallest first distance as the welting distance.
In one embodiment, the welt travel module 50 further includes performing:
acquiring the running environment information of the commercial robot in real time according to a plane laser radar or a camera module;
acquiring side area information and front area information of the commercial robot according to the environment information;
and sending corresponding control instructions according to the side area information and the front area information to control the commercial robot to perform welt running according to a preset movement mode.
In one embodiment, the welt travel module 50 further includes performing:
the real-time acquisition of the robot driving environment information according to the plane laser radar or the camera module comprises the following steps:
controlling the commercial robot to emit plane laser radar waves to obtain plane laser point cloud data;
performing data preprocessing on the point cloud data, wherein the preprocessing comprises noise filtering and abnormal value filtering;
performing least square fitting on the preprocessed point cloud data to obtain a linear relation of the point cloud data;
and performing dimension reduction projection according to the linear relation of the point cloud data to obtain environment information.
Referring to fig. 10, a computer device is further provided in an embodiment of the present application, and its internal structure may be as shown in fig. 10. The computer device includes a processor, a memory, a network interface, a display device and an input device connected by a system bus. The network interface of the computer device is used to communicate with an external terminal through a network connection. The display device of the computer device is used to display the interactive page, and the input device is used to receive input from a user. The processor of the computer device provides computing and control capabilities. The memory of the computer device includes a non-volatile storage medium, which stores an operating system, a computer program and a database. The database of the computer device is used to store the original data. The computer program, when executed by the processor, implements a welt identification method.
The welt identification method implemented by the processor is applied to a commercial robot and comprises the following steps: scanning the front area of the commercial robot according to a transverse line laser radar to obtain transverse scanning data; scanning a lateral area of the commercial robot according to a longitudinal line laser radar to obtain longitudinal scanning data; determining whether to execute obstacle avoidance operation according to the transverse scanning data; if not, acquiring the welt distance of the commercial robot according to the longitudinal scanning data; and comparing the welt distance with a preset welt distance, and controlling the commercial robot to continue to keep the welt running when the welt distance is not greater than the preset welt distance. By the method, the cleaning coverage rate of the commercial cleaning robot is improved, and intelligent welt cleaning is realized.
The application also provides a computer readable storage medium, on which a computer program is stored, which when executed by the processor, implements a welt recognition method, and scans a front area of the commercial robot according to a transverse line laser radar to obtain transverse scanning data; scanning a lateral area of the commercial robot according to a longitudinal line laser radar to obtain longitudinal scanning data; determining whether to execute obstacle avoidance operation according to the transverse scanning data; if not, acquiring the welt distance of the commercial robot according to the longitudinal scanning data; and comparing the welt distance with a preset welt distance, and controlling the commercial robot to continue to keep the welt running when the welt distance is not greater than the preset welt distance.
The computer readable storage medium provides a welt recognition method, and the method scans the front area of the commercial robot according to a transverse line laser radar to obtain transverse scanning data; scanning a lateral area of the commercial robot according to a longitudinal line laser radar to obtain longitudinal scanning data; determining whether to execute obstacle avoidance operation according to the transverse scanning data; if not, acquiring the welt distance of the commercial robot according to the longitudinal scanning data; and comparing the welt distance with a preset welt distance, and controlling the commercial robot to continue to keep the welt running when the welt distance is not greater than the preset welt distance. By the method, the cleaning coverage rate of the commercial cleaning robot is improved, and intelligent welt cleaning is realized.
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by way of a computer program stored on a non-transitory computer readable storage medium, which when executed, may comprise the steps of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium provided by the present application and used in embodiments may include non-volatile and/or volatile memory. The nonvolatile memory can include read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM), among others.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, apparatus, article, or method that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, apparatus, article, or method. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, apparatus, article, or method that comprises the element.
The foregoing description is only of the preferred embodiments of the present application and is not intended to limit the scope of the application, and all equivalent structures or equivalent processes using the descriptions and drawings of the present application or directly or indirectly applied to other related technical fields are included in the scope of the application.
Claims (10)
1. A welt identification method applied to a commercial robot, the method comprising:
scanning the front area of the commercial robot according to a transverse line laser radar to obtain transverse scanning data;
Scanning a lateral area of the commercial robot according to a longitudinal line laser radar to obtain longitudinal scanning data;
determining whether to execute obstacle avoidance operation according to the transverse scanning data;
if not, acquiring the welt distance of the commercial robot according to the longitudinal scanning data;
and comparing the welt distance with a preset welt distance, and controlling the commercial robot to continue to keep the welt running when the welt distance is not greater than the preset welt distance.
2. The method of claim 1, wherein scanning the front area of the commercial robot according to the transverse line lidar to obtain transverse scan data comprises:
transmitting first transverse line laser to a first triangular area in front based on a first transverse line laser radar to obtain first triangular line laser scanning data;
transmitting second transverse line laser to a second triangular region on the basis of a second transverse line laser radar to obtain second triangular line laser scanning data, wherein an intersection region exists between the first triangular region and the second triangular region;
and obtaining transverse scanning data according to the first triangular line laser scanning data and the second triangular line laser scanning data.
3. The method for edge bonding recognition according to claim 1, wherein the scanning the lateral area of the commercial robot according to the longitudinal line laser radar to obtain longitudinal scan data includes:
transmitting third longitudinal line laser to the left side area at a first angle according to the third longitudinal line laser radar to obtain third longitudinal line laser data;
transmitting fourth longitudinal line laser to the right side area at a second angle according to the fourth longitudinal line laser radar to obtain fourth longitudinal line laser data;
and obtaining longitudinal scanning data according to the third longitudinal line laser data and the fourth longitudinal line laser data.
4. The welt identification method of claim 3, wherein the obtaining the welt distance of the business robot based on the longitudinal scan data if not performed comprises:
acquiring a welt path of the commercial robot;
judging whether a direction of the welt path is counterclockwise;
when the welt path direction is counterclockwise, obtaining the welt distance of the commercial robot according to the fourth longitudinal line laser data;
and when the welt path direction is clockwise, obtaining the welt distance of the commercial robot according to the third longitudinal line laser data.
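The selection step of claim 4 reduces to a small dispatch, sketched below under the pairing stated in claims 3 and 4 (third laser looks at the left side area, fourth laser at the right side area). The function and argument names are hypothetical.

```python
# Sketch of the claim-4 selection: a counterclockwise welt path reads the
# fourth (right-side) longitudinal data, a clockwise path reads the third
# (left-side) data, following the pairing given in claims 3-4.

def select_longitudinal_data(welt_path_direction, third_line_data, fourth_line_data):
    """Return the side-facing scan that observes the followed edge."""
    if welt_path_direction == "counterclockwise":
        return fourth_line_data   # right-side laser per claim 4
    if welt_path_direction == "clockwise":
        return third_line_data    # left-side laser per claim 4
    raise ValueError(f"unknown welt path direction: {welt_path_direction!r}")
```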
5. The welt identification method of claim 4, wherein, when the welt path direction is counterclockwise, obtaining the welt distance of the commercial robot according to the fourth longitudinal line laser data comprises:
acquiring line laser point cloud data according to the fourth longitudinal line laser;
performing straight line fitting on the line laser point cloud data to obtain a plurality of fitted straight line segments;
and respectively acquiring first distances between the commercial robot and the fitted straight line segments, selecting the smallest first distance among all the first distances, and taking the smallest first distance as the welt distance.
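A rough sketch of the claim-5 computation follows. The split-by-gap segmentation, the total-least-squares line fit, and the use of the perpendicular distance from the robot to each fitted line are assumptions made for illustration; the claim itself only requires fitting straight segments to the point cloud and taking the smallest robot-to-segment distance as the welt distance.

```python
# Sketch: fit straight segments to the side-facing line-laser point cloud and
# take the smallest perpendicular distance to any fitted line as the welt
# distance. Segmentation and fitting choices are illustrative assumptions.

import numpy as np

def fit_line(points):
    """Total-least-squares line fit; returns (point_on_line, unit_direction)."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[0]

def split_into_segments(points, max_gap=0.15):
    """Split an ordered point cloud wherever consecutive points are far apart."""
    segments, current = [], [points[0]]
    for prev, cur in zip(points, points[1:]):
        if np.linalg.norm(np.subtract(cur, prev)) > max_gap:
            segments.append(current)
            current = []
        current.append(cur)
    segments.append(current)
    return [s for s in segments if len(s) >= 2]

def welt_distance(points, robot_xy=(0.0, 0.0)):
    """Smallest perpendicular distance from the robot to any fitted segment's line."""
    robot = np.asarray(robot_xy, dtype=float)
    distances = []
    for seg in split_into_segments(points):
        origin, direction = fit_line(seg)
        offset = robot - origin
        # perpendicular distance = |offset - (offset . d) d|
        distances.append(np.linalg.norm(offset - np.dot(offset, direction) * direction))
    return min(distances)

if __name__ == "__main__":
    wall = [(0.12, y / 100.0) for y in range(0, 200, 5)]  # a wall 12 cm to the side
    print(round(welt_distance(wall), 3))                  # ~0.12
```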
6. The welt identification method of claim 1, wherein controlling the commercial robot to continue welt running further comprises:
acquiring running environment information of the commercial robot in real time with a plane laser radar or a camera module;
acquiring side area information and front area information of the commercial robot according to the environment information;
and sending corresponding control instructions according to the side area information and the front area information to control the commercial robot to perform welt running according to a preset movement mode.
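The control step of claim 6 can be read as a mapping from side-area and front-area information to a preset movement mode. The sketch below is a hypothetical mapping: the thresholds, mode names, and the reduction of the area information to two clearance values are all assumptions.

```python
# Hypothetical sketch of the claim-6 control step: map front and side clearance
# values to a preset movement mode. Thresholds and mode names are assumptions.

def welt_control_command(side_clearance_m, front_clearance_m,
                         preset_welt_distance_m=0.10, front_stop_m=0.30):
    """Return a (mode, reason) pair for the motion controller."""
    if front_clearance_m < front_stop_m:
        return "slow_and_turn", "obstacle entering the front area"
    if side_clearance_m > preset_welt_distance_m:
        return "steer_toward_edge", "drifting away from the followed edge"
    return "follow_edge", "within the preset welt distance"
```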
7. The welt identification method of claim 6, wherein acquiring the running environment information of the commercial robot in real time with the plane laser radar or the camera module comprises:
controlling the commercial robot to emit plane laser radar waves to obtain plane laser point cloud data;
performing data preprocessing on the point cloud data, wherein the preprocessing comprises noise filtering and abnormal value filtering;
performing least square fitting on the preprocessed point cloud data to obtain a linear relation of the point cloud data;
and performing dimension reduction projection according to the linear relation of the point cloud data to obtain environment information.
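The claim-7 pipeline (preprocessing, least-squares fitting, dimension-reduction projection) is sketched below. The filtering thresholds, the 2D-to-1D projection, and the simple sigma-based outlier rule are assumptions for illustration; the claim does not fix these details.

```python
# Sketch of the claim-7 pipeline: filter noise and abnormal values from the
# planar point cloud, fit a line by least squares, then project the points
# onto that line (a 2D -> 1D dimension-reduction projection).

import numpy as np

def preprocess(points, max_range_m=8.0, outlier_sigma=2.5):
    """Drop non-finite and out-of-range returns, then remove range outliers
    (a simple stand-in for the noise / abnormal-value filtering of claim 7)."""
    pts = np.asarray(points, dtype=float)
    pts = pts[np.isfinite(pts).all(axis=1)]
    ranges = np.linalg.norm(pts, axis=1)
    pts = pts[ranges < max_range_m]
    ranges = np.linalg.norm(pts, axis=1)
    keep = np.abs(ranges - ranges.mean()) < outlier_sigma * ranges.std() + 1e-9
    return pts[keep]

def least_squares_line(pts):
    """Ordinary least-squares fit y = a*x + b over the preprocessed cloud."""
    a, b = np.polyfit(pts[:, 0], pts[:, 1], 1)
    return a, b

def project_onto_line(pts, a, b):
    """Dimension-reduction projection: 1D coordinate of each point along the fitted line."""
    direction = np.array([1.0, a]) / np.hypot(1.0, a)
    origin = np.array([0.0, b])
    return (pts - origin) @ direction

if __name__ == "__main__":
    cloud = [(x / 10.0, 0.5 * x / 10.0 + 0.1) for x in range(50)]
    cleaned = preprocess(cloud)
    a, b = least_squares_line(cleaned)
    print(round(a, 2), round(b, 2), project_onto_line(cleaned, a, b)[:3])
```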
8. A welt identification device, the device comprising:
the transverse scan data module is used for scanning a front area of the commercial robot with a transverse line laser radar to obtain transverse scan data;
the longitudinal scan data module is used for scanning a lateral area of the commercial robot with a longitudinal line laser radar to obtain longitudinal scan data;
the obstacle avoidance module is used for determining, according to the transverse scan data, whether to execute an obstacle avoidance operation;
the welt distance module is used for acquiring a welt distance of the commercial robot according to the longitudinal scan data if the obstacle avoidance operation is not executed;
and the welt running module is used for comparing the welt distance with a preset welt distance, and controlling the commercial robot to continue welt running when the welt distance is not greater than the preset welt distance.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the welt identification method of any one of claims 1 to 7 when executing the computer program.
10. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, carries out the steps of the welt identification method of any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310724714.6A CN116919247A (en) | 2023-06-19 | 2023-06-19 | Welt identification method, device, computer equipment and medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310724714.6A CN116919247A (en) | 2023-06-19 | 2023-06-19 | Welt identification method, device, computer equipment and medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116919247A true CN116919247A (en) | 2023-10-24 |
Family
ID=88385387
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310724714.6A Pending CN116919247A (en) | 2023-06-19 | 2023-06-19 | Welt identification method, device, computer equipment and medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116919247A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117805853A (en) * | 2024-02-29 | 2024-04-02 | 钛玛科(北京)工业科技有限公司 | Laser reflection type edge detection device and method |
CN117805853B (en) * | 2024-02-29 | 2024-05-14 | 钛玛科(北京)工业科技有限公司 | Laser reflection type edge detection device and method |
Similar Documents
Publication | Title |
---|---|
WO2020103533A1 (en) | Track and road obstacle detecting method | |
CN108007452B (en) | Method and device for updating environment map according to obstacle and robot | |
US7916898B2 (en) | Method and system for identifying an edge of a crop | |
JP2501010B2 (en) | Mobile robot guidance device | |
US10664974B2 (en) | System and method for object detection using edge characteristics | |
US20040039498A1 (en) | System and method for the creation of a terrain density model | |
CN110499727B (en) | Multi-sensor-based welt sweeping method and sweeper | |
US20210018611A1 (en) | Object detection system and method | |
WO2008070205A2 (en) | Obstacle detection arrangements in and for autonomous vehicles | |
KR102391771B1 (en) | Method for operation unmanned moving vehivle based on binary 3d space map | |
CN116919247A (en) | Welt identification method, device, computer equipment and medium | |
CN113777622B (en) | Rail obstacle identification method and device | |
WO2019112514A1 (en) | Rain filtering techniques for autonomous vehicle | |
EP3882663A1 (en) | Spatial and temporal processing of ultrasonic-sensor detections for mapping in vehicle-parking-assist functions | |
CN108873014A (en) | Mirror surface detection method and device based on laser radar | |
Steinbaeck et al. | Occupancy grid fusion of low-level radar and time-of-flight sensor data | |
JP3070277B2 (en) | Preceding vehicle detection device | |
CN115485582A (en) | Method and device for detecting halos in lidar measurements | |
Clarke et al. | Sensor modelling for radar-based occupancy mapping | |
Zhong et al. | Dynamic obstacle detection and tracking based on 3D lidar | |
Rosinski et al. | Terrain map building for a walking robot equipped with an active 2D range sensor | |
CN115429158A (en) | Method and device for determining initial cleaning area of sweeper, computer equipment and medium | |
US20240242432A1 (en) | Method and device for determining concealed objects in a 3d point cloud representing an environment | |
US11914679B2 (en) | Multispectral object-detection with thermal imaging | |
CN215457649U (en) | Floor sweeper |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||