CN112162294A - Robot structure detection method based on laser sensor - Google Patents

Robot structure detection method based on laser sensor

Info

Publication number
CN112162294A
Authority
CN
China
Prior art keywords
laser
data
frame
robot
robot body
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011080558.7A
Other languages
Chinese (zh)
Other versions
CN112162294B (en)
Inventor
谢传泉
浦剑涛
张东泉
张志尚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Bucos Robot Co ltd
Shenzhen Boocax Technology Co ltd
Beijing Boocax Technology Co ltd
Original Assignee
Shandong Bucos Robot Co ltd
Shenzhen Boocax Technology Co ltd
Beijing Boocax Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong Bucos Robot Co ltd, Shenzhen Boocax Technology Co ltd, and Beijing Boocax Technology Co ltd
Priority to CN202011080558.7A
Publication of CN112162294A
Application granted
Publication of CN112162294B
Legal status: Active

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 — Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 — Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 — Systems determining position data of a target
    • G01S17/08 — Systems determining position data of a target for measuring distance only
    • G01S17/88 — Lidar systems specially adapted for specific applications

Abstract

Embodiments of the present disclosure provide a laser-sensor-based robot structure detection method, apparatus, device, and computer-readable storage medium. The method includes: receiving a laser data frame; comparing the currently received laser data frame with the most recently stored available laser frame, and if the number of changed values is greater than a preset movement threshold, taking the current frame as an available laser frame; acquiring a preset number of available laser frames, and screening out robot body structure data according to their variance; and segmenting the robot body structure data to determine the robot body structure. In this way, invalid returns from the robot's own structure can be filtered out, ensuring that the laser sensor effectively perceives the environment.

Description

Robot structure detection method based on laser sensor
Technical Field
Embodiments of the present disclosure relate generally to the field of robot detection, and more particularly, to a method, an apparatus, a device, and a computer-readable storage medium for detecting a robot structure based on a laser sensor.
Background
Most existing robot chassis rely on laser sensors for environment sensing, such as localization and mapping, so to guarantee the laser detection range, the laser installation area is kept as free of occluding parts as possible.
However, because the upper layer of the chassis needs supports, and special markers are added so that robots can recognize one another, the laser is partially occluded, and the resulting data degrades both mapping and robot localization.
Disclosure of Invention
According to an embodiment of the present disclosure, a robot structure detection scheme based on a laser sensor is provided.
In a first aspect of the present disclosure, a method for detecting a robot structure based on a laser sensor is provided. The method comprises the following steps: receiving a laser data frame;
comparing the currently received laser data frame with the last stored available laser data frame, and if the changed numerical value is greater than a preset moving threshold value, taking the currently received laser data frame as an available laser frame;
acquiring a preset acquisition number of available laser frames, and screening out robot body structure data according to the variance of the preset acquisition number of available laser frames;
and carrying out segmentation processing on the robot body structure data to determine the robot body structure.
Optionally, the comparing the currently received laser data frame with the last stored available laser data frame, and if the changed value is greater than the preset moving threshold, taking the currently received laser data frame as the available laser frame includes:
respectively storing the acquired first frame of laser data into an available laser frame container and a structural body of reference laser data;
continuously acquiring a laser data frame, comparing the currently acquired laser data frame with a laser data frame stored in the structural body of the reference laser data, and if the changed numerical value is greater than a moving threshold, taking the currently acquired laser data frame as an available laser frame;
storing the currently-acquired laser data frame as an available laser frame into the available laser frame container, and replacing laser frame data in the structure of reference laser data with the currently-acquired laser data frame as an available laser frame.
Optionally, if the number of changed values is less than or equal to the movement threshold, the robot is moved and a laser data frame is reacquired.
Optionally, the comparing the currently acquired laser data frame with the laser data frame stored in the structural body of the reference laser data includes:
comparing the currently acquired laser data frame with the laser data frame stored in the structural body of the reference laser data;
and if the distance difference in the same direction exceeds the dynamic distance threshold, judging that the currently acquired laser data frame is a changed frame, and acquiring the number of changed values of the changed frame.
Optionally, the acquiring of the available laser frames with the preset acquisition number, and screening out the robot body structure data according to the variance of the available laser frames with the preset acquisition number includes:
extracting available laser frames with the same measurement direction and preset collection number from the available laser frame container;
calculating the variance of the available laser frames with the same measurement direction of the preset collection number;
screening the variance values, wherein if a variance value is smaller than a variance threshold, that value corresponds to robot body structure data;
comparing the robot ontology structure data with a robot ontology structure limit threshold;
if the robot body structure data is smaller than the robot body structure limit threshold, setting the value of the variance corresponding to the robot body structure data to be 1, and storing the value into a structural body of the robot body structure data;
and if the robot body structure data is larger than or equal to the robot body structure limit threshold, setting the value of the variance corresponding to the robot body structure data to be 0, and storing the value into a structural body of the robot body structure data.
Optionally, the segmenting the robot ontology structure data, and determining the robot ontology structure includes:
segmenting the data in the structure of preset robot body structure data, grouping consecutive 1 values into one segment;
expanding both ends of each segment outward by a preset expansion-threshold number of values of 1;
and merging the expanded data segments to determine the robot body structure.
Any possible implementation manner of the above-described aspects is further provided.
In a second aspect of the present disclosure, a laser sensor based robotic structure detecting device is provided. The device includes:
the receiving module is used for receiving the laser data frame;
the comparison module is used for comparing the currently received laser data frame with the last stored available laser data frame, and if the changed numerical value is greater than a preset moving threshold value, the currently received laser data frame is used as the available laser frame;
the screening module is used for acquiring the available laser frames with the preset acquisition number and screening out the structural data of the robot body according to the variance of the available laser frames with the preset acquisition number;
and the determining module is used for carrying out segmentation processing on the robot body structure data to determine the robot body structure.
In a third aspect of the disclosure, an electronic device is provided. The electronic device includes: a memory having a computer program stored thereon and a processor implementing the method as described above when executing the program.
In a fourth aspect of the present disclosure, a computer readable storage medium is provided, having stored thereon a computer program, which when executed by a processor, implements a method as in accordance with the first aspect of the present disclosure.
According to the laser-sensor-based robot structure detection method provided by the embodiments of the present application, a laser data frame is received; the currently received laser data frame is compared with the most recently stored available laser frame, and if the number of changed values is greater than a preset movement threshold, the current frame is taken as an available laser frame; a preset number of available laser frames is acquired, and robot body structure data is screened out according to their variance; and the robot body structure data is segmented to determine the robot body structure. Invalid returns from the robot's own structure can be filtered out, effective environmental perception by the laser sensor is guaranteed, and the detection accuracy of the robot is improved.
It should be understood that the statements herein reciting aspects are not intended to limit the critical or essential features of the embodiments of the present disclosure, nor are they intended to limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. In the drawings, like or similar reference characters designate like or similar elements, and wherein:
fig. 1 shows a flow diagram of a laser sensor based robotic structure detection method according to an embodiment of the present disclosure;
FIG. 2 illustrates a block diagram of a laser sensor based robotic structure detecting device, in accordance with an embodiment of the present disclosure;
FIG. 3 illustrates a block diagram of an exemplary electronic device capable of implementing embodiments of the present disclosure.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of the present disclosure clearer, the technical solutions of the embodiments will be described clearly and completely with reference to the accompanying drawings; obviously, the described embodiments are some, but not all, embodiments of the present disclosure. All other embodiments derived by a person of ordinary skill in the art from the disclosed embodiments without creative effort shall fall within the protection scope of the present disclosure.
In addition, the term "and/or" herein merely describes an association between objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A alone, both A and B, or B alone. The character "/" herein generally indicates an "or" relationship between the preceding and following objects.
Fig. 1 shows a flow diagram of laser sensor based robotic structure detection according to an embodiment of the present disclosure.
S110, receiving the laser data frame.
The robot is provided with a laser sensor.
Optionally, the laser sensor includes a rotatable emitter that emits a laser beam after rotating through a given angle; the emitted beam is reflected when it strikes an object, the reflected light is received by an optical receiving system, and the distance from the laser scanner to the object is thereby measured.
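As an aside, the ranging principle just described is time-of-flight. A minimal illustrative calculation, assuming a simple round-trip-time measurement (this helper is not part of the patent's method), might look like:

```python
# Illustrative time-of-flight ranging: the emitted beam is reflected by an
# object, and half the round-trip optical path gives the sensor-to-object
# distance. Purely an assumption for illustration, not the patent's code.

C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance implied by a measured laser round-trip time."""
    return C * round_trip_seconds / 2.0
```

For example, a 20 ns round trip corresponds to roughly 3 m of range.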
Laser data frames are received when the robot starts to perform a task (start), that is, performs rotation and displacement operations (move).
Optionally, the laser data frame includes a first frame of laser data received after the robot is started and laser data received after the robot moves.
And S120, comparing the currently received laser data frame with the most recently stored available laser frame, and if the number of changed values is greater than a preset movement threshold, taking the current frame as the available laser frame.
Optionally, an available-laser-frame container valid_vecPnt is set for storing laser frame data captured as the robot moves;
a structure refer_Stru of reference laser data is set for comparison between laser data frames;
a dynamic distance threshold P0 is set for deciding whether a current laser reading is a changed value, so as to avoid spurious position changes caused by jitter in the laser data itself; P0 generally ranges from 10 to 30 cm, for example 15 cm;
a movement threshold P1 is set for judging whether the current laser data satisfies the large-movement condition; P1 is generally 1/3 to 1/2 of the number of points collected per frame by the laser sensor in use; for example, when the sensor collects 3000 points, P1 may be 1000 or 1500;
optionally, storing the received first frame of laser data into the valid _ vecPnt and the refer _ Stru, respectively;
continuously receiving laser data frames, comparing the currently received laser data frames with the laser data frames stored in the refer _ Stru, if the distance difference in the same direction exceeds P0, judging the currently obtained laser data frames as changed frames, and obtaining the number of changed values of the changed frames; that is, the positions of the light spots in the currently acquired laser data frame (in this embodiment, one frame of laser data has about 3000 light spots) and the positions of the light spots in the laser data frame stored in the refer _ Stru are compared one by one, and if the offset difference (misalignment) of the light spot positions in the same direction exceeds P0, it is determined that the currently acquired laser data frame is a changed frame (displacement change occurs), and the number of changed values, that is, the number of misaligned light spots, is acquired;
if the changed number (number of misaligned spots) is greater than P1, the currently-acquired laser data frame is used as an available laser frame, the currently-acquired laser data frame as an available laser frame is stored in the valid _ vecPnt, and the currently-acquired laser data frame as an available laser frame is substituted for the laser frame data in the defer _ Stru.
And if the changed numerical value is less than or equal to the P1, moving the robot, and acquiring the laser data frame again, wherein the valid _ vecPnt and the defer _ Stru do not change any value.
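Steps S110–S120 can be sketched as follows. This is a hypothetical Python rendering, assuming a laser frame is simply a list of per-direction range readings; the names valid_vecPnt, refer_Stru, P0, and P1 follow the patent's notation, while the frame format and function names are assumptions:

```python
P0 = 0.15   # dynamic distance threshold in metres (patent range: 10-30 cm)
P1 = 1000   # movement threshold: about 1/3 of 3000 points per frame

def count_changed_points(frame, reference, p0=P0):
    """Count spots whose range differs from the reference frame by more than p0."""
    return sum(1 for cur, ref in zip(frame, reference) if abs(cur - ref) > p0)

def screen_frames(frame_stream, p1=P1):
    """Collect available laser frames: those reflecting a large robot movement."""
    valid_vecPnt = []      # container of available laser frames
    refer_Stru = None      # reference laser data for comparison
    for frame in frame_stream:
        if refer_Stru is None:
            # First frame: stored in both the container and the reference.
            valid_vecPnt.append(frame)
            refer_Stru = frame
        elif count_changed_points(frame, refer_Stru) > p1:
            valid_vecPnt.append(frame)   # large movement: keep the frame...
            refer_Stru = frame           # ...and replace the reference
    return valid_vecPnt
```

A frame with too few misaligned spots is simply skipped, corresponding to the "move the robot and reacquire" branch above.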
S130, acquiring the available laser frames with the preset acquisition number, and screening out the structural data of the robot body according to the variance of the available laser frames with the preset acquisition number.
Available laser frames are received and the chassis structure is judged.
An available-laser-frame acquisition threshold P2 is set to define the number of laser data acquisitions. P2 is generally a positive integer of at least 50, for example 50; that is, 50 frames are usually enough to satisfy the structure-detection requirement.
A structure robstru_Stru for storing robot body structure data is set, used to represent the robot body structure.
A structure var_Stru for storing laser data variance is set, used to hold the variance computed from the available laser frames.
A variance threshold P3 is set to decide whether a value represents the robot body structure. P3 generally ranges from 5 to 20 cm, for example 10 cm.
A robot body structure limit threshold P4 is set to filter out data exceeding it. Since the radius of the robot body is generally not large, this value constrains the detected structure. P4 generally ranges from 1 to 3 m.
Optionally, step S120 is repeated until the number of acquired available laser frames reaches P2; data acquisition then stops. Most frames in the container are key frames with large changes, which satisfies the data requirement for judging the laser structure.
Optionally, for each measurement direction, the readings of all available laser frames in that direction are extracted from valid_vecPnt;
the variance of those readings is calculated and stored into var_Stru;
the variance data is extracted from var_Stru and screened; if a variance value is smaller than P3, it corresponds to robot body structure data;
the robot body structure data is compared with P4;
if the robot body structure data is smaller than P4, the value corresponding to that variance is set to 1 and stored into robstru_Stru;
if the robot body structure data is greater than or equal to P4, the value corresponding to that variance is set to 0 and stored into robstru_Stru.
Optionally, in robstru_Stru, 1 indicates the robot body structure and 0 indicates no structural occlusion in that direction.
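Under the same assumed frame format, step S130 can be sketched as below; P3 and P4 take the example values from the text (treating the centimetre variance threshold at face value), and comparing the mean range against P4 is my reading of "comparing the robot body structure data with the limit threshold":

```python
from statistics import pvariance

P3 = 0.10  # variance threshold (patent example: 10 cm, taken at face value)
P4 = 2.0   # robot body structure limit threshold in metres (patent range: 1-3 m)

def detect_body_structure(available_frames, p3=P3, p4=P4):
    """Build a robstru_Stru-style table: 1 = robot body, 0 = clear direction."""
    n_dirs = len(available_frames[0])
    robstru = [0] * n_dirs
    for i in range(n_dirs):                      # per measurement direction
        ranges = [frame[i] for frame in available_frames]
        var = pvariance(ranges)                  # stable returns => low variance
        mean_range = sum(ranges) / len(ranges)
        # Low variance marks candidate body structure; the limit threshold P4
        # then rejects stable but distant returns that cannot be the body.
        if var < p3 and mean_range < p4:
            robstru[i] = 1
    return robstru
```

The intuition: across frames where the robot has moved substantially, environment readings change while occluded directions keep returning the same nearby distance, so low variance plus a short range flags the robot's own structure.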
And S140, carrying out segmentation processing on the robot body structure data to determine the robot body structure.
When a laser spot hits the edge of an object, the laser data is unstable and the variance is too large; that is, the detection is inaccurate.
Therefore, in this embodiment, the robot body structure data obtained in step S130 is segmented, and both ends of each segment are expanded by a certain region to address this problem.
Specifically, a structure final_robstru_Stru for storing robot body structure data is set, used to represent the final robot body structure;
an expansion threshold P5 is set for padding the edges of the robot body structure; P5 is determined by the angular resolution of the robot's laser sensor, typically an expansion of 3-5°, for example 4°; that is, the number of points to expand follows from the angular resolution;
the data in robstru_Stru is segmented, with consecutive 1 values grouped into one segment; both ends of each segment are expanded outward by P5 values of 1; finally, the expanded segments are merged, stored into final_robstru_Stru (the final robot body structure table), and the robot body structure is determined.
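The segmentation and expansion of step S140 might be sketched as follows; robstru is a 0/1 list as built in S130, and expressing P5 directly as a point count (e.g. a 4° expansion divided by the sensor's angular resolution) is an assumption:

```python
P5 = 33  # e.g. 4 degrees at ~0.12 deg/point angular resolution (assumed)

def expand_structure(robstru, p5=P5):
    """Group consecutive 1s into segments and pad each end by p5 points."""
    n = len(robstru)
    final_robstru_Stru = [0] * n
    i = 0
    while i < n:
        if robstru[i] == 1:
            start = i
            while i < n and robstru[i] == 1:
                i += 1            # walk to the end of the run of 1s
            # Expand the segment [start, i) outward by p5 on both ends,
            # clamped to the frame; overlapping expansions simply merge.
            for j in range(max(0, start - p5), min(n, i + p5)):
                final_robstru_Stru[j] = 1
        else:
            i += 1
    return final_robstru_Stru
```

For a scanner with full 360° coverage the expansion should arguably wrap around index 0; the text does not say, so this sketch clamps at the frame ends.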
It should be noted that the ranges and/or values of the thresholds (parameters) set in this embodiment may be chosen according to the actual application scenario and/or the type of robot.
In the laser-sensor-based robot structure detection method of this embodiment, the laser data is used to detect whether the robot has moved substantially; if so, the currently received laser frame is judged to be an available laser frame. Once a threshold number of available laser frames has been received, their variance is calculated to screen out robot body structure data, which is then segmented to determine the final robot structure. Invalid returns from the robot's own structure can thus be filtered out, ensuring effective environmental perception by the laser sensor and improving the robot's detection accuracy.
It is noted that while for simplicity of explanation, the foregoing method embodiments have been described as a series of acts or combination of acts, it will be appreciated by those skilled in the art that the present disclosure is not limited by the order of acts, as some steps may, in accordance with the present disclosure, occur in other orders and concurrently. Further, those skilled in the art should also appreciate that the embodiments described in the specification are exemplary embodiments and that acts and modules referred to are not necessarily required by the disclosure.
The above is a description of embodiments of the method, and the embodiments of the apparatus are further described below.
Fig. 2 illustrates a block diagram of a laser sensor based robotic structure detecting device 200 according to an embodiment of the present disclosure. As shown in fig. 2, the apparatus 200 includes:
a receiving module 210, configured to receive a laser data frame;
a comparison module 220, configured to compare a currently received laser data frame with an available laser data frame stored last time, and if a changed value is greater than a preset movement threshold, take the currently received laser data frame as an available laser frame;
the screening module 230 is configured to acquire a preset number of available laser frames, and screen out robot body structure data according to a variance of the preset number of available laser frames;
and the determining module 240 is configured to perform segmentation processing on the robot ontology structure data to determine a robot ontology structure.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process of the described module may refer to the corresponding process in the foregoing method embodiment, and is not described herein again.
Reference is now made to fig. 3, which illustrates a schematic block diagram of a computer system suitable for implementing a terminal device or server of an embodiment of the present application. The terminal device shown in fig. 3 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 3, the computer system includes a Central Processing Unit (CPU)301 that can perform various appropriate actions and processes based on a program stored in a Read Only Memory (ROM)302 or a program loaded from a storage section 308 into a Random Access Memory (RAM) 303. In the RAM 303, various programs and data necessary for system operation are also stored. The CPU 301, ROM 302, and RAM 303 are connected to each other via a bus 304. An input/output (I/O) interface 305 is also connected to bus 304.
The following components are connected to the I/O interface 305: an input section 306 including a keyboard, a mouse, and the like; an output section 307 including a display such as a cathode-ray tube (CRT) or liquid-crystal display (LCD) and a speaker; a storage section 308 including a hard disk and the like; and a communication section 309 including a network interface card such as a LAN card or a modem. The communication section 309 performs communication processing via a network such as the Internet. A drive 310 is also connected to the I/O interface 305 as needed. A removable medium 311, such as a magnetic disk, optical disk, magneto-optical disk, or semiconductor memory, is mounted on the drive 310 as needed, so that a computer program read from it can be installed into the storage section 308 as needed.
In particular, based on the embodiments of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 309, and/or installed from the removable medium 311. The computer program performs the above-described functions defined in the method of the present application when executed by the Central Processing Unit (CPU) 301.
It should be noted that the computer readable medium described herein can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a unit, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or hardware. The described units may also be provided in a processor, and may be described as: a processor includes a receiving module, a comparison module, a screening module, and a determining module. The names of these units do not in some cases limit the units themselves; for example, the receiving module may also be described as "a unit that receives laser data frames".
As another aspect, the present application also provides a non-volatile computer storage medium, which may be the non-volatile computer storage medium included in the apparatus in the above-described embodiments; or it may be a non-volatile computer storage medium that exists separately and is not incorporated into the terminal. The non-transitory computer storage medium stores one or more programs that, when executed by a device, cause the device to: receiving a laser data frame; comparing the currently received laser data frame with the last stored available laser data frame, and if the changed numerical value is greater than a preset moving threshold value, taking the currently received laser data frame as an available laser frame; acquiring a preset acquisition number of available laser frames, and screening out robot body structure data according to the variance of the preset acquisition number of available laser frames; and carrying out segmentation processing on the robot body structure data to determine the robot body structure.
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention herein disclosed is not limited to the particular combination of features described above, but also encompasses other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the invention. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (9)

1. A robot structure detection method based on a laser sensor is characterized by comprising the following steps:
receiving a laser data frame;
comparing the currently received laser data frame with the last stored available laser data frame, and if the number of changed values is greater than a preset moving threshold, taking the currently received laser data frame as an available laser frame;
acquiring a preset collection number of available laser frames, and screening out robot body structure data according to the variance of the available laser frames of the preset collection number;
and carrying out segmentation processing on the robot body structure data to determine the robot body structure.
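The comparison step above can be sketched as follows. This is a minimal illustration only: the function names, the 0.05 m per-direction distance threshold, and the moving threshold of 10 changed readings are assumptions, not values taken from the patent.

```python
# Illustrative sketch of the frame-comparison step (all names and
# thresholds are assumptions).  A laser frame is modeled as a list of
# range readings, one per measurement direction.

def count_changed(frame, reference, distance_threshold=0.05):
    """Count the directions whose reading differs from the reference
    frame by more than the per-direction distance threshold."""
    return sum(1 for cur, ref in zip(frame, reference)
               if abs(cur - ref) > distance_threshold)

def is_available_frame(frame, reference, move_threshold=10,
                       distance_threshold=0.05):
    """A frame counts as 'available' when the number of changed
    readings exceeds the moving threshold, i.e. the robot has actually
    moved between the two captures."""
    return count_changed(frame, reference, distance_threshold) > move_threshold
```

A frame captured after the robot has moved changes in many directions at once, which is what the moving threshold is meant to detect.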
2. The method of claim 1, wherein comparing the currently received laser data frame with the last stored available laser data frame, and if the number of changed values is greater than the preset moving threshold, taking the currently received laser data frame as an available laser frame comprises:
storing the first acquired frame of laser data into both an available laser frame container and a reference laser data structure;
continuously acquiring laser data frames, comparing the currently acquired laser data frame with the laser data frame stored in the reference laser data structure, and if the number of changed values is greater than the moving threshold, taking the currently acquired laser data frame as an available laser frame;
storing the currently acquired laser data frame, as an available laser frame, into the available laser frame container, and replacing the laser frame data in the reference laser data structure with it.
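The container-and-reference bookkeeping of claim 2 could look like the following sketch (function and variable names, and the threshold values, are hypothetical; the patent does not prescribe this code):

```python
# Hypothetical sketch of claim 2: the first frame seeds both the
# available-frame container and the reference structure; later frames
# are kept only when they changed enough versus the current reference,
# and each kept frame replaces the reference.

def collect_available_frames(frame_stream, needed,
                             move_threshold=10, distance_threshold=0.05):
    container, reference = [], None
    for frame in frame_stream:
        if reference is None:
            accept = True                      # first frame: store directly
        else:
            changed = sum(1 for c, r in zip(frame, reference)
                          if abs(c - r) > distance_threshold)
            accept = changed > move_threshold  # robot actually moved
        if accept:
            container.append(frame)
            reference = frame                  # replace the reference frame
        if len(container) == needed:           # preset collection number reached
            break
    return container
```

Replacing the reference with each accepted frame means every stored frame differs from its predecessor, so static frames captured while the robot stands still are discarded.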
3. The method of claim 2, wherein if the number of changed values is less than the moving threshold, the robot is moved and a laser data frame is acquired again.
4. The method of claim 3, wherein comparing the currently acquired laser data frame with the laser data frame stored in the reference laser data structure comprises:
comparing, direction by direction, the currently acquired laser data frame with the laser data frame stored in the reference laser data structure;
and if the distance difference in the same direction exceeds a dynamic distance threshold, determining that the currently acquired laser data frame is a changed frame, and acquiring the number of changed values of the changed frame.
5. The method of claim 4, wherein acquiring the preset collection number of available laser frames, and screening out the robot body structure data according to the variance of the available laser frames of the preset collection number comprises:
extracting available laser frames with the same measurement direction and preset collection number from the available laser frame container;
calculating the variance of the available laser frames with the same measurement direction of the preset collection number;
screening the variance values, wherein a variance value smaller than a variance threshold corresponds to robot body structure data;
comparing the robot body structure data with a robot body structure limit threshold;
if the robot body structure data is smaller than the robot body structure limit threshold, setting the value corresponding to the variance of the robot body structure data to 1, and storing it into a robot body structure data structure;
and if the robot body structure data is larger than or equal to the robot body structure limit threshold, setting the value corresponding to the variance of the robot body structure data to 0, and storing it into the robot body structure data structure.
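The variance screening of claim 5 can be illustrated as below. Readings that come from the robot's own body move with the sensor, so their range is nearly constant across the available frames (low variance) and shorter than the body-size limit; the 1/0 marks mirror the claim. The threshold values, and the use of the mean range as the per-direction distance, are assumptions.

```python
def mark_body_directions(frames, variance_threshold=0.01, body_limit=0.5):
    """For each measurement direction, compute the variance of the range
    across all collected frames.  A direction is marked 1 (robot body)
    when its variance is below the variance threshold AND its mean range
    is below the body structure limit threshold; otherwise 0."""
    n = len(frames)
    marks = []
    for d in range(len(frames[0])):
        ranges = [frame[d] for frame in frames]
        mean = sum(ranges) / n
        variance = sum((r - mean) ** 2 for r in ranges) / n
        body = variance < variance_threshold and mean < body_limit
        marks.append(1 if body else 0)
    return marks
```

The second condition is what separates the body from, say, a distant wall that happened to stay at a constant range while the robot turned in place.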
6. The method of claim 5, wherein segmenting the robot body structure data and determining the robot body structure comprises:
segmenting the data in the preset robot body structure data structure, dividing each run of consecutive 1 values into one segment;
expanding both ends of each segment outward by a preset expansion threshold number of values;
and merging the expanded data segments to determine the robot body structure.
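The segment-expand-merge step of claim 6 can be sketched as follows (the run detection, the clamping at the array ends, and the touching-segments merge rule are illustrative assumptions):

```python
def body_segments(marks, expand=1):
    """Split the 0/1 marks into runs of consecutive 1s, widen each run
    by `expand` positions on both ends (the preset expansion threshold),
    then merge runs that now overlap or touch."""
    runs, start = [], None
    for i, m in enumerate(marks + [0]):        # sentinel 0 closes a trailing run
        if m == 1 and start is None:
            start = i
        elif m == 0 and start is not None:
            runs.append((start, i - 1))
            start = None
    # expand outward, clamped to valid indices
    expanded = [(max(0, s - expand), min(len(marks) - 1, e + expand))
                for s, e in runs]
    merged = []                                # merge overlapping/adjacent runs
    for s, e in expanded:
        if merged and s <= merged[-1][1] + 1:
            merged[-1] = (merged[-1][0], max(merged[-1][1], e))
        else:
            merged.append((s, e))
    return merged
```

Expansion fills small gaps caused by noisy single readings, so one physical part of the robot body ends up as one contiguous segment rather than several fragments.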
7. A robot structure detection device based on a laser sensor is characterized by comprising:
the receiving module is used for receiving the laser data frame;
the comparison module is used for comparing the currently received laser data frame with the last stored available laser data frame, and if the number of changed values is greater than a preset moving threshold, taking the currently received laser data frame as an available laser frame;
the screening module is used for acquiring a preset collection number of available laser frames and screening out robot body structure data according to the variance of the available laser frames;
and the determining module is used for carrying out segmentation processing on the robot body structure data to determine the robot body structure.
8. An electronic device comprising a memory and a processor, the memory having stored thereon a computer program, wherein the processor, when executing the program, implements the method of any of claims 1-6.
9. A computer-readable storage medium, on which a computer program is stored, which program, when being executed by a processor, carries out the method of any one of claims 1 to 6.
CN202011080558.7A 2020-10-10 2020-10-10 Robot structure detection method based on laser sensor Active CN112162294B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011080558.7A CN112162294B (en) 2020-10-10 2020-10-10 Robot structure detection method based on laser sensor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011080558.7A CN112162294B (en) 2020-10-10 2020-10-10 Robot structure detection method based on laser sensor

Publications (2)

Publication Number Publication Date
CN112162294A true CN112162294A (en) 2021-01-01
CN112162294B CN112162294B (en) 2023-12-15

Family

ID=73868018

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011080558.7A Active CN112162294B (en) 2020-10-10 2020-10-10 Robot structure detection method based on laser sensor

Country Status (1)

Country Link
CN (1) CN112162294B (en)


Citations (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6308114B1 (en) * 1999-04-20 2001-10-23 In-Kwang Kim Robot apparatus for detecting direction of sound source to move to sound source and method for operating the same
US20060271238A1 (en) * 2005-05-24 2006-11-30 Samsung Electronics Co., Ltd. Network-based robot control system and robot velocity control method in the network-based robot control system
US20070019181A1 (en) * 2003-04-17 2007-01-25 Sinclair Kenneth H Object detection system
US20070267570A1 (en) * 2006-05-17 2007-11-22 Samsung Electronics Co., Ltd. Method of detecting object using structured light and robot using the same
CN102449568A (en) * 2009-06-01 2012-05-09 Hitachi, Ltd. Robot control system, robot control terminal, robot control method and program
US20130138247A1 (en) * 2005-03-25 2013-05-30 Jens-Steffen Gutmann Re-localization of a robot for slam
KR20130099667A (en) * 2012-02-29 2013-09-06 Pusan National University Industry-University Cooperation Foundation Device and method for estimating the location of a mobile robot using a laser scanner and structure
US9002511B1 (en) * 2005-10-21 2015-04-07 Irobot Corporation Methods and systems for obstacle detection using structured light
US20150166060A1 (en) * 2013-12-18 2015-06-18 Irobot Corporation Autonomous Mobile Robot
CN106767827A (en) * 2016-12-29 2017-05-31 Zhejiang University Mobile robot point cloud map creation method based on laser data
US9870624B1 (en) * 2017-01-13 2018-01-16 Otsaw Digital Pte. Ltd. Three-dimensional mapping of an environment
CN107817509A (en) * 2017-09-07 2018-03-20 Shanghai University of Electric Power Inspection robot navigation system and method based on RTK BeiDou and laser radar
CN107907124A (en) * 2017-09-30 2018-04-13 Hangzhou Jiazhi Technology Co., Ltd. Scene re-recognition based localization method, electronic device, storage medium, and system
CN108427097A (en) * 2018-01-26 2018-08-21 Shenzhen Banzhai Robot Co., Ltd. Linear array sensor device laser positioning method, apparatus, terminal, and computer-readable storage medium
US20180321360A1 (en) * 2017-05-08 2018-11-08 Velodyne Lidar, Inc. LIDAR Data Acquisition And Control
US20180326582A1 (en) * 2017-05-11 2018-11-15 Kabushiki Kaisha Yaskawa Denki Robot, control method of robot, and machining method of workpiece
CN109459039A (en) * 2019-01-08 2019-03-12 Hunan University Laser positioning and navigation system and method for a medicine delivery robot
CN109459777A (en) * 2018-11-21 2019-03-12 Beijing Muyebang Technology Co., Ltd. Robot, robot localization method, and storage medium
CN109614459A (en) * 2019-03-06 2019-04-12 Shanghai Slamtec Co., Ltd. Loop closure detection method and device for two-dimensional laser map construction
CN209086755U (en) * 2018-12-11 2019-07-09 Shanghai Zhizhen Intelligent Network Technology Co., Ltd. Robot and control system therefor
WO2019140745A1 (en) * 2018-01-16 2019-07-25 Guangdong Institute of Intelligent Manufacturing Robot positioning method and device
CN110071582A (en) * 2019-05-14 2019-07-30 Beijing Guowang Fuda Science & Technology Development Co., Ltd. Charging mechanism and method for a transmission line inspection robot
CN110196044A (en) * 2019-05-28 2019-09-03 Guangdong Yijiahe Technology Co., Ltd. Mapping method for an intelligent inspection robot based on GPS loop closure detection
CN110340935A (en) * 2018-04-03 2019-10-18 Shenzhen Shenzhou Yunhai Intelligent Technology Co., Ltd. Robot fusion positioning method and robot
US20190390952A1 (en) * 2018-06-26 2019-12-26 Seiko Epson Corporation Three-Dimensional Measuring Device, Controller, And Robot System
CN110839206A (en) * 2019-11-26 2020-02-25 Beijing Boocax Technology Co., Ltd. Positioning method and device based on dual-mode tag
CN111220967A (en) * 2020-01-02 2020-06-02 Xiaogou Electric Internet Technology (Beijing) Co., Ltd. Method and device for detecting data validity of laser radar
US20200256999A1 (en) * 2019-02-07 2020-08-13 Atulya Yellepeddi Lidar techniques for autonomous vehicles
CN111555817A (en) * 2020-05-09 2020-08-18 State Grid Jiangsu Electric Power Co., Ltd. Wuxi Power Supply Branch Differential modulation secure optical communication method and device based on a coherent optical system
CN111736524A (en) * 2020-07-17 2020-10-02 Beijing Boocax Technology Co., Ltd. Multi-robot scheduling method, device and equipment based on time and space

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LI Yang; SU Hai; YU Songsen; CAO Yibo; LIU Haoxin: "Laser scanning data filtering algorithm based on dynamic threshold segmentation", Laser & Infrared, no. 08 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112986912A (en) * 2021-03-19 2021-06-18 Beijing Puppy Vacuum Cleaner Group Co., Ltd. Floor sweeper repositioning method and device based on structured light sensor, and electronic equipment
CN112986912B (en) * 2021-03-19 2023-06-16 Beijing Puppy Vacuum Cleaner Group Co., Ltd. Floor sweeper repositioning method and device based on structured light sensor, and electronic equipment
CN113006634A (en) * 2021-03-30 2021-06-22 Beijing Boocax Technology Co., Ltd. Automatic door opening and closing method of a robot based on door identification

Also Published As

Publication number Publication date
CN112162294B (en) 2023-12-15

Similar Documents

Publication Publication Date Title
US20210190513A1 (en) Navigation map updating method and apparatus and robot using the same
CN109212530B (en) Method and apparatus for determining velocity of obstacle
US11506769B2 (en) Method and device for detecting precision of internal parameter of laser radar
CN109212532B (en) Method and apparatus for detecting obstacles
CN110019609B (en) Map updating method, apparatus and computer readable storage medium
US6658150B2 (en) Image recognition system
CN107123142B (en) Pose estimation method and device
CN111563450B (en) Data processing method, device, equipment and storage medium
CN110276293B (en) Lane line detection method, lane line detection device, electronic device, and storage medium
CN112162294A (en) Robot structure detection method based on laser sensor
KR102391205B1 (en) Apparatus for estimating distance based object detection
CN110263714B (en) Lane line detection method, lane line detection device, electronic device, and storage medium
CN110341621B (en) Obstacle detection method and device
CN111750804A (en) Object measuring method and device
CN108319931B (en) Image processing method and device and terminal
CN110853085A (en) Semantic SLAM-based mapping method and device and electronic equipment
CN112193241A (en) Automatic parking method
CN111402326A (en) Obstacle detection method and device, unmanned vehicle and storage medium
CN110426714B (en) Obstacle identification method
CN114528941A (en) Sensor data fusion method and device, electronic equipment and storage medium
CN116152208A (en) Defect detection method, device, equipment and storage medium
CN115661230A (en) Estimation method for warehouse material volume
CN112344966B (en) Positioning failure detection method and device, storage medium and electronic equipment
CN117075135B (en) Vehicle feature detection method, system, storage medium and electronic equipment
CN114407887B (en) Curve recognition method, apparatus, vehicle and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant