CN112162294B - Robot structure detection method based on laser sensor - Google Patents

Robot structure detection method based on laser sensor

Info

Publication number
CN112162294B
CN112162294B CN202011080558.7A
Authority
CN
China
Prior art keywords
laser
data
frame
robot body
body structure
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011080558.7A
Other languages
Chinese (zh)
Other versions
CN112162294A (en)
Inventor
谢传泉
浦剑涛
张东泉
张志尚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Boocax Technology Co ltd
Original Assignee
Shandong Bucos Robot Co ltd
Shenzhen Boocax Technology Co ltd
Beijing Boocax Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong Bucos Robot Co ltd, Shenzhen Boocax Technology Co ltd, and Beijing Boocax Technology Co ltd
Priority to CN202011080558.7A
Publication of CN112162294A
Application granted
Publication of CN112162294B
Legal status: Active
Anticipated expiration

Links

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/08 Systems determining position data of a target for measuring distance only
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Manipulator (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

Embodiments of the present disclosure provide a laser sensor-based robot structure detection method, apparatus, device, and computer-readable storage medium. The method includes: receiving a laser data frame; comparing the currently received laser data frame with the most recently stored available laser frame, and taking the current frame as an available laser frame if the number of changed values exceeds a preset movement threshold; acquiring a preset collection number of available laser frames and screening out robot body structure data according to the variance of those frames; and segmenting the robot body structure data to determine the robot body structure. In this way, invalid data returned by the robot's own structure can be filtered out, ensuring that the laser sensor perceives the environment effectively.

Description

Robot structure detection method based on laser sensor
Technical Field
Embodiments of the present disclosure relate generally to the field of robot detection and, more particularly, relate to a laser sensor-based robot structure detection method, apparatus, device, and computer-readable storage medium.
Background
Existing robot chassis rely mainly on laser sensors to perceive the environment, for example for localization and mapping, so the laser installation area is kept as clear as possible to preserve the laser detection range.
However, the upper layer of the chassis needs supporting structures, and special markers may be added so that robots can recognize one another. These elements partially occlude the laser, and the data returned from the occluded regions degrade the robot's mapping and localization.
Disclosure of Invention
According to an embodiment of the present disclosure, a robot structure detection scheme based on a laser sensor is provided.
In a first aspect of the present disclosure, a laser sensor-based robot structure detection method is provided. The method comprises the following steps: receiving a laser data frame;
comparing the currently received laser data frame with the most recently stored available laser frame, and taking the current frame as an available laser frame if the number of changed values exceeds a preset movement threshold;
acquiring a preset collection number of available laser frames, and screening out robot body structure data according to the variance of those frames;
and segmenting the robot body structure data to determine the robot body structure.
Optionally, comparing the currently received laser data frame with the most recently stored available laser frame, and taking the current frame as an available laser frame if the number of changed values exceeds the preset movement threshold, includes:
storing the first acquired frame of laser data into an available laser frame container and into a reference laser data structure, respectively;
continuously acquiring laser data frames, comparing each currently acquired frame with the frame stored in the reference laser data structure, and taking the current frame as an available laser frame if its number of changed values exceeds the movement threshold;
storing the current frame into the available laser frame container as an available laser frame, and replacing the frame in the reference laser data structure with the current frame.
Optionally, if the number of changed values does not exceed the movement threshold, the robot moves and re-acquires a laser data frame.
Optionally, comparing the currently acquired laser data frame with the frame stored in the reference laser data structure includes:
comparing the two frames direction by direction;
if the distance difference in the same direction exceeds the dynamic distance threshold, judging the currently acquired frame to be a change frame, and obtaining the number of changed values of the change frame.
Optionally, acquiring the preset collection number of available laser frames and screening out robot body structure data according to their variance includes:
extracting, from the available laser frame container, the preset collection number of available laser frames for each measuring direction;
computing the variance of the available laser frames in the same measuring direction;
screening the variance values: if a variance value is smaller than a variance threshold, it corresponds to robot body structure data;
comparing the robot body structure data with a robot body structure limit threshold;
if the robot body structure data is smaller than the limit threshold, setting the value corresponding to that variance to 1 and storing it into a robot body structure data structure;
and if the robot body structure data is greater than or equal to the limit threshold, setting the value corresponding to that variance to 0 and storing it into the robot body structure data structure.
Optionally, segmenting the robot body structure data to determine the robot body structure includes:
segmenting the data in the robot body structure data structure, grouping consecutive 1 values into one segment;
expanding both ends of each segment outward by a preset expansion threshold number of 1 values;
and merging the expanded data segments to determine the robot body structure.
Any of the aspects and possible implementations described above may be further combined to provide additional implementations.
In a second aspect of the present disclosure, a robot structure detection device based on a laser sensor is provided. The device comprises:
the receiving module is used for receiving the laser data frame;
the comparison module is used for comparing the currently received laser data frame with the available laser data frame stored last time, and if the changed numerical value is greater than the preset movement threshold value, the currently received laser data frame is used as the available laser frame;
the screening module is used for acquiring the available laser frames with the preset acquisition number, and screening out the robot body structure data according to the variance of the available laser frames with the preset acquisition number;
and the determining module is used for carrying out segmentation processing on the robot body structure data to determine the robot body structure.
In a third aspect of the present disclosure, an electronic device is provided. The electronic device includes: a memory and a processor, the memory having stored thereon a computer program, the processor implementing the method as described above when executing the program.
In a fourth aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which when executed by a processor implements a method as according to the first aspect of the present disclosure.
The laser sensor-based robot structure detection method provided by the embodiments of the present application receives laser data frames; compares the currently received frame with the most recently stored available laser frame, taking it as an available laser frame if the number of changed values exceeds the preset movement threshold; acquires a preset collection number of available laser frames and screens out robot body structure data according to their variance; and segments the robot body structure data to determine the robot body structure. Invalid data returned by the robot's own structure can thus be filtered out, the laser sensor's effective perception of the environment is preserved, and the robot's detection accuracy is improved.
It should be understood that what is described in this summary is not intended to limit the critical or essential features of the embodiments of the disclosure nor to limit the scope of the disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The above and other features, advantages and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. In the drawings, wherein like or similar reference numerals denote like or similar elements, in which:
FIG. 1 illustrates a flow chart of a laser sensor-based robotic structure detection method according to an embodiment of the disclosure;
fig. 2 shows a block diagram of a laser sensor-based robotic structure detecting device according to an embodiment of the disclosure;
fig. 3 illustrates a block diagram of an exemplary electronic device capable of implementing embodiments of the present disclosure.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present disclosure more apparent, the technical solutions of the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present disclosure, and it is apparent that the described embodiments are some embodiments of the present disclosure, but not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments in this disclosure without inventive faculty, are intended to be within the scope of this disclosure.
In addition, the term "and/or" herein is merely an association relationship describing an association object, and means that three relationships may exist, for example, a and/or B may mean: a exists alone, A and B exist together, and B exists alone. In addition, the character "/" herein generally indicates that the front and rear associated objects are an "or" relationship.
Fig. 1 shows a flow chart of laser sensor based robot structure detection according to an embodiment of the present disclosure.
S110, receiving a laser data frame.
The robot is provided with a laser sensor.
Optionally, the laser sensor includes a rotatable emitting device that emits a laser beam after rotating through a certain angle; the emitted beam is reflected back when it strikes an object, and the reflected light is received by an optical receiving system, so that the distance from the laser scanner to the object can be measured.
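As a hedged illustration (not part of the patent), the basic time-of-flight principle behind such a laser rangefinder can be sketched as follows; the function name and the 100 ns example are invented for illustration:

```python
# Hypothetical sketch: a time-of-flight laser rangefinder estimates
# distance from the round-trip travel time of the reflected beam.

C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to the object: the beam travels out and back, so halve."""
    return C * round_trip_seconds / 2.0

# A 100 ns round trip corresponds to roughly 15 m.
d = tof_distance(100e-9)
```

Real scanning lidars repeat this measurement once per angular step of the rotating emitter, producing one range value per direction in each frame.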
A laser data frame is received when the robot starts to perform a task (start-up) or performs a rotation or displacement operation (movement).
Optionally, the laser data frames include the first frame of laser data received after the robot starts up and the laser data received after the robot moves.
S120, comparing the currently received laser data frame with the most recently stored available laser frame, and taking the current frame as an available laser frame if the number of changed values exceeds the preset movement threshold.
Optionally, an available laser frame container valid_vecPnt is set up to store laser frame data captured when the robot's pose changes;
a reference laser data structure refer_Stru is set up for comparison between laser frames;
a dynamic distance threshold P0 is set to decide whether a current laser reading counts as a changed value, avoiding spurious position changes caused by laser data jitter; P0 generally ranges from 10-30 cm, e.g. 15 cm;
a movement threshold P1 is set to judge whether the current laser data indicates a large movement; P1 is generally 1/3 to 1/2 of the laser sensor's point count per frame, e.g. with 3000 points per frame, P1 may be 1000 or 1500.
optionally, storing the received first frame laser data into the valid_vecpnt and refer_stra respectively;
continuously receiving a laser data frame, comparing the currently received laser data frame with the laser data frame stored in the refer_stru, and if the distance difference in the same direction exceeds P0, judging the currently acquired laser data frame as a change frame, and acquiring the number of changed values of the change frame; that is, the spot position of the currently acquired laser data frame (in this embodiment, a frame of laser data has about 3000 spots) and the position of the spot in the laser data frame stored in the refer_stru are compared one by one, if the offset difference (misalignment) of the spot positions in the same direction exceeds P0, it is determined that the currently acquired laser data frame is a change frame (displacement change occurs), and the number of changed values, that is, the number of misaligned spots is obtained;
and if the number of the changed values (the number of the misaligned light spots) is larger than the P1, taking the currently acquired laser data frame as an available laser frame, storing the currently acquired laser data frame as the available laser frame into the valid_vecPnt, and replacing the currently acquired laser data frame as the available laser frame with the laser frame data in the refer_Stru.
And if the number of the changed values is smaller than or equal to P1, the mobile robot reacquires the laser data frame, and the valid_vecPnt and the refer_Stru do not change any value.
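The S120 change-frame logic above can be sketched as follows. This is an assumed reconstruction, not the patented implementation; the names valid_vecPnt and refer_Stru follow the description, count_changed and process_frame are hypothetical helpers, and a frame is modeled as a list of per-direction range readings:

```python
# Hedged sketch of S120: keep a frame only if enough of its per-direction
# ranges differ from the reference frame by more than P0 (i.e. the robot
# has moved substantially).

P0 = 0.15   # dynamic distance threshold in metres (10-30 cm per the text)
P1 = 1000   # movement threshold, ~1/3 of a 3000-point frame

def count_changed(frame, refer, p0=P0):
    """Count directions whose range differs from the reference by more than p0."""
    return sum(1 for a, b in zip(frame, refer) if abs(a - b) > p0)

def process_frame(frame, valid_vecPnt, refer_Stru, p0=P0, p1=P1):
    """Store frame as an available laser frame iff it shows a large movement.

    Returns the (possibly updated) reference frame."""
    if refer_Stru is None:                      # first frame seeds both stores
        valid_vecPnt.append(frame)
        return frame
    if count_changed(frame, refer_Stru, p0) > p1:
        valid_vecPnt.append(frame)              # keep as an available frame
        return frame                            # and make it the new reference
    return refer_Stru                           # otherwise discard; no change
```

The thresholds are passed as defaulted parameters so the sketch can be exercised on small synthetic frames.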
S130, acquiring a preset collection number of available laser frames, and screening out robot body structure data according to their variance.
The collected available laser frames are then used to judge the chassis structure.
A threshold P2 on the number of available laser frames is set to define how many frames to collect. P2 is generally a positive integer of at least 50, for example 50; i.e. 50 frames usually suffice for structure detection.
A structure robStru_Stru storing the robot body structure data is set up to characterize the robot body structure.
A structure var_Stru storing laser data variance is set up to hold the variance computed from the available laser frames.
A variance threshold P3 is set to decide whether a value indicates robot body structure. P3 generally ranges from 5-20 cm, for example 10 cm.
A robot body structure limit threshold P4 is set to filter out data beyond it. Since the robot body radius is generally not large, this value constrains the detected structure; P4 generally ranges from 1-3 m.
Optionally, step S120 is repeated until the number of acquired available laser frames reaches P2. That is, once the number of available frames stored in valid_vecPnt reaches P2, acquisition stops; at that point most frames in the container are key frames with large changes, meeting the data requirement for laser structure judgment.
Optionally, for each measuring direction, all available laser frames are extracted from valid_vecPnt;
the variance across all available frames in the same measuring direction is computed, and its value is stored into var_Stru;
the variance data is read from var_Stru and screened: if a variance value is smaller than P3, it corresponds to robot body structure data;
the robot body structure data is compared with P4;
if the robot body structure data is smaller than P4, the value corresponding to that variance is set to 1 and stored into robStru_Stru;
and if the robot body structure data is greater than or equal to P4, the value corresponding to that variance is set to 0 and stored into robStru_Stru.
Optionally, in robStru_Stru, 1 represents the robot body structure and 0 represents no structural occlusion in that direction.
S140, carrying out segmentation processing on the robot body structure data to determine a robot body structure.
When the laser spot strikes the edge of an object, the returned data can be unstable and the variance too large, causing inaccurate detection.
Therefore, in this embodiment, the robot body structure data obtained in step S130 is segmented, and each segment is extended by a certain area at both ends to address this problem.
Specifically, a structure final_robStru_Stru storing the final robot body structure data is set up to represent the robot body structure;
an expansion threshold P5 is set for padding the edges of the robot body structure. P5 is determined by the angular resolution of the robot's laser sensor, typically extending by 3-5°, for example 4°; that is, the number of points to extend is obtained from the angular resolution (P5).
The data in robStru_Stru is segmented, with consecutive 1 values grouped into one segment. Both ends of each segment are then extended outward by P5 values of 1. Finally, the extended segments are merged and saved into final_robStru_Stru (a table of the final robot body structure), determining the robot body structure.
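The segmentation-and-expansion step above can be sketched as follows (a minimal reconstruction; segment_and_expand is a hypothetical helper, and wrap-around at the 0°/360° boundary is ignored for simplicity):

```python
# Hedged sketch of S140: group consecutive 1s in the 0/1 structure mask
# into segments, expand each segment by P5 points at both ends to cover
# unstable edge returns, then merge back into the final 0/1 mask.

P5 = 2  # expansion threshold in points (the text suggests 3-5 degrees of arc)

def segment_and_expand(rob_stru, p5=P5):
    """Build final_robStru_Stru from robStru_Stru by widening each 1-segment."""
    n = len(rob_stru)
    final = [0] * n
    i = 0
    while i < n:
        if rob_stru[i] == 1:
            j = i
            while j + 1 < n and rob_stru[j + 1] == 1:
                j += 1                      # [i, j] is one segment of 1s
            lo = max(0, i - p5)             # expand both ends outward
            hi = min(n - 1, j + p5)
            for k in range(lo, hi + 1):
                final[k] = 1                # overlaps merge via the shared mask
            i = j + 1
        else:
            i += 1
    return final
```

Writing the expanded segments into one shared mask merges any segments whose expansions overlap, which matches the "merge the extended data segments" step.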
It should be noted that, the range and/or the number of the threshold values (parameters) set in the present embodiment may be set according to the actual application scenario and/or the type of the robot.
According to the laser sensor-based robot structure detection method provided by this embodiment, the laser data is used to detect whether the robot has moved substantially; if so, the currently received frame is judged to be an available laser frame. Once a threshold number of available laser frames has been received, their variances are computed, robot body structure data is screened out, and the data is segmented to determine the final robot structure. Invalid data returned by the robot's own structure can thus be filtered out, the laser sensor's effective perception of the environment is preserved, and the robot's detection accuracy is improved.
It should be noted that, for simplicity of description, the foregoing method embodiments are all described as a series of acts, but it should be understood by those skilled in the art that the present disclosure is not limited by the order of acts described, as some steps may be performed in other orders or concurrently in accordance with the present disclosure. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all alternative embodiments, and that the acts and modules referred to are not necessarily required by the present disclosure.
The foregoing is a description of embodiments of the method, and the following further describes embodiments of the present disclosure through examples of apparatus.
Fig. 2 shows a block diagram of a laser sensor-based robotic structure detecting device 200 according to an embodiment of the disclosure. As shown in fig. 2, the apparatus 200 includes:
a receiving module 210, configured to receive a laser data frame;
the comparison module 220 is configured to compare a currently received laser data frame with a last stored available laser data frame, and if a changed value is greater than a preset movement threshold, take the currently received laser data frame as an available laser frame;
the screening module 230 is configured to obtain a preset collection number of available laser frames, and screen out robot body structure data according to a variance of the preset collection number of available laser frames;
the determining module 240 is configured to perform segment processing on the robot body structure data, and determine a robot body structure.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the described modules may refer to corresponding procedures in the foregoing method embodiments, which are not described herein again.
Referring now to FIG. 3, there is shown a schematic diagram of a computer system suitable for use in implementing a terminal device or server of an embodiment of the present application. The terminal device shown in fig. 3 is only an example, and should not impose any limitation on the functions and the scope of use of the embodiment of the present application.
As shown in fig. 3, the computer system includes a Central Processing Unit (CPU) 301 that can perform various appropriate actions and processes based on a program stored in a Read Only Memory (ROM) 302 or a program loaded from a storage section 308 into a Random Access Memory (RAM) 303. In the RAM 303, various programs and data required for the system operation are also stored. The CPU 301, ROM 302, and RAM 303 are connected to each other through a bus 304. An input/output (I/O) interface 305 is also connected to bus 304.
The following components are connected to the I/O interface 305: an input section 306 including a keyboard, a mouse, and the like; an output portion 307 including a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, a speaker, and the like; a storage section 308 including a hard disk or the like; and a communication section 309 including a network interface card such as a LAN card, a modem, or the like. The communication section 309 performs communication processing via a network such as the internet. The drive 310 is also connected to the I/O interface 305 on an as-needed basis. Removable media 311, such as magnetic disks, optical disks, magneto-optical disks, semiconductor memories, and the like, are installed on demand on drive 310 so that a computer program read therefrom is installed into storage section 308 on demand.
In particular, the processes described above with reference to flowcharts may be implemented as computer software programs, based on embodiments of the present disclosure. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication portion 309, and/or installed from the removable medium 311. The above-described functions defined in the method of the present application are performed when the computer program is executed by a Central Processing Unit (CPU) 301.
The computer readable medium according to the present application may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present application, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a unit, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present application may be implemented in software or in hardware. The described units may also be provided in a processor, for example, described as: a processor includes an information measurement unit, a travel locus determination unit, a map determination unit, and a driving strategy generation unit. The names of these units do not constitute limitations on the unit itself in some cases, and for example, the information measurement unit may also be described as "a unit that measures state information of the host vehicle and surrounding scene information".
As another aspect, the present application also provides a nonvolatile computer storage medium, which may be a nonvolatile computer storage medium included in the apparatus described in the above embodiment; or may be a non-volatile computer storage medium, alone, that is not incorporated into the terminal. The above-described nonvolatile computer storage medium stores one or more programs that, when executed by an apparatus, cause the apparatus to: receiving a laser data frame; comparing the currently received laser data frame with the available laser data frame stored last time, and taking the currently received laser data frame as the available laser frame if the changed numerical value is greater than the preset movement threshold value; acquiring available laser frames of a preset acquisition number, and screening out robot body structure data according to the variance of the available laser frames of the preset acquisition number; and carrying out segmentation processing on the robot body structure data to determine the robot body structure.
The above description is only illustrative of the preferred embodiments of the present application and of the principles of the technology employed. It will be appreciated by persons skilled in the art that the scope of the application is not limited to technical solutions formed by the specific combinations of the technical features described above, but also covers other technical solutions formed by any combination of the above technical features or their equivalents without departing from the inventive concept, for example, technical solutions formed by replacing the above features with technical features having similar functions disclosed in (but not limited to) the present application.

Claims (5)

1. A robot structure detection method based on a laser sensor, comprising:
receiving a laser data frame;
comparing the currently received laser data frame with the last stored available laser data frame, and if the number of changed values is greater than a preset movement threshold, taking the currently received laser data frame as an available laser frame, wherein the method comprises the following steps:
storing the acquired first frame of laser data into an available laser frame container and into a structure body for referencing the laser data, respectively;
continuously acquiring laser data frames, and comparing the currently acquired laser data frame with the laser data frame stored in the structure body of the reference laser data; if the distance difference in the same direction exceeds a dynamic distance threshold, determining the currently acquired laser data frame to be a change frame, and acquiring the number of changed values of the change frame; if the number of changed values is greater than a movement threshold, taking the currently acquired laser data frame as an available laser frame;
storing the currently acquired laser data frame as an available laser frame into the available laser frame container, and replacing the laser frame data in the structure body of the reference laser data with the currently acquired laser data frame;
if the number of changed values is less than the movement threshold, acquiring a laser data frame again by the mobile robot;
acquiring available laser frames of a preset acquisition number, and screening out robot body structure data according to the variance of the available laser frames of the preset acquisition number, wherein the method comprises the following steps:
extracting, from the available laser frame container, available laser frames of the preset acquisition number having the same measurement direction;
computing the variance of the available laser frames of the preset acquisition number having the same measurement direction;
screening the variance values, wherein if a variance value is less than a variance threshold, the variance value corresponds to robot body structure data;
comparing the robot body structure data with a robot body structure limit threshold;
if the robot body structure data is less than the robot body structure limit threshold, setting the value corresponding to the variance of the robot body structure data to 1, and storing the value into a structure body of the robot body structure data;
if the robot body structure data is greater than or equal to the robot body structure limit threshold, setting the value corresponding to the variance of the robot body structure data to 0, and storing the value into the structure body of the robot body structure data;
and carrying out segmentation processing on the robot body structure data to determine the robot body structure.
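As a rough illustration of the screening step in claim 1, the per-direction variance and limit-threshold test might be sketched as follows. The function name is hypothetical, and using the per-direction mean as the representative range compared against the limit threshold is an assumption the claim does not fix:

```python
import numpy as np

def screen_body_structure(frames, var_threshold, limit_threshold):
    """frames: (num_frames, num_beams) ranges for the same measurement directions.

    Directions whose range barely varies across frames (variance below the
    variance threshold) are candidate body-structure points; candidates
    closer than the body-structure limit threshold are flagged 1, all
    others 0, mirroring the 1/0 values stored in the structure body.
    """
    frames = np.asarray(frames, dtype=float)
    variances = frames.var(axis=0)   # variance per measurement direction
    ranges = frames.mean(axis=0)     # assumed representative range per direction
    return ((variances < var_threshold) & (ranges < limit_threshold)).astype(int)

# direction 0: stable and near -> 1; direction 1: stable but far -> 0;
# direction 2: varying (a moving object) -> 0
frames = [[0.30, 4.0, 2.0],
          [0.31, 4.0, 3.5],
          [0.30, 4.0, 1.2]]
print(screen_body_structure(frames, var_threshold=0.01, limit_threshold=1.0))
# [1 0 0]
```

The stable, close-range beams are exactly those occluded by the robot's own body, which is why low variance plus a short range identifies body-structure data.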
2. The method of claim 1, wherein segmenting the robot body structure data to determine the robot body structure comprises:
carrying out segmentation processing on the data in the structure body of the robot body structure data, grouping consecutive values of 1 into segments;
expanding both ends of each segment outward by a preset expansion threshold number of values of 1; and
merging the expanded data segments to determine the robot body structure.
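The segmentation, expansion, and merging of claim 2 can be sketched in plain Python; the flag array and expansion value below are illustrative, not taken from the patent:

```python
def segment_body_structure(flags, expand):
    """Group consecutive 1s into (start, end) segments, widen each segment
    at both ends by the preset expansion threshold, and merge segments
    that then overlap or touch."""
    n = len(flags)
    segments, start = [], None
    for i, v in enumerate(flags):
        if v == 1 and start is None:
            start = i
        elif v != 1 and start is not None:
            segments.append((start, i - 1))
            start = None
    if start is not None:
        segments.append((start, n - 1))
    # expand both ends outward, clamped to the array bounds
    expanded = [(max(0, s - expand), min(n - 1, e + expand)) for s, e in segments]
    # merge overlapping or adjacent segments into the final body-structure spans
    merged = []
    for s, e in expanded:
        if merged and s <= merged[-1][1] + 1:
            merged[-1] = (merged[-1][0], max(merged[-1][1], e))
        else:
            merged.append((s, e))
    return merged

print(segment_body_structure([0, 1, 1, 0, 1, 0, 0, 0, 1, 0], expand=1))
# [(0, 5), (7, 9)]
```

The expansion step closes small gaps caused by noisy beams, so nearby fragments of the body outline merge into a single span.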
3. A robot structural inspection device based on a laser sensor, comprising:
the receiving module is used for receiving the laser data frame;
the comparison module is configured to compare the currently received laser data frame with the last stored available laser data frame, and if the number of changed values is greater than a preset movement threshold, take the currently received laser data frame as an available laser frame, wherein the comparison module is configured for:
storing the acquired first frame of laser data into an available laser frame container and into a structure body for referencing the laser data, respectively;
continuously acquiring laser data frames, and comparing the currently acquired laser data frame with the laser data frame stored in the structure body of the reference laser data; if the distance difference in the same direction exceeds a dynamic distance threshold, determining the currently acquired laser data frame to be a change frame, and acquiring the number of changed values of the change frame; if the number of changed values is greater than a movement threshold, taking the currently acquired laser data frame as an available laser frame;
storing the currently acquired laser data frame as an available laser frame into the available laser frame container, and replacing the laser frame data in the structure body of the reference laser data with the currently acquired laser data frame;
if the number of changed values is less than the movement threshold, acquiring a laser data frame again by the mobile robot;
the screening module is configured to acquire available laser frames of a preset acquisition number, and screen out robot body structure data according to the variance of the available laser frames of the preset acquisition number, wherein the screening module is configured for:
extracting, from the available laser frame container, available laser frames of the preset acquisition number having the same measurement direction;
computing the variance of the available laser frames of the preset acquisition number having the same measurement direction;
screening the variance values, wherein if a variance value is less than a variance threshold, the variance value corresponds to robot body structure data;
comparing the robot body structure data with a robot body structure limit threshold;
if the robot body structure data is less than the robot body structure limit threshold, setting the value corresponding to the variance of the robot body structure data to 1, and storing the value into a structure body of the robot body structure data;
if the robot body structure data is greater than or equal to the robot body structure limit threshold, setting the value corresponding to the variance of the robot body structure data to 0, and storing the value into the structure body of the robot body structure data;
and the determining module is used for carrying out segmentation processing on the robot body structure data to determine the robot body structure.
4. An electronic device comprising a memory and a processor, the memory having stored thereon a computer program, characterized in that the processor, when executing the program, implements the method according to any of claims 1-2.
5. A computer readable storage medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, implements the method according to any one of claims 1-2.
CN202011080558.7A 2020-10-10 2020-10-10 Robot structure detection method based on laser sensor Active CN112162294B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011080558.7A CN112162294B (en) 2020-10-10 2020-10-10 Robot structure detection method based on laser sensor

Publications (2)

Publication Number Publication Date
CN112162294A 2021-01-01
CN112162294B 2023-12-15

Family

ID=73868018

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011080558.7A Active CN112162294B (en) 2020-10-10 2020-10-10 Robot structure detection method based on laser sensor

Country Status (1)

Country Link
CN (1) CN112162294B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112986912B (en) * 2021-03-19 2023-06-16 北京小狗吸尘器集团股份有限公司 Floor sweeper repositioning method and device based on structured light sensor and electronic equipment
CN113006634B (en) * 2021-03-30 2022-11-04 北京布科思科技有限公司 Automatic door opening and closing method of robot based on door identification

Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6308114B1 (en) * 1999-04-20 2001-10-23 In-Kwang Kim Robot apparatus for detecting direction of sound source to move to sound source and method for operating the same
CN102449568A (en) * 2009-06-01 2012-05-09 株式会社日立制作所 Robot control system, robot control terminal, robot control method and program
KR20130099667A (en) * 2012-02-29 2013-09-06 부산대학교 산학협력단 Device and method for estimating location of mobile robot using raiser scanner and structure
US9002511B1 (en) * 2005-10-21 2015-04-07 Irobot Corporation Methods and systems for obstacle detection using structured light
CN106767827A (en) * 2016-12-29 2017-05-31 浙江大学 A kind of mobile robot point cloud map creating method based on laser data
US9870624B1 (en) * 2017-01-13 2018-01-16 Otsaw Digital Pte. Ltd. Three-dimensional mapping of an environment
CN107817509A (en) * 2017-09-07 2018-03-20 上海电力学院 Crusing robot navigation system and method based on the RTK Big Dippeves and laser radar
CN107907124A (en) * 2017-09-30 2018-04-13 杭州迦智科技有限公司 The localization method known again based on scene, electronic equipment, storage medium, system
CN108427097A (en) * 2018-01-26 2018-08-21 深圳班翟机器人有限公司 Linear array sensor devices laser positioning method, apparatus, terminal and computer readable storage medium
CN109459039A (en) * 2019-01-08 2019-03-12 湖南大学 A kind of the laser positioning navigation system and its method of medicine transfer robot
CN109459777A (en) * 2018-11-21 2019-03-12 北京木业邦科技有限公司 A kind of robot, robot localization method and its storage medium
CN109614459A (en) * 2019-03-06 2019-04-12 上海思岚科技有限公司 Map structuring winding detection method and equipment applied to two-dimensional laser
CN209086755U (en) * 2018-12-11 2019-07-09 上海智臻智能网络科技股份有限公司 A kind of robot and its control system
WO2019140745A1 (en) * 2018-01-16 2019-07-25 广东省智能制造研究所 Robot positioning method and device
CN110071582A (en) * 2019-05-14 2019-07-30 北京国网富达科技发展有限责任公司 Charging mechanism and its method for transmission line polling robot
CN110196044A (en) * 2019-05-28 2019-09-03 广东亿嘉和科技有限公司 It is a kind of based on GPS closed loop detection Intelligent Mobile Robot build drawing method
CN110340935A (en) * 2018-04-03 2019-10-18 深圳市神州云海智能科技有限公司 A kind of method and robot of robot fusion positioning
CN110839206A (en) * 2019-11-26 2020-02-25 北京布科思科技有限公司 Positioning method and device based on dual-mode tag
CN111220967A (en) * 2020-01-02 2020-06-02 小狗电器互联网科技(北京)股份有限公司 Method and device for detecting data validity of laser radar
CN111555817A (en) * 2020-05-09 2020-08-18 国网江苏省电力有限公司无锡供电分公司 Differential modulation safety optical communication method and device based on coherent optical system
CN111736524A (en) * 2020-07-17 2020-10-02 北京布科思科技有限公司 Multi-robot scheduling method, device and equipment based on time and space

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004095071A2 (en) * 2003-04-17 2004-11-04 Kenneth Sinclair Object detection system
KR100594165B1 (en) * 2005-05-24 2006-06-28 삼성전자주식회사 Robot controlling system based on network and method for controlling velocity of robot in the robot controlling system
KR100735565B1 (en) * 2006-05-17 2007-07-04 삼성전자주식회사 Method for detecting an object using structured light and robot using the same
EP2776216B1 (en) * 2011-11-11 2022-08-31 iRobot Corporation Robot apparautus and control method for resuming operation following a pause.
EP3082543B1 (en) * 2013-12-18 2019-01-09 iRobot Corporation Autonomous mobile robot
JP2020519881A (en) * 2017-05-08 2020-07-02 ベロダイン ライダー, インク. LIDAR data collection and control
JP6705976B2 (en) * 2017-05-11 2020-06-03 株式会社安川電機 Robot, robot control method, workpiece manufacturing method
JP7135496B2 (en) * 2018-06-26 2022-09-13 セイコーエプソン株式会社 3D measuring device, control device and robot system
US20200256999A1 (en) * 2019-02-07 2020-08-13 Atulya Yellepeddi Lidar techniques for autonomous vehicles

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Laser scanning data filtering algorithm based on dynamic threshold segmentation; Li Yang; Su Hai; Yu Songsen; Cao Yibo; Liu Haoxin; Laser & Infrared (No. 08); full text *

Also Published As

Publication number Publication date
CN112162294A (en) 2021-01-01

Similar Documents

Publication Publication Date Title
US20210190513A1 (en) Navigation map updating method and apparatus and robot using the same
US20200082183A1 (en) Method for position detection, device, and storage medium
CN109829351B (en) Method and device for detecting lane information and computer readable storage medium
CN109212530B (en) Method and apparatus for determining velocity of obstacle
US20200081105A1 (en) Method and device for detecting precision of internal parameter of laser radar
CN108983213B (en) Method, device and equipment for determining static state of obstacle and storage medium
CN112162294B (en) Robot structure detection method based on laser sensor
CN109188438A (en) Yaw angle determines method, apparatus, equipment and medium
US11532166B2 (en) Obstacle positioning method, device and terminal
CN113432533B (en) Robot positioning method and device, robot and storage medium
CN109284801B (en) Traffic indicator lamp state identification method and device, electronic equipment and storage medium
EP3901821A1 (en) Method and device for calibrating pitch of camera on vehicle and method and device for continual learning of vanishing point estimation model to be used for calibrating the pitch
CN115880252B (en) Container sling detection method, device, computer equipment and storage medium
CN114662600B (en) Lane line detection method, device and storage medium
CN110426714B (en) Obstacle identification method
CN113776520B (en) Map construction, using method, device, robot and medium
CN111259102A (en) Map updating method, electronic device and storage medium
US20240161517A1 (en) Detection method and system for a mobile object
CN115661230A (en) Estimation method for warehouse material volume
CN112149441B (en) Two-dimensional code positioning control method based on reflecting plate
JP5176523B2 (en) Moving body detection apparatus, moving body detection method, and moving body detection program
CN113050103A (en) Ground detection method, device, electronic equipment, system and medium
CN110068834B (en) Road edge detection method and device
CN110363834B (en) Point cloud data segmentation method and device
CN112197763A (en) Map construction method, map construction device, map construction equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20240731

Address after: 0079, 4th Floor, Building A, Building 24, No. 68 Beiqing Road, Haidian District, Beijing, 100094

Patentee after: BEIJING BOOCAX TECHNOLOGY Co.,Ltd.

Country or region after: China

Address before: 100192 606, Jianjin center, a 1, Yongtai garden, Qinghe, Haidian District, Beijing

Patentee before: BEIJING BOOCAX TECHNOLOGY Co.,Ltd.

Country or region before: China

Patentee before: Shandong Bucos Robot Co.,Ltd.

Patentee before: SHENZHEN BOOCAX TECHNOLOGY CO.,LTD.
