AU2021102808A4 - A LiDAR point classification system for visualizing and processing LiDAR data and its method thereof - Google Patents

A LiDAR point classification system for visualizing and processing LiDAR data and its method thereof

Info

Publication number
AU2021102808A4
Authority
AU
Australia
Prior art keywords
lidar
data
height
radians
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
AU2021102808A
Inventor
Abhijit Boruah
Shyamal Hazarika
Arpan Phukan
Pronam Phukan
Rohit Sinha
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to AU2021102808A
Application granted
Publication of AU2021102808A4
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G01S7/4811 Constructional features, e.g. arrangements of optical elements common to transmitter and receiver
    • G01S7/4813 Housing arrangements
    • G01S7/4817 Constructional features, e.g. arrangements of optical elements relating to scanning
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/521 Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/865 Combination of radar systems with lidar systems
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S7/4808 Evaluating distance, position or velocity data
    • G01S7/497 Means for monitoring or calibrating
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T2207/30261 Obstacle

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present disclosure relates to a LiDAR point classification system for visualizing and processing LiDAR data. The system comprises a LiDAR, at least two servos, a control unit, and a 2D map. The objective of the present disclosure is to develop a flexible method for visualizing and processing LiDAR data that is simple and cost-effective and yet achieves a similar degree of functionality as that of its high-end peers. This is achieved by developing a LiDAR point classification framework. In this disclosure a fairly primitive model, LiDAR Lite V2 (Blue Label), is used, which costs about a third of other sophisticated LiDAR systems. The LiDAR is used to gather classified data points and to represent the distance and height of obstacles, and the gaps between them.

Description

A LiDAR unit 102; Two servos 104
Control unit 106; Two-dimensional map 108
Figure 1
200
202: establishing a connection between a control unit via a Bluetooth module through a user interface engaged with a computing device for receiving lidar data
204: analyzing and classifying relevant vertices and thereafter displaying output on a 2D scatter plot on a 2D map
206: concatenating distance of a point of an obstacle sensed by a lidar with position and height values, obtained by position values of two servos, which is further sent to said computing device via said Bluetooth module
208: receiving a string containing distance, position and height attributes by said user interface upon pairing said user interface with said Bluetooth module, wherein said attributes are separated by delimiters
210: calculating position and height attributes of said obstacle using sin(radians), cos(radians) and tan(radians) functions
212: analyzing and classifying said values plotted in said 2D scatter by a custom approach to determine obstacles and gaps for locomotion of mobile agents in said environment
Figure 2
A LiDAR point classification system for visualizing and processing LiDAR data and its method thereof
FIELD OF THE INVENTION
The present disclosure relates to a LiDAR point system and method for visualizing and processing LiDAR data.
BACKGROUND OF THE INVENTION
Over the years, Light Detection And Ranging (LiDAR) has seen growing use in remote sensing and imaging, thanks to its simplicity and relatively low cost. A typical LiDAR system uses light to measure variable distances by illuminating the target with a laser and processing the reflected laser pulses with a sensor. A major area of interest for LiDAR is autonomous navigation. In the past few years, it has emerged as the leading technology in safety and autonomous systems. Aside from this, LiDAR datasets have been developed for applications in urban environments (buildings, bridges, highways, etc.), for mining and geological applications, emergency management (landslides, floodplain mapping, hurricane damage assessment, etc.), land cover change, and global biogeochemical cycling (biomass, ecological impacts, etc.).
Although there are plenty of software packages available for processing and interpreting LiDAR data, modern obstacle detection for high end autonomous navigation packages are not accessible to most users due to cost or complexity. There is a need for a software package which is simple to implement, will classify obstacles and gaps, and can incorporate additional data when they are available. The identified gaps will be classified in real time into two classes: satisfactory and unsatisfactory. The satisfactory gaps are continuously analyzed and modified as required (classified as unsatisfactory) by incorporating new information about obstacles in the local environment. These satisfactory gaps will collectively form a path for the mobile agent to travel.
The fundamental working principle of a LiDAR system is laser ranging. It emits ultraviolet, visible, or near-infrared light from the sensor. An object within the laser footprint will generate a reflection, called a return. Differences in laser return times can then be used to calculate the distance of the target. Real-world applications of LiDAR include autonomous driving, landslide investigations, digital elevation modelling and flood modelling, forest planning and management, oil and gas exploration, deployment of solar panels, etc. LiDARs acquire quite accurate distance information with a high range resolution and angular resolution, which is usually used in the map-generating techniques of autonomous vehicles. LiDAR can be used in moving vehicles for obstacle detection and collision avoidance. Researchers have been studying methods to classify obstacles by using distance data to get the outline and geometric information of the obstacles and subsequently using that information to classify the type of obstacle. There is already a method to classify obstacles using LiDAR intensity data. That method uses the probability distribution of the LiDAR intensity data and its dispersion to perceive and classify obstacles. However, such a simplified calculation does not generate an accurate classification of obstacles from the probability distribution of the intensity data and its dispersion. If there is a window of error in real-time response, quick planning of a path for autonomous vehicles becomes difficult, and issues relating to safety and convenience arise. For this reason, a new obstacle classification method based on a single LiDAR is necessary, which should be simple and efficient.
In one existing solution, a light detection and ranging (LIDAR) apparatus includes dual beam scanners with dual beam steering. A first beam scanner in the LIDAR apparatus scans a wider area in one or more of a first plurality of scan patterns, and a second beam scanner in the LIDAR apparatus scans a narrower area in one or more of a second plurality of scan patterns different from the first plurality of scan patterns.
However, although there are plenty of software packages available for processing and interpreting LiDAR data, modern obstacle detection packages for high-end autonomous navigation are not accessible to most users due to cost or complexity. Therefore, in order to avoid the aforementioned drawbacks, there is a need for a LiDAR point classification system for visualizing and processing LiDAR data.
SUMMARY OF THE INVENTION
The present disclosure relates to a LiDAR point classification system for visualizing and processing LiDAR data. The objective of the present disclosure is to develop a flexible method for visualizing and processing LiDAR data that is simple and cost-effective and yet achieves a similar degree of functionality as that of its high-end peers. This is achieved by developing a LiDAR point classification framework. The classified data points from a LiDAR-Lite V2 (Blue Label) are used to represent the distance and height of obstacles, and the gaps between them. This will enable an autonomous mobile agent to determine palatable paths through the satisfactory gaps.
The present disclosure seeks to provide a LiDAR point classification system for visualizing and processing LiDAR data. The system comprises: a LiDAR to sense distance of a point of an obstacle; at least two servos driven by a motor shield to mount said LiDAR for scanning environment; a control unit equipped with a Bluetooth module for receiving string containing values of distance, position and height attributes each separated by delimiters to calculate position and height attributes of said obstacle; and a two-dimensional (2D) map to determine obstacles and gaps for locomotion of mobile agents in said environment upon analyzing and classifying said values plotted in said 2D scatter by a custom approach.
The present disclosure also seeks to provide a LiDAR point classification method for visualizing and processing data. The method comprises: establishing a connection between a control unit via a Bluetooth module through a user interface engaged with a computing device for receiving lidar data; analyzing and classifying relevant vertices and thereafter displaying output on a 2D scatter plot on a 2D map; concatenating distance of a point of an obstacle sensed by a lidar with position and height values, obtained by position values of two servos which is further sent to said computing device via said Bluetooth module; receiving string containing distance, position and height attributes by said user interface upon pairing said user interface with said Bluetooth module, wherein said attributes are separated by delimiters; calculating position and height attributes of said obstacle using sin(radians), cos(radians) and tan(radians) functions; and analyzing and classifying said values plotted in said 2D scatter by the custom approach to determine obstacles and gaps for locomotion of mobile agents in said environment.
An objective of the present disclosure is to provide a LiDAR point system for visualizing and processing LiDAR data.
Another object of the present disclosure is to develop a LiDAR point classification approach.
Another object of the present disclosure is to develop a flexible method for visualizing and processing LiDAR data that is simple and cost effective.
Another object of the present disclosure is to use LiDAR Lite V2 (Blue Label) model.
Another object of the present disclosure is to enable an autonomous mobile agent to determine palatable paths through the satisfactory gaps.
To further clarify advantages and features of the present disclosure, a more particular description of the invention will be rendered by reference to specific embodiments thereof, which is illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. The invention will be described and explained with additional specificity and detail with the accompanying drawings.
BRIEF DESCRIPTION OF FIGURES
These and other features, aspects, and advantages of the present disclosure will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
Figure 1 illustrates a block diagram of a LiDAR point classification system for visualizing and processing LiDAR data in accordance with an embodiment of the present disclosure;
Figure 2 illustrates a flow chart of a LiDAR point classification method for visualizing and processing data in accordance with an embodiment of the present disclosure;
Figure 3 illustrates the simulation of the 3D-harness which supports the LiDAR in accordance with an embodiment of the present disclosure;
Figure 4 illustrates the user interface of the sixth-sense app in accordance with an embodiment of the present disclosure;
Figure 5 illustrates an obstacle at different positions in accordance with an embodiment of the present disclosure;
Figure 6 illustrates an obstacle at different positions in accordance with an embodiment of the present disclosure;
Figure 7 illustrates a part of data sent and the map plotted by the received data in accordance with an embodiment of the present disclosure;
Further, skilled artisans will appreciate that elements in the drawings are illustrated for simplicity and may not have been necessarily been drawn to scale. For example, the flow charts illustrate the method in terms of the most prominent steps involved to help to improve understanding of aspects of the present disclosure. Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the drawings by conventional symbols, and the drawings may show only those specific details that are pertinent to understanding the embodiments of the present disclosure so as not to obscure the drawings with details that will be readily apparent to those of ordinary skill in the art having benefit of the description herein.
DETAILED DESCRIPTION
For the purpose of promoting an understanding of the principles of the invention, reference will now be made to the embodiment illustrated in the drawings and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended, such alterations and further modifications in the illustrated system, and such further applications of the principles of the invention as illustrated therein being contemplated as would normally occur to one skilled in the art to which the invention relates.
It will be understood by those skilled in the art that the foregoing general description and the following detailed description are exemplary and explanatory of the invention and are not intended to be restrictive thereof.
Reference throughout this specification to "an aspect", "another aspect" or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Thus, appearances of the phrase "in an embodiment", "in another embodiment" and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
The terms "comprises", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such process or method. Similarly, one or more devices or sub-systems or elements or structures or components proceeded by "comprises...a" does not, without more constraints, preclude the existence of other devices or other sub systems or other elements or other structures or other components or additional devices or additional sub-systems or additional elements or additional structures or additional components.
Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The system, methods, and examples provided herein are illustrative only and not intended to be limiting.
Embodiments of the present disclosure will be described below in detail with reference to the accompanying drawings.
Figure 1 illustrates a block diagram of a LiDAR point classification system for visualizing and processing LiDAR data in accordance with an embodiment of the present disclosure. The system 100 includes a LiDAR unit 102 to sense the distance of a point of an obstacle. The lidar senses the distance of a point of an obstacle, which is then concatenated with the position and height values, obtained from the position values of the two servos, and then sent to the Android device via the HC-05 Bluetooth module.
In an embodiment, a servo unit 104 of at least two servos driven by a motor shield is used to mount said LiDAR for scanning the environment. Two servos driven by an Adafruit L293D motor shield v1 are used to mount the LiDAR for scanning the environment. The servos are set up so that Servo 2 scans the horizontal (X-axis) plane of the environment and Servo 1 is incremented by 5 degrees in the positive direction of the Y-axis to provide the height information of objects in the environment.
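The scan pattern described above can be sketched as follows. This is a minimal Python sketch: the 5-degree tilt increment and the 45/180-degree servo ranges follow the disclosure (see claim 5), while the default 1-degree pan step is an assumption for illustration.

```python
# Sketch of the two-servo scan schedule: Servo 1 (tilt) steps up in 5-degree
# increments to 45 degrees; Servo 2 (pan) sweeps the horizontal plane to 180
# degrees, per the disclosure. The 1-degree default pan step is assumed.
def scan_schedule(pan_max=180, tilt_max=45, pan_step=1, tilt_step=5):
    """Yield (tilt, pan) angle pairs: for each height level set by Servo 1,
    Servo 2 sweeps the full horizontal (X-axis) plane."""
    for tilt in range(0, tilt_max + 1, tilt_step):   # Servo 1: height (Y-axis)
        for pan in range(0, pan_max + 1, pan_step):  # Servo 2: horizontal sweep
            yield tilt, pan

# A coarse sweep with a 45-degree pan step, for a quick look at the schedule.
angles = list(scan_schedule(pan_step=45))
```

Each yielded pair would be paired with one LiDAR distance reading to form a (distance, position, height) sample.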
In an embodiment, a control unit 106 is equipped with a Bluetooth module for receiving string containing values of distance, position and height attributes each separated by delimiters to calculate position and height attributes of said obstacle.
In an embodiment, a two-dimensional (2D) map unit 108 is used to determine obstacles and gaps for locomotion of mobile agents in said environment upon analyzing and classifying said values plotted in said 2D scatter by the custom approach.
Figure 2 illustrates a flow chart of a LiDAR point classification method for visualizing and processing data in accordance with an embodiment of the present disclosure. At step 202 the method 200 includes, establishing a connection between a control unit via a Bluetooth module through a user interface engaged with a computing device for receiving lidar data.
At step 204 the method 200 includes analyzing and classifying relevant vertices and thereafter displaying output on a 2D scatter plot on a 2D map.
At step 206 the method 200 includes concatenating distance of a point of an obstacle sensed by a lidar with position and height values, obtained by position values of two servos which is further sent to said computing device via said Bluetooth module.
At step 208 the method 200 includes receiving string containing distance, position and height attributes by said user interface upon pairing said user interface with said Bluetooth module, wherein said attributes are separated by delimiters.
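The delimited-string reception of step 208 can be sketched as below. The disclosure says only that the distance, position and height attributes are separated by delimiters ('?', ',' and ':' per Figure 7); the field order and the role assigned to each delimiter here are assumptions.

```python
# Parse one reading received over the Bluetooth link. Delimiters '?', ',' and
# ':' follow Figure 7; the field order and per-delimiter roles are assumptions.
def parse_reading(record):
    """Return (distance, position, height) integers from e.g. '120,90:15?'."""
    record = record.strip().rstrip('?')     # '?' assumed to terminate a record
    distance, rest = record.split(',')      # ',' assumed to follow the distance
    position, height = rest.split(':')      # ':' assumed to split pan/tilt
    return int(distance), int(position), int(height)
```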
At step 210 the method 200 includes calculating position and height attributes of said obstacle using sin(radians), cos(radians) and tan(radians) functions.
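One plausible reading of step 210 is a spherical-to-Cartesian conversion using the trigonometric functions named in the text. The exact geometric convention below (pan angle in the horizontal plane, tilt angle as elevation) is an assumed interpretation, not taken verbatim from the disclosure.

```python
import math

# Convert a (distance, pan, tilt) sample to Cartesian coordinates with the
# sin/cos functions named in step 210. The pan-in-plane / tilt-as-elevation
# convention is an assumption for illustration.
def to_cartesian(distance, pan_deg, tilt_deg):
    pan, tilt = math.radians(pan_deg), math.radians(tilt_deg)
    ground = distance * math.cos(tilt)   # projection onto the horizontal plane
    x = ground * math.cos(pan)           # position across the scan
    y = ground * math.sin(pan)           # position along the facing direction
    z = distance * math.sin(tilt)        # height of the sensed point
    return x, y, z
```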
At step 212 the method 200 includes analyzing and classifying said values plotted in said 2D scatter by the custom approach to determine obstacles and gaps for locomotion of mobile agents in said environment.
Figure 3 illustrates the simulation of the 3D harness which supports the LiDAR in accordance with an embodiment of the present disclosure. In this figure the two servos are shown. The servos are set up so that Servo 2 scans the horizontal (X-axis) plane of the environment and Servo 1 is incremented by 5 degrees in the positive direction of the Y-axis to provide the height information of objects in the environment.
Figure 4 illustrates the user interface of the sixth-sense app in accordance with an embodiment of the present disclosure. To visualize the lidar dataset on a 2D scatter plot, an Android app is developed which implements an Android graph library. The app is built in Android Studio and is supported on Android version 4.4 (KitKat) or higher. This app satisfies the objective of establishing a connection with the prime control circuit (Arduino Uno) via a Bluetooth module, receiving the lidar data, analyzing and classifying the relevant vertices, and finally displaying the output on a 2D scatter plot.
Figure 5 illustrates an obstacle at different positions in accordance with an embodiment of the present disclosure. Figure (a) shows an obstacle at (50, 50) which results in the formation of a satisfactory gap. Figure (b) shows an obstacle at height (2 units) between the earlier gap, which makes it unsatisfactory. Figure (c) shows an obstacle at height (2 units) and at position (0, 50) from the rover. The horizontal displacement of the obstacle from the gap is 50, which is <70. Hence the gap is affected and it becomes non-palatable.
Figure 6 illustrates an obstacle at different positions in accordance with an embodiment of the present disclosure. Figure (a) shows an obstacle at height (2 units) and at position (-100, 100) from the rover. The horizontal displacement of the obstacle from the gap is 141 (approx.), which is >70. Hence, the gap remains unaffected. Figure (b) shows an obstacle at height (2 units) and at position (0, 50) from the rover. The horizontal displacement of the obstacle from the gap is 50, which is <70. Hence, the gap is affected and part of it becomes non-palatable.
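The gap test illustrated by Figures 5 and 6 can be sketched as a simple displacement threshold. Treating the gap as a single reference point at the rover's origin is an assumption for illustration; the 70-unit threshold and the worked positions come from the figure descriptions.

```python
import math

# A gap is treated as affected when an obstacle's horizontal displacement from
# it falls below a threshold (70 units in the text). Modelling the gap as one
# point at the origin is an illustrative assumption.
def gap_affected(obstacle_xy, gap_xy=(0, 0), threshold=70):
    dx = obstacle_xy[0] - gap_xy[0]
    dy = obstacle_xy[1] - gap_xy[1]
    return math.hypot(dx, dy) < threshold  # Euclidean displacement in the plane

# Figure 6(a): obstacle at (-100, 100) -> displacement ~141 > 70, unaffected.
# Figure 6(b): obstacle at (0, 50)     -> displacement 50 < 70, affected.
```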
Figure 7 illustrates a part of the data sent and the map plotted by the received data in accordance with an embodiment of the present disclosure. Figure (a) shows the data sent by the LiDAR to the Android device, where '?', ',' and ':' are delimiters. Figure (b) shows the map plotted by the received data.
After implementation of the approach using the discussed platform, colour and shape information of the environment is achieved in the maps shown in this figure.
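The colour and shape encoding mentioned above (and in claim 4) could, for instance, be realised as a class-to-style lookup attached to each classified point before plotting. The class names and styles below are illustrative assumptions, not taken from the disclosure.

```python
# Illustrative class-to-style table for the 2D scatter plot; the class labels
# and (colour, marker) choices are assumptions.
CLASS_STYLES = {
    "obstacle":           ("red",   "square"),
    "satisfactory_gap":   ("green", "circle"),
    "unsatisfactory_gap": ("gray",  "cross"),
}

def encode(classified_points):
    """Attach a plotting style to each (x, y, class_label) point."""
    return [(x, y, *CLASS_STYLES[label]) for x, y, label in classified_points]
```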
The drawings and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment. For example, orders of processes described herein may be changed and are not limited to the manner described herein. Moreover, the actions of any flow diagram need not be implemented in the order shown; nor do all of the acts necessarily need to be performed. Also, those acts that are not dependent on other acts may be performed in parallel with the other acts. The scope of embodiments is by no means limited by these specific examples. Numerous variations, whether explicitly given in the specification or not, such as differences in structure, dimension, and use of material, are possible. The scope of embodiments is at least as broad as given by the following claims.
Benefits, other advantages, and solutions to problems have been described above with regard to specific embodiments. However, the benefits, advantages, solutions to problems, and any component(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as a critical, required, or essential feature or component of any or all the claims.

Claims (10)

WE CLAIM
1. A LiDAR point classification system for visualizing and processing LiDAR data, the system comprises:
a LiDAR to sense distance of a point of an obstacle;
at least two servos driven by a motor shield to mount said LiDAR for scanning environment;
a control unit equipped with a Bluetooth module for receiving string containing values of distance, position and height attributes each separated by delimiters to calculate position and height attributes of said obstacle; and
a two-dimensional (2D) map to determine obstacles and gaps for locomotion of mobile agents in said environment upon analyzing and classifying said values plotted in said 2D scatter by a custom approach.
2. The system as claimed in claim 1, wherein said two servos are setup in such a way so that a second servo scans horizontal (X-axis) plane of said environment and a first servo is incremented by 5 degrees in positive direction of Y-axis to provide height information of objects in said environment.
3. The system as claimed in claim 1, wherein said control unit separates said values and thereby proceed for calculating x, y and z co-ordinate values from position and height attributes using sin(radians), cos(radians) and tan(radians) functions.
4. The system as claimed in claim 1, wherein steps for creating a LiDAR data classifying system to identify obstacles and gaps for mobile agent to proceed in said environment comprises: establishing data transfer between said control unit and user interface via said Bluetooth module; implementing 2D interactive scatter plot in said computing device; establishing connection between said LiDAR and control unit and thereafter mounting said LiDAR on a harness with said servos to generate three-dimensional (3D) data set with distance, position and height attributes; generating Lidar data set and thereby classifying received data set in said computing device for color and shape encoding as class Labels; and plotting color encoded classified data set on a scatter plot in real time.
5. The system as claimed in claim 1, wherein said first servo rotates in the upward direction up to 45 degrees whereas said second servo rotates in the horizontal axis up to 180 degrees.
6. The system as claimed in claim 5, wherein said first and second servos are interconnected through a harness.
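Claims 2 and 5 together imply a raster-style scan pattern: a full 180-degree horizontal sweep at each 5-degree tilt increment up to 45 degrees. A minimal sketch of the position schedule, with the 1-degree pan resolution as an assumption:

```python
def scan_positions(tilt_step=5, tilt_max=45, pan_max=180):
    # Enumerate (pan, tilt) servo positions: a full horizontal sweep
    # at each tilt increment. Actual servo I/O is not modelled here.
    for tilt in range(0, tilt_max + 1, tilt_step):
        for pan in range(0, pan_max + 1):
            yield pan, tilt
```

Driving the servos through this schedule and recording one distance per position yields the 3D data set (distance, position, height) that the claims describe.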
7. A LiDAR point classification method for visualizing and processing LiDAR data, the method comprises:
establishing a connection between a control unit and a user interface via a Bluetooth module, the user interface being engaged with a computing device for receiving LiDAR data;
analyzing and classifying relevant vertices and thereafter displaying the output as a 2D scatter plot on a 2D map;
concatenating the distance of a point of an obstacle sensed by a LiDAR with position and height values, obtained from the positions of two servos, which is further sent to said computing device via said Bluetooth module;
receiving a string containing distance, position and height attributes by said user interface upon pairing said user interface with said Bluetooth module, wherein said attributes are separated by delimiters;
calculating position and height attributes of said obstacle using sin(radians), cos(radians) and tan(radians) functions; and
analyzing and classifying said values plotted in said 2D scatter plot by said custom approach to determine obstacles and gaps for locomotion of mobile agents in said environment.
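The receiving side of claim 7 splits each Bluetooth string back into its three attributes. A minimal parsing sketch, assuming a `;` delimiter and a distance-position-height field order (the claims only say the values are delimiter-separated):

```python
def parse_packet(packet, delimiter=";"):
    # Split one Bluetooth string into its distance, position (pan)
    # and height (tilt) attributes. The delimiter character and the
    # field order are assumptions, not specified in the claims.
    fields = packet.strip().split(delimiter)
    distance, pan, tilt = (float(f) for f in fields)
    return distance, pan, tilt
```

A packet such as `"120;90;15"` thus decodes to a 120 cm reading at pan 90° and tilt 15°, ready for the trigonometric co-ordinate calculation.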
8. The method as claimed in claim 7, wherein the distance of a co-ordinate point from the start and end vertices of a gap is calculated if the point lies over the line segment.
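The geometric test in claim 8 amounts to checking whether a point's perpendicular projection falls within the gap's line segment, and if so, measuring its distance to the two vertices. A 2D sketch under that reading:

```python
import math

def gap_vertex_distances(p, start, end):
    # If point p lies over the segment start-end (its projection
    # parameter t falls in [0, 1]), return its distances to the gap's
    # start and end vertices; otherwise return None.
    sx, sy = start
    ex, ey = end
    px, py = p
    dx, dy = ex - sx, ey - sy
    t = ((px - sx) * dx + (py - sy) * dy) / (dx * dx + dy * dy)
    if not 0.0 <= t <= 1.0:
        return None
    return math.dist(p, start), math.dist(p, end)
```

Points whose projection falls outside the segment are skipped, matching the claim's condition that the distances are calculated only when the point lies over the segment.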
9. The method as claimed in claim 8, wherein data points from said LiDAR are classified to represent the distance and height of obstacles, and the gaps between the LiDAR and obstacles, to enable an autonomous mobile agent to determine passable paths through satisfactory gaps.
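Whether a gap is "satisfactory" for locomotion reduces to comparing its span against the agent's footprint. A hedged sketch; the agent width and clearance factor are hypothetical values, not taken from the patent:

```python
import math

AGENT_WIDTH = 40   # cm: assumed width of the mobile agent
CLEARANCE = 1.2    # safety factor; both values are illustrative

def gap_is_passable(start, end):
    # A gap qualifies when the span between its start and end
    # vertices exceeds the agent's width plus a clearance margin.
    return math.dist(start, end) >= AGENT_WIDTH * CLEARANCE
```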
10. The method as claimed in claim 7, wherein said user interface takes continuous inputs from a hardware platform for 2D map generation.
AU2021102808A 2021-05-24 2021-05-24 A LiDAR point classification system for visualizing and processing LiDAR data and its method there of Active AU2021102808A4 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2021102808A AU2021102808A4 (en) 2021-05-24 2021-05-24 A LiDAR point classification system for visualizing and processing LiDAR data and its method there of


Publications (1)

Publication Number Publication Date
AU2021102808A4 true AU2021102808A4 (en) 2022-03-17

Family

ID=80629180




Legal Events

Date Code Title Description
FGI Letters patent sealed or granted (innovation patent)