WO2022014851A1 - Lidar system capable of setting a detection area - Google Patents

Lidar system capable of setting a detection area

Info

Publication number
WO2022014851A1
WO2022014851A1 (PCT/KR2021/006473, KR2021006473W)
Authority
WO
WIPO (PCT)
Prior art keywords
detection area
distance
lidar sensor
specific
area
Prior art date
Application number
PCT/KR2021/006473
Other languages
English (en)
Korean (ko)
Inventor
윤재준
김수연
정태원
정종택
Original Assignee
(주)카네비컴
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)카네비컴
Publication of WO2022014851A1
Priority to US18/146,846 (US20230134642A1)

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/04 Systems determining the presence of a target
    • G01S17/06 Systems determining position data of a target
    • G01S17/08 Systems determining position data of a target for measuring distance only
    • G01S17/10 Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
    • G01S17/42 Simultaneous measurement of distance and other co-ordinates
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/4808 Evaluating distance, position or velocity data
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G01S7/4814 Constructional features, e.g. arrangements of optical elements of transmitters alone
    • G01S7/4817 Constructional features, e.g. arrangements of optical elements relating to scanning
    • G01S7/483 Details of pulse systems
    • G01S7/486 Receivers
    • G01S7/4865 Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak
    • G01S7/51 Display arrangements

Definitions

  • the present invention relates to a lidar system in which a detection area can be freely set by a user.
  • Because a conventional lidar sensor scans within a predetermined angular range, it cannot scan only the area desired by the user.
  • An object of the present invention is to provide a lidar system capable of setting a detection area by a user and monitoring the set detection area.
  • The lidar sensor includes: a detection area unit for setting a detection area within a maximum scan angle range and managing the set detection area; a light output unit for outputting light at a specific angle so that the light reaches a specific sensing area; and a sensing unit configured to sense an object in the specific sensing area using the reflected light of the output light.
  • The output light forms a trigger line; the sensing unit acquires the measurement angle and the light arrival time of a sensed target with respect to the trigger line, and the minimum distance and the maximum distance to the specific detection area are set on the basis of the trigger line.
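  • As a hedged illustration of the relationship just described (not part of the disclosure), the short sketch below converts a light arrival time, i.e. the round-trip time of flight acquired for a trigger line, into a target distance; the constant and function names are assumptions.

    # Hypothetical sketch: converting the light arrival time (round-trip time of
    # flight) measured along a trigger line into a one-way target distance.
    SPEED_OF_LIGHT = 299_792_458.0  # m/s

    def distance_from_arrival_time(arrival_time_s: float) -> float:
        """One-way distance (m) from a round-trip light arrival time (s)."""
        return SPEED_OF_LIGHT * arrival_time_s / 2.0

    # Example: a round trip of about 26.7 ns corresponds to roughly 4 m.
    print(round(distance_from_arrival_time(26.7e-9), 2))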
  • a viewer device includes: a detection area designation unit for managing a detection area designated by a user; a communication unit for transmitting data expressed in Cartesian coordinates with respect to the designated detection area to a lidar sensor, and receiving information on a target detected within the detection area from the lidar sensor; and a monitoring unit configured to monitor the detection area based on the received information.
  • the lidar system according to the present invention can monitor the detection area designated by the user, and thus the user can utilize one lidar sensor for various places and uses.
  • FIG. 1 is a diagram illustrating a lidar system according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating a region setting process.
  • FIG. 3 is a diagram illustrating viewer data according to an embodiment of the present invention.
  • FIGS. 4 and 5 are diagrams illustrating examples of region setting.
  • FIG. 6 is a diagram illustrating a method for detecting an area using a distance.
  • FIG. 7 is a diagram illustrating an example of a buffer.
  • FIG. 8 is a diagram illustrating a process of setting a plurality of sensing regions.
  • FIG. 9 is a diagram illustrating a buffer setting process according to FIG. 8 .
  • FIG. 10 is a diagram illustrating trigger lines.
  • FIG. 11 is a diagram illustrating a distance conversion process according to an embodiment of the present invention.
  • FIG. 12 is a block diagram illustrating a lidar sensor according to an embodiment of the present invention.
  • FIG. 13 is a block diagram illustrating a viewer device according to an embodiment of the present invention.
  • the present invention relates to a lidar system capable of setting a detection area, and a user can monitor by setting a desired detection area within the scan angle range of the lidar sensor. Therefore, it is possible to detect by setting a detection area for various purposes in various places with one lidar sensor in which the scan angle range is set. That is, the utilization of the lidar sensor may increase.
  • The viewer device of the lidar system determines whether an object (e.g., a person) is located in the detection area using the distance data measured by the lidar sensor, and can also detect the movement of the person, including the amount of traffic. Since these determinations rely only on distance data, the lidar system can be implemented simply and efficiently.
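  • A minimal sketch of the kind of distance-only logic this implies, assuming one list of measured distances per scan and counting outside-to-inside transitions as entries into the area; the data layout and the numeric values are illustrative assumptions, not the patented method.

    # Hypothetical sketch: counting entries into a detection area using only
    # per-scan distance data (the data layout here is an assumption).
    def count_entries(scans, min_m, max_m):
        """Count how many times a target newly appears inside [min_m, max_m]."""
        entries = 0
        was_inside = False
        for distances in scans:                      # one list of returns per scan
            inside = any(min_m <= d <= max_m for d in distances)
            if inside and not was_inside:            # transition: outside -> inside
                entries += 1
            was_inside = inside
        return entries

    # Example: a person enters the 3-5 m area twice across six scans.
    print(count_entries([[6.0], [4.1], [4.3], [6.2], [3.8], [3.9]], 3.0, 5.0))  # 2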
  • the lidar system of this embodiment may include a lidar sensor 100 and a viewer device 102 .
  • The lidar sensor 100 may scan a specific angular range by outputting light, for example a laser; it may, for instance, scan 130 degrees. However, even though the lidar sensor 100 can scan 130 degrees, if the user designates an area, only the designated area is detected.
  • The lidar sensor 100 may be mounted on a vehicle, or installed at an intersection, a crosswalk, or a screen door as shown in FIGS. 4 and 5.
  • When the lidar sensor 100 is installed at an intersection as shown in FIG. 4, it can scan the crosswalk area 400 to detect a person moving through the crosswalk; when installed at the top of a screen door as shown in FIG. 5, it can scan the floor area 500 to detect people getting on and off the subway.
  • the lidar sensor 100 may detect only a region within an angular range less than or equal to the maximum scan angle in order to detect movement of a person passing through a specific sensing region.
  • For example, when the maximum scan angle of the lidar sensor 100 is 130 degrees, the lidar sensor 100 may be installed at an intersection to count the people crossing the crosswalk within a 40-degree angle range, or it may be installed on a screen door to count the people getting on and off the subway within a 30-degree angle range. That is, one lidar sensor 100 can be used in various places by setting various angle ranges.
  • a plurality of lidar sensors 100 may be installed in one place to detect at least some overlapping detection areas.
  • The lidar sensor 100 receives, from the viewer device 102, a viewer signal including data for a detection area specified by the user (user-specified detection area), analyzes the received viewer signal to identify the user-specified detection area, and scans the identified user-specified detection area.
  • Since the data for the user-specified detection area is set in a Cartesian coordinate system, the lidar sensor 100 may convert the data into a spherical coordinate system and scan the area corresponding to the converted data.
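  • A minimal sketch of that conversion, assuming a two-dimensional scan plane with the sensor at the origin: the corner points of the user-specified Cartesian area are converted into distance-and-angle form so the sensor knows which angles and distances cover the area. The helper names and the rectangular-area assumption are illustrative, not taken from the disclosure.

    # Hypothetical sketch: converting a user-specified Cartesian detection area
    # into polar (r, theta) form in the 2D scan plane, sensor at the origin.
    import math

    def cartesian_to_polar(x, y):
        """Return (distance r, angle theta in degrees) for a point (x, y)."""
        return math.hypot(x, y), math.degrees(math.atan2(y, x))

    def area_to_scan_range(corners):
        """Angle span and distance span covering a rectangular area's corners."""
        polar = [cartesian_to_polar(x, y) for x, y in corners]
        rs = [r for r, _ in polar]
        thetas = [t for _, t in polar]
        return (min(thetas), max(thetas)), (min(rs), max(rs))

    # Example: a rectangle lying 3 m to 5 m in front of the sensor, 1 m wide.
    corners = [(-0.5, 3.0), (0.5, 3.0), (-0.5, 5.0), (0.5, 5.0)]
    print(area_to_scan_range(corners))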
  • the lidar sensor 100 may form trigger lines by outputting a laser at various angles in the maximum scan angle range as shown in FIG. 10 .
  • Based on a specific trigger line, the distance from the lidar sensor 100 to the start point of the specific detection area (minimum distance) and the distance from the lidar sensor 100 to the end point of the specific detection area (maximum distance) may be determined, as shown in the drawing.
  • the lidar sensor 100 may be fixedly installed in a specific place.
  • the detection area division mark, the minimum distance, and the maximum distance may be dynamically stored in a specific address of the buffer.
  • Since the minimum distance and the maximum distance for the specific area are known, the lidar sensor 100 can determine whether a scanned target is inside or outside the specific area from the measured distance alone. For example, if, based on a specific trigger line, the minimum distance is 3 and the maximum distance is 5, a target measured at a distance of 4 is located inside the specific area, while a target measured at 6 is located outside it. However, this determination may also be made by the viewer device 102.
  • Since the minimum distance and the maximum distance may differ for each trigger line, whether the target is located inside or outside a specific area is determined per trigger line.
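  • Restating that comparison as a hedged sketch: each trigger line carries its own minimum and maximum distance, so the same measured distance can fall inside one trigger line's area and outside another's. The table layout and the limits for the second line are assumptions.

    # Hypothetical sketch: per-trigger-line containment check against the stored
    # minimum and maximum distances (the dictionary layout is an assumption).
    TRIGGER_LINE_LIMITS = {
        0: (3.0, 5.0),   # trigger line 0: minimum 3, maximum 5 (example from the text)
        1: (2.5, 6.5),   # trigger line 1: different limits (assumed values)
    }

    def target_inside(trigger_line, measured_distance):
        min_d, max_d = TRIGGER_LINE_LIMITS[trigger_line]
        return min_d <= measured_distance <= max_d

    print(target_inside(0, 4.0))  # True:  4 lies between 3 and 5
    print(target_inside(0, 6.0))  # False: 6 lies outside the specific area
    print(target_inside(1, 6.0))  # True:  same distance, different trigger line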
  • the viewer device 102 may transmit a viewer signal having data (sensing area designation data) for the designated detection area to the lidar sensor 100 .
  • the user may designate the region by directly clicking the coordinates of the desired region, or may designate the region using a preset figure.
  • Note that "the user sets the sensing area" also covers the case in which the area is arbitrarily set at the factory, according to an order, before shipment. That is, it is not limited by when or by whom the detection area is set; the area may be set at any time at the site of use or designated for a specific purpose at the factory.
  • For example, a user may designate a rectangular area A as shown in FIG. 2, either in the field or at the time of shipment from the factory, and this area may be expressed in a Cartesian coordinate system.
  • Once the detection area data has been transmitted to the lidar sensor 100, the lidar sensor 100 independently performs monitoring and detection of the user-specified area even if the viewer device is not connected, and, when an object is detected, can output an optical signal or an electrical signal containing the detection result.
  • The viewer device 102 transmits the detection area designation data, expressed in the Cartesian coordinate system, to the lidar sensor 100, and the lidar sensor 100 operates on the basis of the detection area designation data transmitted from the viewer device 102.
  • The lidar sensor 100 provides the angle and the signal arrival time measured on the basis of a specific trigger line to the viewer device 102, and the viewer device 102 can calculate the distance to the detected object using the provided angle and signal arrival time. Subsequently, the viewer device 102 may compare the calculated distance with the minimum and maximum distances of the specific sensing area to determine whether the object is located within the specific sensing area.
  • Alternatively, the viewer device 102 may receive, from the lidar sensor 100, the distance data to the object, and compare the received distance data with the minimum distance and the maximum distance of the specific detection area to determine whether the object is located inside the specific sensing area.
  • As a further option, the viewer device 102 may receive information from the lidar sensor 100 indicating whether the target is within the detection area, and provide the received information to the user through a screen.
  • the viewer device 102 may not only determine whether the object is located within a specific area, but also determine the moving direction of the object, the number of objects, and the like.
  • The detection area division mark, the minimum distance (Min), and the maximum distance (Max) based on the trigger line (1) may be stored at a specific address of the buffer.
  • the specific address may be dynamically allocated.
  • When the user sets a plurality of detection areas, the viewer device 102 may, for each detection area on the same trigger line, sequentially store the detection area division mark, the minimum distance, and the maximum distance, as shown in FIG. 9; that is, the detection area data can be allocated.
  • a plurality of sensing regions may be dynamically allocated to an address of one buffer.
  • For example, the name, minimum distance, and maximum distance of detection area A and the name, minimum distance, and maximum distance of detection area C may be sequentially assigned to one address. Here, the minimum distance of detection area C will be greater than the maximum distance of detection area A; for example, the maximum distance of detection area A may be 7 and the minimum distance of detection area C may be 11.
  • a plurality of trigger lines may exist within the scan angle range of the lidar sensor 100 .
  • For each of these trigger lines, a minimum distance and a maximum distance will be set for each detection area.
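  • A hedged sketch of the buffer layout these paragraphs describe: one dynamically allocated entry per trigger line holds, in sequence, the division mark, minimum distance, and maximum distance of every detection area that the trigger line passes through. The concrete data structure and the two limits not given in the text are assumptions.

    # Hypothetical sketch of the detection-area buffer: one dynamically allocated
    # entry per trigger line, storing (area mark, min distance, max distance)
    # records in sequence for every area crossed by that trigger line.
    from collections import defaultdict

    area_buffer = defaultdict(list)   # trigger line index -> list of area records

    def store_area(trigger_line, mark, min_d, max_d):
        area_buffer[trigger_line].append((mark, min_d, max_d))

    def areas_containing(trigger_line, distance):
        """Marks of all areas on this trigger line that contain the distance."""
        return [mark for mark, lo, hi in area_buffer[trigger_line] if lo <= distance <= hi]

    # Example mirroring the description: areas A and C share one trigger line,
    # A ending at 7 and C starting at 11 (the other two limits are assumed).
    store_area(3, "A", 4.0, 7.0)
    store_area(3, "C", 11.0, 14.0)
    print(areas_containing(3, 6.0))   # ['A']
    print(areas_containing(3, 12.0))  # ['C']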
  • The lidar sensor 100 may convert the spherical coordinates (r, θ), consisting of the distance (r) and the angle (θ) detected in the designated detection area, into Cartesian coordinates to calculate the distance, and may compare the calculated distance with the minimum distance and the maximum distance of the corresponding trigger line to determine whether a sensed object exists within the detection area.
  • In the above description, the buffer allocation of the detection area data, the calculation of the distance by converting the distance (r) and the angle (θ) detected by the lidar sensor 100 into rectangular coordinates, and the determination, through distance comparison, of whether an object exists in the detection area are performed by the lidar sensor 100; however, at least one of these operations may be performed by the viewer device 102 instead.
  • For example, the calculation of the distance by converting the detected distance (r) and angle (θ) into rectangular coordinates may be performed by the lidar sensor 100, while the buffer allocation of the sensing area data and the determination of whether an object exists in the sensing area through distance comparison may be performed by the viewer device 102.
  • The viewer device 102 may convert the distance data of the detection target transmitted from the lidar sensor 100 into point coordinates (x_d, y_d) and display them on the screen.
  • When a target is detected, the color of the corresponding point may be changed in the viewer device 102, and the lidar sensor 100 may output a separate pulse to inform the user.
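  • A minimal display-side sketch, assuming the viewer receives (distance, angle) pairs: each pair is converted into point coordinates (x_d, y_d) for the screen, and points whose distance falls inside the designated area are highlighted. The colouring rule and the print-based output are illustrative assumptions.

    # Hypothetical sketch: converting received (r, theta) measurements into point
    # coordinates (x_d, y_d) for the viewer screen and flagging in-area points.
    import math

    def to_point(r, theta_deg):
        t = math.radians(theta_deg)
        return r * math.cos(t), r * math.sin(t)   # (x_d, y_d)

    def render(measurements, min_d, max_d):
        for r, theta in measurements:
            x_d, y_d = to_point(r, theta)
            colour = "red" if min_d <= r <= max_d else "grey"   # highlight detections
            print(f"point ({x_d:.2f}, {y_d:.2f}) -> {colour}")

    render([(4.0, 85.0), (6.0, 92.0)], 3.0, 5.0)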
  • the viewer device 102 may provide the angular sequence, distance, and rectangular coordinates corresponding to the trigger line for the designated detection area to the lidar sensor 100 .
  • FIG. 12 is a block diagram illustrating a lidar sensor according to an embodiment of the present invention
  • FIG. 13 is a block diagram illustrating a viewer device according to an embodiment of the present invention.
  • The lidar sensor 100 of this embodiment may include a control unit 1200, a communication unit 1202, a light output unit 1204, a detection area unit 1206, a sensing unit 1208, a distance calculating unit 1210, a determination unit 1212, a detection confirmation unit 1214, a buffer 1216, and a storage unit.
  • the communication unit 1202 is a communication connection path with the viewer device 102 .
  • the light output unit 1204 may output light to a sensing area designated by a user.
  • the light output unit 1204 may output light at a specific angle corresponding to the trigger line so that the trigger line passes through the detection area.
  • The detection area unit 1206 may manage the detection area designated by the user and transmitted from the viewer device 102 through the communication unit 1202; in particular, it may set the minimum and maximum distances from the lidar sensor 100 to the detection area. The detection area is an area within the maximum scan angle range.
  • the sensing unit 1208 may detect an object in the sensing area using reflected light reflected from the object. According to an embodiment, the sensing unit 1208 may acquire a measurement angle of the target and a light arrival time (a time from light output to reflection and reception).
  • the distance calculator 1210 may calculate the distance from the lidar sensor 100 to the target by using the obtained measurement angle and the light arrival time.
  • the determination unit 1212 may determine whether the target exists within the specific detection area by comparing the calculated distance with the minimum distance and the maximum distance of the detection area.
  • When a plurality of detection areas are set, the determination unit 1212 may compare the calculated distance with the minimum and maximum distances of the first detection area and with the minimum and maximum distances of the second detection area, to determine whether the object exists in the first detection area or the second detection area.
  • the same trigger line may pass through the first sensing region and the second sensing region.
  • the detection confirmation unit 1214 may output a specific pulse or light when the target is detected within the detection area.
  • the buffer 1216 may store the minimum and maximum distances to the sensing area in dynamically allocated addresses.
  • the minimum distances and the maximum distances of the plurality of sensing areas corresponding to the same trigger line may be sequentially stored in the same address.
  • the storage unit may store various information such as data on the sensing area.
  • the controller 1200 controls overall operations of the components of the lidar sensor 100 .
  • In the above description, the lidar sensor 100 determines whether the target exists within the detection area; alternatively, the lidar sensor 100 may transmit the distance data to the viewer device 102, and the viewer device 102 may determine whether the target exists within the sensing area. In this case, the lidar sensor 100 may not include the determination unit 1212 and the buffer 1216.
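  • To make this alternative division of labour concrete, here is a hedged structural sketch in which the sensor side only streams (trigger line, angle, distance) measurements and the viewer side holds the area limits and performs the containment decision; the class and method names are assumptions that mirror, rather than reproduce, the units of FIG. 12 and FIG. 13.

    # Hypothetical structural sketch: the sensor streams measurements, the viewer
    # keeps the per-trigger-line limits and decides containment (the variant in
    # which the lidar sensor omits the determination unit 1212 and buffer 1216).
    from dataclasses import dataclass, field

    @dataclass
    class Measurement:
        trigger_line: int
        angle_deg: float
        distance_m: float

    @dataclass
    class ViewerSide:
        # trigger line index -> (min distance, max distance) of the designated area
        limits: dict = field(default_factory=dict)

        def on_measurement(self, m: Measurement) -> bool:
            lo, hi = self.limits.get(m.trigger_line, (float("inf"), float("-inf")))
            inside = lo <= m.distance_m <= hi
            if inside:
                print(f"target at {m.distance_m:.1f} m inside area (trigger line {m.trigger_line})")
            return inside

    viewer = ViewerSide(limits={0: (3.0, 5.0)})
    viewer.on_measurement(Measurement(trigger_line=0, angle_deg=88.0, distance_m=4.2))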
  • The viewer device 102 may include a control unit 1300, a communication unit 1302, a detection area designation unit 1304, a monitoring unit 1306, a buffer 1308, a determination unit 1310, a display 1312, a detection confirmation unit 1314, and a storage unit 1316.
  • the communication unit 1302 is a communication connection path with the lidar sensor 100 .
  • the communication unit 1302 transmits data expressed in Cartesian coordinates for the detection area designated by the user to the lidar sensor 100 , and transmits data from the lidar sensor 100 to a target detected within the detection area. information can be received.
  • the sensing area designation unit 1304 may manage a sensing area designated by a user.
  • the detection area designator 1304 may manage the minimum distance and the maximum distance of the designated detection area.
  • the user may set the sensing region by directly clicking the coordinates of the desired region or may set the sensing region using a figure.
  • the monitoring unit 1306 may provide a detection area sensed through the lidar sensor 100 to the user through the display unit 1312 . That is, the user may monitor the sensing area designated by the user through the monitoring unit 1306 .
  • the buffer 1308 may store the minimum and maximum distances to the sensing area at dynamically allocated addresses.
  • the minimum distances and the maximum distances of the plurality of sensing areas corresponding to the same trigger line may be sequentially stored in the same address.
  • The determination unit 1310 may determine whether the target exists within the specific detection area by comparing the distance to the target measured by the lidar sensor 100 with the minimum distance and the maximum distance of the detection area.
  • When a plurality of detection areas are set, the determination unit 1310 may compare the measured distance with the minimum and maximum distances of the first sensing region and with the minimum and maximum distances of the second sensing region, to determine whether the object exists in the first detection area or the second detection area.
  • the same trigger line may pass through the first sensing region and the second sensing region.
  • the detection confirmation unit 1314 may change a color of the detection area or a point displaying the object when the object is detected in the detection area.
  • the storage unit 1316 may store various information such as distance data.
  • the controller 1300 controls overall operations of the components of the viewer device 102 .
  • Each of the components described above may also be understood as a corresponding process, and the processes of the above-described embodiment can readily be understood from the viewpoint of the components of the apparatus.
  • the technical contents described above may be implemented in the form of program instructions that can be executed through various computer means and recorded in a computer-readable medium.
  • the computer-readable medium may include program instructions, data files, data structures, etc. alone or in combination.
  • the program instructions recorded on the medium may be specially designed and configured for the embodiments or may be known and available to those skilled in the art of computer software.
  • Examples of the computer-readable recording medium include magnetic media such as hard disks, floppy disks, and magnetic tapes; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
  • Examples of program instructions include not only machine language codes such as those generated by a compiler, but also high-level language codes that can be executed by a computer using an interpreter or the like.
  • a hardware device may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

Disclosed is a lidar system capable of monitoring a detection area set by a user. The lidar sensor comprises: a detection area unit for managing the detection area set within the maximum scan angle range; a light output unit for outputting light at a specific angle so that the light is output to a specific detection area; and a sensing unit for sensing an object in the specific detection area using the reflected light of the output light, wherein the output light forms a trigger line, the sensing unit acquires the measurement angle and the light arrival time of the sensed object with respect to the trigger line, and the minimum and maximum distances to the specific detection area are set on the basis of the trigger line.
PCT/KR2021/006473 2020-07-13 2021-05-25 Lidar system capable of setting a detection area WO2022014851A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US18/146,846 US20230134642A1 (en) 2020-07-13 2022-12-27 Lidar system capable of setting sensing area

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020200086185A KR102207549B1 (ko) 2020-07-13 2020-07-13 Lidar system capable of setting a detection area
KR10-2020-0086185 2020-07-13

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/146,846 Continuation US20230134642A1 (en) 2020-07-13 2022-12-27 Lidar system capable of setting sensing area

Publications (1)

Publication Number Publication Date
WO2022014851A1 (fr) 2022-01-20

Family

ID=74310357

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/006473 WO2022014851A1 (fr) 2020-07-13 2021-05-25 Lidar system capable of setting a detection area

Country Status (3)

Country Link
US (1) US20230134642A1 (fr)
KR (1) KR102207549B1 (fr)
WO (1) WO2022014851A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102207549B1 (ko) * 2020-07-13 2021-01-26 (주)카네비컴 Lidar system capable of setting a detection area
KR102376701B1 (ko) * 2021-03-17 2022-03-21 주식회사 나노시스템즈 Method for setting a three-dimensional lidar detection area
WO2024005607A1 (fr) * 2022-06-30 2024-01-04 주식회사 나노시스템즈 System for recognizing objects in a detection area by means of a flash lidar

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013238616A (ja) * 2006-01-30 2013-11-28 Fujitsu Ltd Target detection device and system
KR101737803B1 (ko) * 2016-11-23 2017-05-19 렉스젠(주) Apparatus for collecting object information and method thereof
KR101784611B1 (ko) * 2016-06-09 2017-11-06 재단법인대구경북과학기술원 Apparatus and method for detecting a person using a lidar sensor and a radar sensor
KR20190030027A (ko) * 2017-09-13 2019-03-21 삼성전자주식회사 Lidar apparatus and method of operating the same
KR20200021911A (ko) * 2017-06-28 2020-03-02 비.이.에이. 일렉트로닉스 (베이징) 씨오., 엘티디. Safety protection device for full-height platform screen doors
KR102207549B1 (ko) * 2020-07-13 2021-01-26 (주)카네비컴 Lidar system capable of setting a detection area

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102050678B1 (ko) 2018-05-14 2019-12-03 주식회사 에스오에스랩 Lidar device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013238616A (ja) * 2006-01-30 2013-11-28 Fujitsu Ltd Target detection device and system
KR101784611B1 (ko) * 2016-06-09 2017-11-06 재단법인대구경북과학기술원 Apparatus and method for detecting a person using a lidar sensor and a radar sensor
KR101737803B1 (ko) * 2016-11-23 2017-05-19 렉스젠(주) Apparatus for collecting object information and method thereof
KR20200021911A (ko) * 2017-06-28 2020-03-02 비.이.에이. 일렉트로닉스 (베이징) 씨오., 엘티디. Safety protection device for full-height platform screen doors
KR20190030027A (ko) * 2017-09-13 2019-03-21 삼성전자주식회사 Lidar apparatus and method of operating the same
KR102207549B1 (ko) * 2020-07-13 2021-01-26 (주)카네비컴 Lidar system capable of setting a detection area

Also Published As

Publication number Publication date
US20230134642A1 (en) 2023-05-04
KR102207549B1 (ko) 2021-01-26

Similar Documents

Publication Publication Date Title
WO2022014851A1 (fr) Lidar system capable of setting a detection area
WO2018052204A1 (fr) Airport robot and system comprising same
EP1969436B1 (fr) Mobile device localization
WO2012050305A2 (fr) Apparatus and method for providing obstacle information in an autonomous mobile vehicle
WO2022186411A1 (fr) Location-based smart home control method and system
WO2019022304A1 (fr) Hybrid lidar scanner
WO2022065563A1 (fr) Contactless elevator touch device and setting method therefor
WO2021020866A1 (fr) Image analysis system and method for remote monitoring
WO2017111201A1 (fr) Night image display apparatus and image processing method thereof
WO2019172500A1 (fr) Device for measuring visibility through video analysis using artificial intelligence
WO2017007078A1 (fr) Laser rangefinder
WO2021167312A1 (fr) Touch recognition method and device having lidar sensor
WO2023120818A1 (fr) Traffic flow control device for controlling traffic flow in which autonomous vehicles are mixed, and method using same
WO2020184776A1 (fr) Method for setting a travel path and recognizing location using code recognition, unmanned mobility, and operating system
WO2021221334A1 (fr) Device for generating a color palette formed on the basis of GPS information and lidar signals, and control method therefor
WO2016072610A1 (fr) Recognition method and recognition device
WO2013022159A1 (fr) Traffic lane recognizing apparatus and method thereof
WO2014129825A1 (fr) Coordinate selection circuit and method for a differential touch sensing system
WO2013151243A1 (fr) Method for synchronization between a mobile terminal and a display device using a motion recognition apparatus
WO2020159246A2 (fr) Device and method for implementing virtual reality for remote equipment control by means of an augmented reality method, and management system using same
WO2022139022A1 (fr) Augmented-reality-based ship safety navigation management system using an omnidirectional camera
WO2021182728A1 (fr) Lidar system corresponding to artificial intelligence
WO2019216673A1 (fr) Object guidance system and method for an unmanned moving body
WO2019009474A1 (fr) Laser detection device with integrated camera and operating method thereof
KR100791262B1 (ko) Parking guidance system and method using CAN communication

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21842069

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21842069

Country of ref document: EP

Kind code of ref document: A1