WO2023027419A1 - Method for detecting an installation abnormality of a sensing device, and sensing device for performing the same - Google Patents
Method for detecting an installation abnormality of a sensing device, and sensing device for performing the same
- Publication number: WO2023027419A1
- Application number: PCT/KR2022/012338
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- static
- sensing device
- point cloud
- point
- space
- Prior art date
Classifications
- G01S17/894—3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
- G01S17/42—Simultaneous measurement of distance and other co-ordinates
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S7/497—Means for monitoring or calibrating
- G01S7/4972—Alignment of sensor
- G06T17/05—Geographic models
- G06T7/11—Region-based segmentation
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06V10/62—Extraction of image or video features relating to a temporal dimension, e.g. time-based feature extraction; Pattern tracking
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/64—Three-dimensional objects
- G06T2207/10016—Video; Image sequence
- G06T2207/10028—Range image; Depth image; 3D point clouds
- G06T2207/30232—Surveillance
- G06T2207/30236—Traffic on road, railway or crossing
- G06T2207/30244—Camera pose
Definitions
- It relates to a method for detecting an installation abnormality of a sensing device and a sensing device for performing the same.
- LiDAR (Light Detection And Ranging) is a technology that emits light at objects in a 3D space and receives the reflected light to acquire information about the 3D space. Unlike a camera's image sensor, LIDAR cannot perceive color, but it can detect distant objects and has excellent spatial resolution.
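- As general background (not stated in this document), lidar ranging rests on the standard time-of-flight relation below, where $c$ is the speed of light and $\Delta t$ is the measured round-trip time between emitting a pulse and receiving its reflection:

$$d = \frac{c \cdot \Delta t}{2}$$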
- The present invention relates to a method for detecting an installation abnormality of a sensing device based on a change in the point cloud obtained using a lidar sensor, without any separate additional device, when a change occurs in the installation state of the sensing device, and to a sensing device that performs the method.
- According to one aspect, a method for detecting an installation abnormality of a sensing device includes: obtaining a point cloud for each time point of a 3D space using a lidar sensor; determining a static object region of the 3D space based on the obtained time-specific point clouds; and determining an installation abnormality of the sensing device based on static point clouds having a predetermined time difference, among the time-specific static point clouds corresponding to the determined static object region.
- According to another aspect, a computer-readable storage medium stores a program to be executed on a computer, the program including: instructions for obtaining a point cloud for each time point of a 3D space using a lidar sensor; instructions for determining a static object region of the 3D space based on the obtained time-specific point clouds; and instructions for determining an installation abnormality of the sensing device based on static point clouds having a predetermined time difference, among the time-specific static point clouds corresponding to the determined static object region.
- According to another aspect, a sensing device includes: a sensor unit that obtains a point cloud for each time point of a 3D space using a lidar sensor; a memory that stores one or more instructions; and a processor that, by executing the one or more instructions, determines a static object region of the 3D space based on the obtained time-specific point clouds, and determines an installation abnormality of the sensing device based on static point clouds having a predetermined time difference, among the time-specific static point clouds corresponding to the determined static object region.
- FIG. 1 is a view for explaining a state in which a sensing device is installed and a state in which a change occurs in an installation state of the sensing device.
- FIG. 2 is a diagram for explaining the configuration and operation of a sensing device according to an exemplary embodiment.
- FIG. 3 is a diagram for explaining a dynamic point cloud, a static point cloud, and a static object region in a spatial information map generated by a sensing device.
- FIG. 4 is a diagram for explaining a voxel map and voxels corresponding to unit regions of the voxel map.
- FIG. 5 is a diagram for explaining a process of distinguishing static voxels from dynamic voxels.
- FIG. 6 is a diagram for explaining a process of determining an installation abnormality of a sensing device.
- FIG. 7 is a diagram for explaining the configuration and operation of a server according to an exemplary embodiment.
- FIG. 8 is a flowchart illustrating a method of detecting an installation abnormality of a sensing device according to an exemplary embodiment.
- FIG. 9 is a detailed flowchart illustrating a process of determining an installation abnormality of a sensing device.
- The present embodiments relate to a method for detecting an installation abnormality of a sensing device and a sensing device for performing the same; detailed descriptions of matters widely known to those skilled in the art to which the following embodiments belong are omitted.
- FIG. 1 is a view for explaining a state in which a sensing device 100 is installed and a state in which a change occurs in an installation state of the sensing device 100 .
- the sensing device 100 is a device capable of obtaining point cloud data as spatial information about a 3D space, and may include at least one sensor.
- the sensing device 100 may emit light in the 3D space and obtain a point cloud for the 3D space based on light received in response thereto.
- the sensing device 100 may be installed indoors or outdoors where it can sense a space corresponding to a region of interest or as wide a space as possible. For example, as shown in FIG. 1 , the sensing device 100 may be installed by being fixed to a structure.
- the sensing device 100 may include a LIDAR (Light Detection and Ranging) sensor as a 3D sensor for sensing a 3D space, and may obtain volumetric point cloud data.
- the sensing device 100 may further include various types of sensors such as a radar sensor, an infrared image sensor, and a camera, as needed.
- The sensing device 100 may use a plurality of sensors of the same type, or a combination of different types of sensors, in consideration of the sensing range of each sensor type and the type of data each can obtain.
- LiDAR sensors cannot detect the color of objects in a 3D space, but can detect their shape, size, and position.
- A multi-channel lidar sensor that collects information about a 3D space is therefore suitable for applications that do not require the color or texture of an object but can make use of its approximate shape, size, and volume.
- A physical impact may be applied to the sensing device 100 installed on a structure, or the support or connection portion of the structure may age until it can no longer hold the sensing device 100 in place. For these reasons, a change may occur in the installation state of the sensing device 100.
- When the installation state changes, the point cloud of the 3D space acquired by the sensing device 100 changes as well.
- If the changed point cloud is used as-is, equipment relying on the sensing device 100 may be erroneously controlled, or services may not be provided.
- To detect such a change in the installation state, an inertial sensor or the like may be additionally mounted.
- However, it may be difficult to additionally mount a device such as an inertial sensor on the sensing device 100, and when the installation state changes little by little over a long period of time, such a device may fail to detect the installation abnormality of the sensing device 100.
- FIG. 2 is a diagram for explaining the configuration and operation of the sensing device 100 according to an exemplary embodiment.
- a sensing device 100 may include a memory 110 , a processor 120 , a sensor unit 130 , and a communication interface 140 .
- The processor 120 may control the overall operation of the sensing device 100.
- The sensor unit 130 may sense the 3D space and pass the sensed signal to the processor 120.
- The communication interface 140 may be used to communicate with external devices.
- Those skilled in the art related to the present embodiment may know that other general-purpose components may be further included in addition to the components shown in FIG. 2 .
- Memory 110 may store software and/or programs.
- the memory 110 may store instructions executable by the processor 120 .
- the processor 120 may access and use data stored in the memory 110 or store new data in the memory 110 .
- the processor 120 may execute instructions stored in the memory 110 .
- the processor 120 may execute a computer program installed in the sensing device 100 .
- the processor 120 may store and execute a computer program or application received from the outside in the memory 110 .
- the processor 120 may execute at least one processing module to perform a predetermined operation.
- the processor 120 may execute or control a processing module that executes a program for detecting an installation abnormality of the sensing device 100 .
- the processor 120 may control other elements included in the sensing device 100 to perform an operation corresponding to an execution result of a command or a computer program.
- the sensor unit 130 may include at least one sensor for sensing a 3D space.
- The sensor unit 130 may include a light emitter that emits light into the 3D space and a light receiver that receives the reflected light, and may further include a dedicated processor that obtains a point cloud of the 3D space based on the intensity of light received by the light receiver. The sensor unit 130 may obtain a point cloud for each time point of the 3D space in order to track objects located in the 3D space within its sensing range.
- The sensor unit 130 may include at least one 3D LIDAR (Light Detection and Ranging) sensor and may obtain data about a space within a predetermined range.
- the sensor unit 130 may further include various types of sensors such as a radar sensor, an infrared image sensor, and an ultrasonic sensor.
- The sensor unit 130 may further include at least one of a radar sensor and an ultrasonic sensor to obtain data about a blind area that the lidar sensor cannot detect, or about a nearby space within a predetermined distance of the sensing device 100.
- the communication interface 140 may perform wired/wireless communication with other devices or networks.
- the communication interface 140 may include a communication module supporting at least one of various wired/wireless communication methods.
- a communication module that performs short-range communication such as RFID (Radio Frequency Identification), NFC (Near Field Communication), Bluetooth, various types of wireless communication, or wired communication using a coaxial cable or optical cable may be included.
- the communication interface 140 may be connected to a device located outside the sensing device 100 to transmit/receive signals or data.
- Through the communication interface 140, the sensing device 100 may communicate with a manager's terminal device, or may be connected to an external server that uses the time-specific point clouds of the 3D space obtained by the sensing device 100 to control devices or to provide certain services.
- For example, the processor 120 may execute one or more instructions stored in the memory 110 to obtain a point cloud of a 3D space using the lidar sensor and, based on the obtained point cloud, identify objects in the 3D space.
- For example, the processor 120 may apply the point cloud of the 3D space to an object classification model, or cluster the point cloud, to identify static objects such as the ground or buildings and dynamic objects such as animals.
- the processor 120 may obtain a point cloud for each time of the 3D space by using a LIDAR sensor.
- the processor 120 may detect an installation abnormality of the sensing device 100 based on the obtained point cloud for each time in the 3D space, in a manner described below.
- the processor 120 may determine a static object region of the 3D space based on the obtained point cloud for each time of the 3D space by executing one or more instructions stored in the memory 110 .
- the processor 120 may generate a spatial information map of the 3D space by detecting an object in the 3D space based on the acquired point cloud of the 3D space. In this case, the processor 120 may classify objects in the 3D space into static objects and dynamic objects.
- In a spatial information map generated based on the time-specific point clouds of a 3D space, a dynamic object is detected and then disappears, or changes its location within the map, while a static object is continuously detected at the same location in the map.
- FIG. 3 is a diagram for explaining a dynamic point cloud, a static point cloud, and a static object region in a spatial information map generated by the sensing device 100.
- Referring to FIG. 3, a spatial information map generated based on the time-specific point clouds of a 3D space acquired using a lidar sensor is shown.
- The spatial information map of FIG. 3 is obtained by installing the sensing device 100 at a location overlooking an intersection with moving vehicles; using its lidar sensor, the sensing device 100 can monitor traffic information at the intersection, which is the area of interest.
- In the spatial information map of FIG. 3, point clouds corresponding to vehicles and pedestrians that move and stop according to the traffic signal at the intersection can be observed. Since vehicles and pedestrians move, they correspond to dynamic objects, and the dynamic point clouds corresponding to them change position within the spatial information map and are not continuously detected in the same area for more than a certain period. On the other hand, traffic infrastructure, roads, buildings, and the like at the intersection do not move and are therefore static objects; the static point clouds corresponding to them can be continuously detected in the same area of the spatial information map over a certain period.
- When a point cloud is continuously detected for a certain period or longer in a specific area corresponding to specific coordinates, it may be classified as a static point cloud corresponding to a static object. Conversely, when a point cloud is detected in a specific area for less than the certain period, or appears and then disappears, it may be classified as a dynamic point cloud corresponding to a dynamic object.
- The processor 120 may classify point clouds into dynamic point clouds corresponding to dynamic objects and static point clouds corresponding to static objects, based on the period during which point clouds in the same region of the spatial information map are continuously detected.
- The processor 120 may remove the dynamic point clouds from all point clouds corresponding to the 3D space, or extract only the static point clouds, and may determine some or all of the result as the static object region. That is, the processor 120 may determine the static object region based on the continuity of point clouds for the same region across frames of the spatial information map.
- The processor 120 may determine the static object region based on the period during which point clouds within unit areas at mutually corresponding positions are continuously detected across frames of the spatial information map generated from the time-specific point clouds of the 3D space obtained from the sensor unit 130.
- The processor 120 may determine a static object region composed of static point clouds in a reference frame of the spatial information map at a specific time point. For example, the processor 120 may determine the static object region in a frame of the spatial information map corresponding to a time point after the sensing device 100 has been installed and a predetermined period, required to distinguish dynamic point clouds from static point clouds, has elapsed.
- The processor 120 may determine the static object region based on the period during which point clouds within unit areas at mutually corresponding positions across frames of the spatial information map are continuously detected with a point count equal to or greater than a minimum detection threshold. Even if point clouds are continuously detected in the same area, they may contain noise; counting only cases in which the continuously detected point cloud maintains a point count at or above the minimum detection threshold filters out noise that might otherwise be classified as a static point cloud.
- The minimum detection threshold may be set to an appropriate value according to the environment in which the sensing device 100 is installed or according to weather conditions. The minimum detection threshold may be input directly by a user, or may be set automatically to an appropriate value when the sensing device 100 receives information about the external environment or weather conditions.
- The processor 120 may determine a static object region of the 3D space based on the point cloud of a designated area among the time-specific point clouds of the 3D space obtained from the sensor unit 130.
- That is, instead of using the entire acquired time-specific point cloud of the 3D space, the sensing device 100 may determine the static object region of the 3D space based on the point cloud of a designated reference region in which dynamic point clouds rarely occur.
- The reference region may be designated by a user, or may be designated automatically as a region that a model trained by the sensing device 100 determines to be the region of a static point cloud corresponding to a static object of appropriate height and size.
- The processor 120 may generate a normalized map representing spatial information, such as a voxel map or a depth map, from the obtained time-specific point clouds of the 3D space.
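- As an illustration only (not taken from this document), the sketch below shows one plausible way to voxelize time-stamped point clouds into frames of a voxel map; the voxel size, the use of NumPy, and the dictionary layout are all assumptions made for the example:

```python
import numpy as np
from collections import defaultdict

def voxelize(points: np.ndarray, voxel_size: float = 0.5) -> dict:
    """Map an (N, 3) array of XYZ points to integer voxel indices and
    count how many points fall into each voxel."""
    indices = np.floor(points / voxel_size).astype(np.int64)
    counts = defaultdict(int)
    for idx in map(tuple, indices):
        counts[idx] += 1
    return dict(counts)  # {(ix, iy, iz): point count}

# One frame of the voxel map per acquisition time:
# frames = [voxelize(cloud) for cloud in point_clouds_by_time]
```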
- When the spatial information map is a voxel map, the processor 120 may distinguish static voxels from dynamic voxels in each frame of the voxel map, and may determine a static object region composed of static voxels in a reference frame of the voxel map at a specific time point.
- FIG. 4 is a diagram for explaining a voxel map and voxels corresponding to unit regions of the voxel map.
- a voxel map corresponding to a 3D space monitored by the sensing device 100 is shown.
- a voxel map is composed of voxel arrays, and a voxel array may be composed of voxels.
- a voxel corresponds to a unit area of a voxel map, and each voxel may include a point cloud corresponding to a 3D space.
- When the point cloud existing in a voxel is a dynamic point cloud, the voxel is referred to as a dynamic voxel; when the point cloud existing in a voxel is a static point cloud, the voxel is referred to as a static voxel.
- FIG. 5 is a diagram for explaining a process of distinguishing static voxels from dynamic voxels.
- The processor 120 may distinguish static voxels, in which the period during which a point cloud within the voxel is continuously detected is equal to or greater than a threshold value, from dynamic voxels, in which that period is less than the threshold value.
- For a specific voxel of the voxel map, if a point cloud within the voxel is continuously detected for a certain period or longer, the point cloud corresponds to a static object, and the voxel may be classified as a static voxel.
- If a point cloud within the voxel is detected for less than the certain period, or appears and then disappears, the point cloud is a dynamic point cloud corresponding to a dynamic object, and the voxel may be classified as a dynamic voxel.
- In FIG. 5, the first voxel (Voxel 1) can be classified as a static voxel because a point cloud of at least two points is continuously detected from time T-10 up to time T-1, whereas the second voxel (Voxel 2) can be classified as a dynamic voxel because a point cloud is detected only temporarily, at time points such as T-9 and T-1.
- In this way, the processor 120 may classify each voxel containing a point cloud as a dynamic voxel or a static voxel, based on the period during which point clouds in the same voxel of the voxel map are continuously detected.
- the processor 120 may remove dynamic voxels from all voxels of the voxel map or extract only static voxels from all voxels of the voxel map, thereby determining at least some or all of them as the static object region. That is, the processor 120 may determine the static object region based on the continuity of the point group for the voxel at the same position in the frames of the voxel map.
- The processor 120 may determine the static object region based on the period during which point clouds in mutually corresponding voxels are continuously detected across frames of the voxel map generated from the time-specific point clouds of the 3D space obtained from the sensor unit 130. The processor 120 may determine a static object region composed of static voxels in a reference frame of the voxel map at a specific time point.
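- A minimal sketch of this voxel classification, reusing the `voxelize` frames from the previous snippet; the persistence threshold and minimum point count are illustrative assumptions, not values specified in this document:

```python
def split_static_dynamic(frames, persist_threshold=10, min_points=2):
    """frames: list of {(ix, iy, iz): point count} dicts, one per time step.
    A voxel counts as static once it has been detected with at least
    `min_points` points in `persist_threshold` consecutive frames."""
    run_length = {}        # current consecutive-detection run per voxel
    static_voxels, all_voxels = set(), set()
    for frame in frames:
        detected = {v for v, n in frame.items() if n >= min_points}
        all_voxels |= set(frame)
        # extend runs for voxels detected in this frame; others reset to 0
        run_length = {v: run_length.get(v, 0) + 1 for v in detected}
        static_voxels |= {v for v, r in run_length.items()
                          if r >= persist_threshold}
    return static_voxels, all_voxels - static_voxels
```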
- FIG. 6 is a diagram for explaining a process of determining an installation abnormality of a sensing device.
- The processor 120 may determine an installation abnormality of the sensing device 100 based on static point clouds having a predetermined time difference, among the time-specific static point clouds corresponding to the static object region.
- the static object area may be determined from a reference frame at an arbitrary point in time when it is confirmed that the sensing device 100 is normally installed. For example, it may be determined from a frame of a spatial information map at a certain point in the past after the sensing device 100 is normally installed in a structure.
- The processor 120 may extract, from the time-specific static point clouds corresponding to the determined static object region, a first static point cloud at a first time point and a second static point cloud at a second time point having the predetermined time difference from the first time point.
- the processor 120 may determine whether the sensing device 100 has an installation error based on a difference between the first static point cloud and the second static point cloud.
- In FIG. 6, the first static point cloud and the second static point cloud corresponding to the static object region are indicated by hatching.
- The processor 120 may compare the first static point cloud corresponding to the static object region at the first time point with the second static point cloud corresponding to the static object region at the second time point.
- The processor 120 may determine whether the sensing device 100 has an installation abnormality by comparing the ratio of the area of the first static point cloud to the static object region with the ratio of the area of the second static point cloud to the static object region. When the time point at which the static object region was determined is the first time point, as shown in FIG. 6, the first static point cloud is detected evenly across the static object region, while the second static point cloud is detected in only part of it. In FIG. 6, the time point for determining the static object region is an arbitrary past time point at which the sensing device 100 was normally installed, but it is not limited thereto.
- Alternatively, whether the sensing device 100 has an installation abnormality may be determined by comparing a first static point cloud at the current time point with a second static point cloud at a past time point.
- In the voxel map, the processor 120 may determine whether the sensing device 100 has an installation abnormality by comparing, against the total number of voxels constituting the determined static object region, the ratio of the number of voxels constituting the first static point cloud at the first time point with the ratio of the number of voxels constituting the second static point cloud at the second time point.
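- A sketch of this ratio comparison under the same assumptions as above; the decision threshold `max_ratio_change` is a hypothetical parameter, not one specified in this document:

```python
def installation_abnormal(static_region, voxels_t1, voxels_t2,
                          max_ratio_change=0.2):
    """static_region: set of voxels forming the static object region.
    voxels_t1 / voxels_t2: static voxels detected at the first and
    second time points. Flags an abnormality when the occupied
    fraction of the static region changes by more than the threshold."""
    total = len(static_region)
    if total == 0:
        return False  # no static region to compare against
    ratio_t1 = len(voxels_t1 & static_region) / total
    ratio_t2 = len(voxels_t2 & static_region) / total
    return abs(ratio_t1 - ratio_t2) > max_ratio_change
```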
- By executing one or more instructions, the processor 120 may transmit the result of determining the installation abnormality of the sensing device 100, through the communication interface 140, to a server that uses the point clouds of the 3D space obtained by the sensing device 100 for device control or to provide a certain service.
- the server 200 may be replaced by a computing device, operator, console device, or the like.
- The sensing device 100 may be equipped with a processor that obtains point clouds for each time point as spatial information about a 3D space, detects static or dynamic objects in the 3D space, and tracks the detected objects, but is not limited thereto.
- The server 200, which utilizes the time-specific point clouds of the 3D space obtained by the sensing device 100 for device control or to provide a predetermined service, may receive them instead, detect static or dynamic objects in the 3D space, and perform the series of processing steps needed to track the detected objects.
- the server 200 may be implemented with technology such as cloud computing.
- the server 200 may perform high-speed data communication with the sensing device 100 .
- the server 200 may include a memory 210 , a processor 220 , and a communication interface 230 .
- Those skilled in the art related to the present embodiment may know that other general-purpose components may be further included in addition to the components shown in FIG. 7 .
- Even where the description is omitted below, the above description of the sensing device 100 may be applied as-is to the components of the same name in the server 200.
- Each component of the block diagram of FIG. 7 may be separated, added, or omitted according to the implementation method of the server 200 . That is, one component may be subdivided into two or more components, two or more components may be combined into one component, or some components may be added or removed depending on the implementation method.
- the memory 210 may store instructions executable by the processor 220 .
- Memory 210 may store software or programs.
- the processor 220 may execute instructions stored in the memory 210 .
- the processor 220 may perform overall control of the server 200 .
- the processor 220 may acquire information and requests received through the communication interface 230 and store the received information in a storage (not shown).
- the processor 220 may process received information.
- The processor 220 may obtain, from the information received from the sensing device 100, information used to control a device or to provide a predetermined service, or may process the received information for management and store it in a storage (not shown).
- In response to a request obtained from a manager's terminal, the processor 220 may use data or information stored in the storage (not shown) and transmit the corresponding information to the manager's terminal through the communication interface 230.
- the communication interface 230 may perform wired/wireless communication with other devices or networks.
- the communication interface 230 may be connected to a device located outside the server 200 to transmit/receive signals or data.
- the server 200 may communicate with the sensing device 100 through the communication interface 230 or may be connected to other servers connected through a network.
- the storage may store various software and information necessary for the server 200 to control devices or to provide predetermined services.
- the storage may store programs executed in the server 200, applications, and various data or information used for certain services.
- the server 200 may include a load balancer server and function servers providing predetermined services.
- the server 200 may be composed of a plurality of servers divided according to functions or may be an integrated server.
- The server 200 may acquire the time-specific point clouds of the 3D space from the sensing device 100 through the communication interface 230, or may receive the result of detecting an installation abnormality of the sensing device 100.
- the server 200 When the server 200 is connected to a plurality of sensing devices 100 and a result of detecting an installation abnormality is received from one of the sensing devices 100, the server 200 detects an installation abnormality. Reception of information or data received from the sensing device 100 may be blocked or deleted.
- FIG. 8 is a flowchart illustrating a method of detecting an installation abnormality of the sensing device 100 according to an exemplary embodiment. A detailed description of the contents overlapping with the above description of the sensing device 100 will be omitted.
- the sensing device 100 may obtain a point cloud for each time in the 3D space using the LIDAR sensor.
- the sensing device 100 may continuously acquire point clouds in a 3D space.
- the sensing device 100 may determine a static object region in the 3D space based on the obtained point cloud for each time of the 3D space.
- the sensing device 100 may generate a spatial information map from the acquired time-specific point cloud for the 3D space.
- the spatial information map may be a normalized map representing spatial information, such as a voxel map or a depth map.
- the sensing device 100 may determine a static object area based on a period in which point clouds within a unit area at positions corresponding to each other are continuously detected between frames of the generated spatial information map.
- the sensing device 100 may determine a static object area based on a period in which point clouds within a unit area corresponding to each other between frames of the spatial information map are continuously detected with a number of points equal to or greater than a minimum detection threshold.
- When the spatial information map is a voxel map, the sensing device 100 may distinguish static voxels, in which the period during which a point cloud within the voxel is continuously detected is equal to or greater than a threshold value, from dynamic voxels, in which that period is less than the threshold value, and may determine a static object region composed of static voxels in a reference frame of the voxel map at a specific time point.
- the sensing device 100 may determine a static object region in the 3D space based on a point cloud of a designated area among the obtained point clouds for each time in the 3D space.
- That is, instead of using the entire acquired time-specific point cloud of the 3D space, the sensing device 100 may determine the static object region of the 3D space based on the point cloud of a designated reference region in which dynamic point clouds rarely occur.
- The reference region may be designated by a user, or may be designated automatically as a region that a model learned by the sensing device 100 determines to be the region of a static point cloud corresponding to a static object of appropriate height and size.
- the sensing device 100 may determine an installation abnormality of the sensing device 100 based on a static point cloud having a predetermined time difference among static point clouds by time corresponding to the determined static object area. This will be described in detail with reference to FIG. 9 below.
- FIG. 9 is a detailed flowchart for explaining the process of determining an installation abnormality of the sensing device 100; it describes step 830 of FIG. 8 in detail.
- In step 910, the sensing device 100 may extract, from the time-specific static point clouds corresponding to the determined static object region, a first static point cloud at a first time point and a second static point cloud at a second time point having a predetermined time difference from the first time point.
- The sensing device 100 may then determine whether it has an installation abnormality based on the difference between the first static point cloud and the second static point cloud.
- The sensing device 100 may determine whether it has an installation abnormality by comparing the ratio of the area of the first static point cloud to the determined static object region with the ratio of the area of the second static point cloud to that region.
- In the voxel map generated from the obtained time-specific point clouds of the 3D space, the sensing device 100 may determine whether it has an installation abnormality by comparing, against the total number of voxels constituting the determined static object region, the ratio of the number of voxels constituting the first static point cloud with the ratio of the number of voxels constituting the second static point cloud.
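- Putting the steps of FIGS. 8 and 9 together, a hedged end-to-end sketch using the helper functions defined in the earlier snippets; the choice of the first and last frames as the two comparison time points, like every parameter value here, is an illustrative assumption:

```python
def detect_installation_abnormality(point_clouds_by_time, voxel_size=0.5):
    """point_clouds_by_time: list of (N, 3) point arrays, one per time step."""
    # Step 810: build one voxel-map frame per acquisition time.
    frames = [voxelize(pc, voxel_size) for pc in point_clouds_by_time]
    # Step 820: determine the static object region from voxel persistence.
    static_region, _ = split_static_dynamic(frames)
    # Step 830 (detailed in FIG. 9): compare the static voxels observed
    # at two time points separated by a predetermined time difference.
    voxels_t1 = set(frames[0]) & static_region
    voxels_t2 = set(frames[-1]) & static_region
    return installation_abnormal(static_region, voxels_t1, voxels_t2)
```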
- Meanwhile, the sensing device 100 may be installed not only on a non-moving structure but also on a moving object.
- When the sensing device 100 is installed on a vehicle or a drone, the sensed 3D space changes as the vehicle or drone moves, so the location of a static object in the 3D space may change on the spatial information map.
- In this case, a point cloud for the same space may be obtained based on the location information of the vehicle or drone.
- Similarly, when the structure on which the sensing device 100 is installed rotates periodically by a predetermined angle in order to monitor a wider space, the sensing device 100 may treat the point clouds of the 3D space acquired at the same angle as point clouds for the same space.
- Each of the above-described embodiments may be provided in the form of a computer program or application, stored in a medium, that causes at least one processor of the sensing device 100 to perform predetermined steps of the method for detecting an installation abnormality.
- the above-described embodiments may be implemented in the form of a computer-readable storage medium storing instructions and data executable by a computer or processor. At least one of instructions and data may be stored in the form of a program code, and when executed by a processor, a predetermined program module may be generated to perform a predetermined operation.
- Such computer-readable storage media include read-only memory (ROM), random-access memory (RAM), flash memory, CD-ROMs, CD-Rs, CD+Rs, CD-RWs, CD+RWs, DVD-ROMs, DVD-Rs, DVD+Rs, DVD-RWs, DVD+RWs, DVD-RAMs, BD-ROMs, BD-Rs, BD-R LTHs, BD-REs, magnetic tapes, floppy disks, magneto-optical data storage devices, optical data storage devices, hard disks, and solid-state disks (SSDs); the medium may be any device capable of storing instructions or software, associated data, data files, and data structures, and of providing them to a processor or computer so that the instructions can be executed.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Remote Sensing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Radar, Positioning & Navigation (AREA)
- Theoretical Computer Science (AREA)
- Electromagnetism (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Geometry (AREA)
- Software Systems (AREA)
- Computer Graphics (AREA)
- Optical Radar Systems And Details Thereof (AREA)
- Astronomy & Astrophysics (AREA)
Claims (17)
- A method for detecting an installation abnormality of a sensing device, the method comprising: obtaining a point cloud for each time point of a 3D space using a lidar sensor; determining a static object region of the 3D space based on the obtained time-specific point clouds of the 3D space; and determining an installation abnormality of the sensing device based on static point clouds having a predetermined time difference, among the time-specific static point clouds corresponding to the determined static object region.
- The method of claim 1, wherein the determining of the static object region of the 3D space comprises determining the static object region based on a period during which point clouds within unit areas at mutually corresponding positions are continuously detected across frames of a spatial information map generated from the obtained time-specific point clouds of the 3D space.
- The method of claim 2, wherein the determining of the static object region of the 3D space comprises determining the static object region based on a period during which the point clouds within the unit areas at mutually corresponding positions across the frames of the spatial information map are continuously detected with a point count equal to or greater than a minimum detection threshold.
- The method of claim 2, wherein the determining of the static object region of the 3D space comprises, when the spatial information map is a voxel map, distinguishing, in each frame of the voxel map, static voxels in which a period during which a point cloud within the voxel is continuously detected is equal to or greater than a threshold value from dynamic voxels in which that period is less than the threshold value, and determining the static object region composed of the static voxels in a reference frame of the voxel map at a specific time point.
- The method of claim 1, wherein the determining of the static object region of the 3D space comprises determining the static object region of the 3D space based on a point cloud of a designated region among the obtained time-specific point clouds of the 3D space.
- The method of claim 1, wherein the determining of the installation abnormality of the sensing device comprises: extracting, from the time-specific static point clouds corresponding to the determined static object region, a first static point cloud at a first time point and a second static point cloud at a second time point having the predetermined time difference from the first time point; and determining whether the sensing device has an installation abnormality based on a difference between the first static point cloud and the second static point cloud.
- The method of claim 6, wherein the determining whether the sensing device has an installation abnormality comprises comparing a ratio of an area of the first static point cloud to the determined static object region with a ratio of an area of the second static point cloud to the determined static object region.
- The method of claim 6, wherein the determining whether the sensing device has an installation abnormality comprises comparing, in a voxel map generated from the obtained time-specific point clouds of the 3D space, a ratio of the number of voxels constituting the first static point cloud to the total number of voxels constituting the determined static object region with a ratio of the number of voxels constituting the second static point cloud to that total number.
- A computer-readable storage medium storing a program to be executed on a computer, the program comprising: instructions for obtaining a point cloud for each time point of a 3D space using a lidar sensor; instructions for determining a static object region of the 3D space based on the obtained time-specific point clouds of the 3D space; and instructions for determining an installation abnormality of a sensing device based on static point clouds having a predetermined time difference, among the time-specific static point clouds corresponding to the determined static object region.
- A sensing device comprising: a sensor unit configured to obtain a point cloud for each time point of a 3D space using a lidar sensor; a memory storing one or more instructions; and a processor configured to, by executing the one or more instructions, determine a static object region of the 3D space based on the obtained time-specific point clouds of the 3D space, and determine an installation abnormality of the sensing device based on static point clouds having a predetermined time difference, among the time-specific static point clouds corresponding to the determined static object region.
- The sensing device of claim 10, wherein the processor, by executing the one or more instructions, determines the static object region based on a period during which point clouds within unit areas at mutually corresponding positions are continuously detected across frames of a spatial information map generated from the obtained time-specific point clouds of the 3D space.
- The sensing device of claim 11, wherein the processor, by executing the one or more instructions, distinguishes, in each frame of the voxel map, static voxels in which the continuously detected period is equal to or greater than a threshold value from dynamic voxels in which that period is less than the threshold value, and determines the static object region composed of the static voxels in a reference frame of the voxel map at a specific time point.
- The sensing device of claim 11, wherein the processor, by executing the one or more instructions, when the spatial information map is a voxel map, distinguishes, in each frame of the voxel map, static voxels in which a period during which a point cloud within the voxel is continuously detected is equal to or greater than a threshold value from dynamic voxels in which that period is less than the threshold value, and determines the static object region composed of the static voxels in a reference frame of the voxel map at a specific time point.
- The sensing device of claim 10, wherein the processor, by executing the one or more instructions, determines the static object region of the 3D space based on a point cloud of a designated region among the obtained time-specific point clouds of the 3D space.
- The sensing device of claim 10, wherein the processor, by executing the one or more instructions, extracts, from the time-specific static point clouds corresponding to the determined static object region, a first static point cloud at a first time point and a second static point cloud at a second time point having the predetermined time difference from the first time point, and determines whether the sensing device has an installation abnormality based on a difference between the first static point cloud and the second static point cloud.
- The sensing device of claim 15, wherein the processor, by executing the one or more instructions, determines whether the sensing device has an installation abnormality by comparing a ratio of an area of the first static point cloud to the determined static object region with a ratio of an area of the second static point cloud to the determined static object region.
- The sensing device of claim 15, wherein the processor, by executing the one or more instructions, determines whether the sensing device has an installation abnormality by comparing, in a voxel map generated from the obtained time-specific point clouds of the 3D space, a ratio of the number of voxels constituting the first static point cloud to the total number of voxels constituting the determined static object region with a ratio of the number of voxels constituting the second static point cloud to that total number.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP22861629.8A EP4394440A1 (en) | 2021-08-26 | 2022-08-18 | Method for sensing installation abnormality of sensing device, and sensing device for performing same |
US18/585,611 US20240192342A1 (en) | 2021-08-26 | 2024-02-23 | Method for sensing installation abnormality of sensing device, and sensing device for performing same |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2021-0113404 | 2021-08-26 | ||
KR1020210113404A KR102656646B1 (ko) | 2021-08-26 | | Method for detecting an installation abnormality of a sensing device, and sensing device for performing the same |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/585,611 Continuation US20240192342A1 (en) | 2021-08-26 | 2024-02-23 | Method for sensing installation abnormality of sensing device, and sensing device for performing same |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023027419A1 (ko) | 2023-03-02 |
Family
ID=85323268
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2022/012338 WO2023027419A1 (ko) | Method for detecting an installation abnormality of a sensing device, and sensing device for performing the same | 2021-08-26 | 2022-08-18 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20240192342A1 (ko) |
EP (1) | EP4394440A1 (ko) |
KR (1) | KR102656646B1 (ko) |
WO (1) | WO2023027419A1 (ko) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20180036753A (ko) * | 2015-08-04 | 2018-04-09 | 바이두 온라인 네트웍 테크놀러지 (베이징) 캄파니 리미티드 | Laser-point-cloud-based urban road recognition method, apparatus, storage medium, and device |
US20190086546A1 (en) * | 2016-03-14 | 2019-03-21 | Imra Europe S.A.S. | Processing method of a 3d point cloud |
KR102140973B1 (ko) * | 2019-03-27 | 2020-09-14 | 주식회사 이에스피 | Mine tunnel collapse monitoring system using LiDAR |
KR20210037469A (ko) * | 2019-09-27 | 2021-04-06 | (주)서울로보틱스 | Vehicle and method for generating a map corresponding to a three-dimensional space |
KR20210103172A (ko) * | 2020-02-13 | 2021-08-23 | 공주대학교 산학협력단 | Crime prevention and security system |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11995763B2 (en) * | 2018-07-02 | 2024-05-28 | Vayyar Imaging Ltd. | System and methods for environment mapping |
KR102647928B1 (ko) * | 2018-11-16 | 2024-03-15 | 현대모비스 주식회사 | Apparatus and method for determining a mounting error of an object detection sensor |
CN111238494B (zh) * | 2018-11-29 | 2022-07-19 | 财团法人工业技术研究院 | Vehicle, vehicle positioning system, and vehicle positioning method |
- 2021-08-26: KR KR1020210113404A patent/KR102656646B1/ko (active, IP Right Grant)
- 2022-08-18: WO PCT/KR2022/012338 patent/WO2023027419A1/ko (active, Application Filing)
- 2022-08-18: EP EP22861629.8A patent/EP4394440A1/en (active, Pending)
- 2024-02-23: US US18/585,611 patent/US20240192342A1/en (active, Pending)
Also Published As
Publication number | Publication date |
---|---|
EP4394440A1 (en) | 2024-07-03 |
KR102656646B1 (ko) | 2024-04-12 |
KR20230031035A (ko) | 2023-03-07 |
US20240192342A1 (en) | 2024-06-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AU2019419781B2 (en) | Vehicle using spatial information acquired using sensor, sensing device using spatial information acquired using sensor, and server | |
WO2019240452A1 (ko) | Method and system for automatically collecting and updating information related to points of interest in real space | |
WO2013058428A1 (ko) | Multi-mode obstacle detection method using radar, and apparatus therefor | |
WO2020141694A1 (en) | Vehicle using spatial information acquired using sensor, sensing device using spatial information acquired using sensor, and server | |
WO2021162205A1 (ko) | Collision accident prevention method, apparatus, server, and computer program | |
EP2965041A1 (en) | Crowd sourced pathway maps | |
WO2023027421A1 (ko) | Method and sensing device for monitoring a region of interest in a work space | |
CN109284801B (zh) | Traffic light state recognition method and apparatus, electronic device, and storage medium | |
WO2021060778A1 (ko) | Vehicle and method for generating a map corresponding to a three-dimensional space | |
WO2018146762A1 (ja) | Information processing device, in-vehicle device, information processing method, communication method, information processing program, and communication program | |
WO2021235682A1 (en) | Method and device for performing behavior prediction by using explainable self-focused attention | |
WO2012011624A1 (ko) | Wi-Fi radio map-based indoor navigation system and method utilizing user mobility for location estimation | |
WO2023027419A1 (ko) | Method for detecting an installation abnormality of a sensing device, and sensing device for performing the same | |
WO2024140083A1 (zh) | Loading rate measurement method, apparatus, device, and medium | |
WO2021060599A1 (ko) | Vehicle and method for detecting lanes | |
CN112990168B (zh) | Illegal land use monitoring method and system | |
WO2009157653A2 (ko) | Location determination method and apparatus using RFID | |
WO2021167312A1 (ko) | Touch recognition method and device using a lidar sensor | |
KR101986015B1 (ko) | Multi-sensor simulation device and multi-sensor simulation method | |
WO2023027422A1 (ko) | Method and sensing device for monitoring a region of interest in a three-dimensional space | |
Kang et al. | Automated parking lot management system using embedded robot type smart car based on wireless sensors | |
CN115061386B (zh) | Automated simulation test system for intelligent driving and related devices | |
CN115623531A (zh) | Method for discovering and locating hidden surveillance devices using radio-frequency signals | |
WO2023132570A1 (ko) | Method and device for monitoring a work site | |
KR20160072361A (ko) | Method and device for analyzing infant behavioral development and ensuring safety through IoT- and image-processing-based location tracking | |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22861629; Country of ref document: EP; Kind code of ref document: A1 |
 | WWE | Wipo information: entry into national phase | Ref document number: 2024512021; Country of ref document: JP |
 | WWE | Wipo information: entry into national phase | Ref document number: 2022861629; Country of ref document: EP |
 | NENP | Non-entry into the national phase | Ref country code: DE |
 | ENP | Entry into the national phase | Ref document number: 2022861629; Country of ref document: EP; Effective date: 20240326 |