US10648795B2 - Distance measuring apparatus and distance measuring method - Google Patents
- Publication number
- US10648795B2 (application US15/700,346)
- Authority
- US
- United States
- Prior art keywords
- points
- edge noise
- point
- target
- region
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related, expires
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/14—Measuring arrangements characterised by the use of optical techniques for measuring distance or clearance between spaced objects or spaced apertures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/02—Details
- G01C3/06—Use of electric means to obtain final indication
- G01C3/08—Use of electric radiation detectors
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/42—Simultaneous measurement of distance and other co-ordinates
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/4802—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/4808—Evaluating distance, position or velocity data
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/483—Details of pulse systems
- G01S7/486—Receivers
- G01S7/487—Extracting wanted echo signals, e.g. pulse detection
- G01S7/4876—Extracting wanted echo signals, e.g. pulse detection by removing unwanted signals
Definitions
- the embodiments discussed herein are related to a distance measuring apparatus, a distance measuring method, and a computer-readable storage medium.
- a distance measuring apparatus that uses a laser beam launches the laser beam in the form of pulses (hereinafter referred to as a “pulse laser beam”), and detects the laser beam reflected from a target.
- a distance to the target is measured from a TOF (Time Of Flight) ΔT of the laser beam that is launched from the distance measuring apparatus, reflected by the target, and returned and received by the distance measuring apparatus.
- the distance to the target can be obtained from (c·ΔT)/2, where c denotes the speed of light, for example.
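As an illustrative sketch of this conversion (the function and constant names are my own, not from the patent):

```python
# Minimal sketch of the TOF-to-distance conversion (c * dT) / 2.
C = 299_792_458.0  # speed of light in m/s

def distance_from_tof(delta_t: float) -> float:
    """Return the one-way distance in metres for a round-trip
    time of flight delta_t given in seconds."""
    return (C * delta_t) / 2.0

# A round trip of roughly 66.7 ns corresponds to a target about 10 m away.
```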
- a measured result of the distance measuring apparatus may in some cases be output in the form of a range image (or depth map) in which the distance values at each of the range (or distance) measurement points are arranged in the order of the raster-scanned samples, for example.
- the range image may represent the distances to the range measurement points by pixels having colors according to the distances.
- a contour part of the target corresponds to a boundary part between the target and a background.
- at the boundary part, only a portion of the laser beam hits the target and is reflected by the target. For this reason, compared to the case in which the entire laser beam hits and is reflected by the target, an amount of the laser beam reflected from the boundary part is smaller, and a rise in an output signal waveform of a photodetector that detects the laser beam is more gradual.
- an amplitude of the output signal waveform of the photodetector that detects the laser beam reflected by the boundary part is lower compared to the case in which the entire laser beam hits and is reflected by the target, and a timing when the amplitude of the photodetector output exceeds a threshold value for judging the time when the laser beam reaches the target is delayed. Furthermore, compared to the case in which the entire laser beam hits and is reflected by the target, the measured TOF value becomes larger by an amount corresponding to the delay in the amplitude of the photodetector output exceeding the threshold value. For these reasons, the TOF value that is measured by detecting the laser beam reflected from the boundary part of the target becomes larger than the TOF value representing the actual distance from the boundary part.
- the noise caused by the boundary part of the target is referred to as “edge noise”.
- when a three-dimensional image of the target is generated based on the range image representing the distances to each of the range measurement points, for example, the edge noise appears behind the target in the three-dimensional image.
- in the conventional distance measuring apparatus, it is impossible to distinguish whether the output signal waveform of the photodetector represents the distance to the target or the distance from the boundary part of the target. Consequently, it is difficult to detect the edge noise from the range image representing the distances to each of the range measurement points.
- a distance measuring apparatus includes a sensor configured to make a two-dimensional scan by launching a pulse laser beam and to measure a distance to a target based on reflected light received from the target; a memory configured to store a program; and a processor configured to execute the program and perform a process that includes generating a difference binary image from a first range image that is generated in a state in which the target does not exist with respect to a background and represents distances to each of a plurality of range measurement points, and a second range image that is generated in a state in which the target exists with respect to the background and represents the distances to each of the plurality of range measurement points; extracting a first region having a size that is greater than or equal to a first threshold value, from a non-background region of the difference binary image made up of a plurality of non-background points; grouping adjacent points on the second range image into a plurality of groups of adjacent points having close distance values, for each of the points within the first region of the difference binary image, to extract a plurality of second regions respectively corresponding to each of the plurality of groups; and extracting a third region having a size that is less than or equal to a second threshold value, from the plurality of second regions, and outputting a judgment result which judges that each point within the third region is edge noise.
- FIG. 1 is a diagram illustrating an example of a distance measuring apparatus in one embodiment
- FIG. 2 is a block diagram illustrating an example of a computer
- FIG. 3 is a flow chart for explaining an example of a distance measuring process in a first embodiment
- FIG. 4 is a diagram for explaining an example of a process of step S 1 ;
- FIG. 5 is a diagram for explaining an example of a process of step S 2 ;
- FIG. 6 is a diagram for explaining an example of a process of step S 3 ;
- FIG. 7 is a diagram for explaining adjacent points having close distance values
- FIG. 8 is a diagram for explaining an example of a process of step S 4 ;
- FIG. 9 is a diagram for explaining an example of a 3-dimensional image generation of a target based on a current range image that is not subjected to a noise reduction process
- FIG. 10 is a diagram for explaining an example of a 3-dimensional image generation of the target based on a range image that is subjected to a noise reduction process
- FIG. 11 is a flow chart for explaining, in more detail, the example of the distance measuring process in the first embodiment
- FIG. 12 is a flow chart for explaining, in more detail, the example of the distance measuring process in the first embodiment
- FIG. 13 is a flow chart for explaining an example of the distance measuring process in a second embodiment
- FIG. 14 is a diagram for explaining an example of processes of steps S 41 to S 43 ;
- FIG. 15 is a diagram for explaining an example of a process of step S 41 ;
- FIG. 16 is a diagram for explaining an example of judging an inner side and an outer side of a contour line
- FIG. 17 is a diagram for explaining the example of judging the inner side and the outer side of the contour line
- FIG. 18 is a diagram for explaining an example of a process of step S 42 ;
- FIG. 19 is a diagram for explaining an example of a process of step S 43 ;
- FIG. 20 is a diagram for explaining an example of the noise reduction process.
- FIG. 21 is a flow chart for explaining, in more detail, the example of the distance measuring process in the second embodiment.
- the disclosed distance measuring apparatus, distance measuring method, and computer-readable storage medium measure a distance to a target based on reflected light received from the target in response to making a two-dimensional scan by launching a pulse laser beam.
- a distance measuring process includes generating a difference binary image from a first range image generated in a state in which the target does not exist with respect to a background and a second range image generated in a state in which the target exists with respect to the background, extracting a first region having a size greater than or equal to a first threshold value from a non-background region of the difference binary image, grouping adjacent points on the second range image into groups of adjacent points having close distance values to extract second regions, and extracting a third region having a size less than or equal to a second threshold value from the second regions and judging each point within the third region as the edge noise.
- FIG. 1 is a diagram illustrating an example of a distance measuring apparatus in one embodiment.
- the distance measuring apparatus illustrated in FIG. 1 has a sensor 1 including a light launch unit or device (hereinafter simply referred to as a “launch unit”) 2 and a light reception unit or device (hereinafter simply referred to as a “reception unit”) 3 , and a computer 4 .
- the launch unit 2 may have a known configuration including a control circuit 21 , a light emission circuit 22 , a laser light source 23 , a two-dimensional MEMS (Micro Electro Mechanical System) mirror 24 , and a scan angle magnification lens 25 , for example.
- the launch unit 2 is a hardware device.
- the control circuit 21 controls the light emission circuit 22 so that the laser light source 23 emits a pulse laser beam under the control of the light emission circuit 22 .
- the control circuit 21 controls a known drive part (not illustrated) that drives the two-dimensional MEMS mirror 24 two-dimensionally, so that the laser beam emitted from the laser light source 23 scans two-dimensionally (for example, in a horizontal direction and a vertical direction with respect to the ground surface).
- the laser beam makes a two-dimensional scan via the scan angle magnification lens 25 as indicated by a solid line in FIG. 1 .
- the laser beam reflected by a target 100 which is an example of an object to be measured, is received by the reception unit 3 as indicated by a dotted line in FIG. 1 .
- the laser light source 23 may have a known configuration to emit an infrared or near-infrared laser beam, for example.
- the reception unit 3 may have a known configuration including a reception lens 31 , a photodetector 32 , and a distance measuring circuit 33 .
- the reception unit 3 is a hardware device.
- the laser beam reflected by the target 100 is detected by the photodetector 32 via the reception lens 31 .
- a detection output of the photodetector 32 is supplied to the distance measuring circuit 33 .
- the distance measuring circuit 33 measures a TOF (Time Of Flight) ΔT of the laser beam that is launched from the distance measuring apparatus, reflected by the target 100 , and returned and received by the distance measuring apparatus, to optically measure the distance to the target 100 and output a signal indicating the measured distance.
- the distance to the target 100 can be obtained from (c·ΔT)/2, for example.
- the signal indicating the distances to each of range measurement points measured by the distance measuring circuit 33 is output from the reception unit 3 to the computer 4 .
- the range measurement points refer to points on the target 100 to be measured, where the laser beam launched from the sensor 1 reaches.
- the range measurement points include points on the target 100 , and points on a background within a scan range of the laser beam.
- the distances to the range measurement points refer to the distances from the sensor 1 to the range measurement points.
- the computer 4 generates a range image (or depth map) representing the distances to each of the range measurement points, based on the signal indicating the distances to each of the range measurement points.
- the computer 4 generates a range image representing the distances to each of the range measurement points by pixels having colors according to the distances, based on the signal indicating the distances to each of the range measurement points. As indicated by a dotted line in FIG. 1 , the computer 4 may set a light emission timing, a light emission power, or the like of the laser light source 23 that is controlled by the control circuit 21 of the launch unit 2 .
- the computer 4 may have a configuration illustrated in FIG. 2 , for example.
- FIG. 2 is a block diagram illustrating an example of the computer.
- the computer 4 illustrated in FIG. 2 includes a processor 41 , a memory 42 , an input device 43 , a display device 44 , and an interface (or communication device) 45 that are connected to each other via a bus 40 .
- the processor 41 may be formed by a CPU (Central Processing Unit) or the like, for example.
- the processor 41 executes one or more programs stored in the memory 42 , to control the entire computer 4 .
- the memory 42 may be formed by a computer-readable storage medium such as a semiconductor memory device, a magnetic recording medium, an optical recording medium, a magneto-optic recording medium, or the like, for example.
- the memory 42 stores various programs including a distance measuring program executed by the processor 41 , various data, or the like.
- the memory 42 may be formed by a non-transitory computer-readable storage medium that stores at least one program to be executed by the processor 41 .
- the input device 43 may be formed by a keyboard or the like that is operated by a user (or operator) to input commands and data to the processor 41 .
- the display device 44 may display messages, measured results of the distance measuring process, such as the range images, or the like with respect to the user.
- the interface 45 may communicably connect the computer 4 to another computer or the like. In this example, the computer 4 is connected to the control circuit 21 of the launch unit 2 via the interface 45 .
- the computer 4 is not limited to a hardware configuration in which constituent elements of the computer 4 are connected via the bus 40 as illustrated in FIG. 2 .
- the computer 4 may be formed by a general-purpose computer, for example.
- the input device 43 and the display device 44 of the computer 4 may be omitted.
- the output of the sensor 1 (that is, the output of the distance measuring circuit 33 ) may be connected to the bus 40 , or may be connected directly to the processor 41 .
- FIG. 3 is a flow chart for explaining an example of a distance measuring process in a first embodiment.
- the processor 41 of the computer 4 illustrated in FIG. 2 may execute a program stored in the memory 42 , to perform the distance measuring process illustrated in FIG. 3 .
- the processor 41 generates a difference binary image from two kinds of range images representing the distances to each of the range measurement points by pixels having colors according to the distance values, for example. More particularly, as illustrated in FIG. 4 , a difference binary image 103 is generated from a background range image 101 and a current range image 102 .
- FIG. 4 is a diagram for explaining an example of a process of step S 1 .
- the background range image 101 refers to a range image that is generated in a state in which no target exists with respect to the background, for example.
- the current range image 102 refers to a range image that is generated in a state in which the target 100 exists with respect to the background.
- the whiter the pixel the closer the distance is to a corresponding range measurement point, and the darker the halftone of the pixel, the farther the distance is to the corresponding range measurement point.
- a black pixel indicates a distance to the corresponding range measurement point that is outside a measurable range.
- in the difference binary image 103 , two different tone levels of the halftone are merely used to distinguish the two binary values, and do not indicate the distance values to each of the range measurement points as in the case of the tone levels of the halftone used in the background range image 101 and the current range image 102 described above.
- one point at a two-dimensional coordinate within the range image representing the distances to each of the range measurement points by the pixels having the colors according to the distance values may correspond to one pixel of the range image, for example, or may correspond to a plurality of pixels of the range image. Whether one such point corresponds to one pixel or to a plurality of pixels may be selected depending on a pulse period of the pulse laser beam, a resolution required of the range image, or the like, for example.
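The generation of the difference binary image in step S 1 can be sketched as follows, under simplifying assumptions not stated in the patent: the range images are 2-D lists of distance values, and a hypothetical `diff_threshold` parameter decides when a point counts as changed.

```python
def difference_binary_image(background, current, diff_threshold=0.1):
    """Sketch of step S 1: mark a point as non-background (1) when its
    distance value in the current range image differs from the background
    range image by more than diff_threshold; otherwise background (0)."""
    h, w = len(background), len(background[0])
    return [[1 if abs(current[y][x] - background[y][x]) > diff_threshold else 0
             for x in range(w)]
            for y in range(h)]

# Two points where a target has appeared become non-background points.
bg  = [[5.0, 5.0], [5.0, 5.0]]
cur = [[5.0, 2.0], [5.0, 2.1]]
diff = difference_binary_image(bg, cur)
```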
- step S 2 the processor 41 extracts a first region 100 A having a size that is greater than or equal to a first threshold value, from a non-background region of the difference binary image 103 made up of non-background points, as illustrated in FIG. 5 .
- FIG. 5 is a diagram for explaining an example of a process of step S 2 .
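The extraction of step S 2 can be sketched as a connected-component search over the non-background points, keeping only components at least as large as the first threshold. The 8-neighborhood and the size threshold `min_size` follow the description in the text; the function name is my own.

```python
from collections import deque

def extract_first_regions(binary, min_size):
    """Sketch of step S 2: group 8-adjacent non-background points (value 1)
    of the difference binary image into regions, and keep only regions whose
    point count is greater than or equal to min_size (the first threshold)."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    regions = []
    for y in range(h):
        for x in range(w):
            if binary[y][x] == 1 and not seen[y][x]:
                group, queue = [], deque([(y, x)])
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    group.append((cy, cx))
                    for dy in (-1, 0, 1):
                        for dx in (-1, 0, 1):
                            ny, nx = cy + dy, cx + dx
                            if (0 <= ny < h and 0 <= nx < w
                                    and binary[ny][nx] == 1 and not seen[ny][nx]):
                                seen[ny][nx] = True
                                queue.append((ny, nx))
                if len(group) >= min_size:
                    regions.append(group)
    return regions
```

An isolated non-background point smaller than `min_size` is discarded, which is how spurious single-point differences are filtered out before the later steps.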
- step S 3 the processor 41 groups adjacent points on the current range image 102 into groups of adjacent points having close distance values, for each of the points within the first region 100 A of the difference binary image 103 , to extract second regions 100 B respectively corresponding to each group of adjacent points having the close distance values, as illustrated in FIG. 6 .
- FIG. 6 is a diagram for explaining an example of a process of step S 3 .
- FIG. 6 illustrates each of the second regions 100 B by a different tone level of the halftone, however, the tone level does not indicate the distance value to each of the range measurement points.
- FIG. 7 is a diagram for explaining adjacent points having close distance values.
- in FIG. 7 , eight adjacent points are provided adjacent to a target point P 1 indicated by hatching, for example.
- the points having the close distance values refer to points having distance values with a difference that is less than or equal to a threshold value.
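The grouping of step S 3 can be sketched as a second connected-component pass, this time over one first region, where two 8-adjacent points join the same group only when their distance values differ by at most a threshold. The parameter `dist_threshold` is hypothetical; the patent only says the difference must be less than or equal to a threshold value.

```python
from collections import deque

def group_by_close_distance(range_image, region_points, dist_threshold):
    """Sketch of step S 3: within one first region (a list of (y, x) points),
    group 8-adjacent points whose distance values on the current range image
    differ by at most dist_threshold into the same second region."""
    region = set(region_points)
    seen = set()
    groups = []
    for start in region_points:
        if start in seen:
            continue
        seen.add(start)
        group, queue = [], deque([start])
        while queue:
            cy, cx = queue.popleft()
            group.append((cy, cx))
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    nb = (cy + dy, cx + dx)
                    if (nb in region and nb not in seen
                            and abs(range_image[nb[0]][nb[1]]
                                    - range_image[cy][cx]) <= dist_threshold):
                        seen.add(nb)
                        queue.append(nb)
        groups.append(group)
    return groups
```

In a one-row example with distances `[1.0, 1.1, 5.0]`, the first two points group together while the far point forms its own group, splitting target points from edge-noise points that lie behind the target.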
- step S 4 the processor 41 extracts a third region 100 B- 1 having a size that is less than or equal to a second threshold value, from the second regions 100 B of the difference binary image 103 , and outputs a judgment result which judges that each point within the third region 100 B- 1 is the edge noise.
- the judgment result of step S 4 may be displayed on the display device 44 illustrated in FIG. 2 , for example, or may be output to an external apparatus (not illustrated), such as another computer or the like, via the interface 45 illustrated in FIG. 2 , for example.
- FIG. 8 is a diagram for explaining an example of a process of step S 4 .
- points 100 C within a region indicated by grey halftone correspond to points forming the target 100
- points 100 D within a white region correspond to points forming the edge noise
- points within a black region correspond to points forming the background.
- by the process of step S 4 , even in the case in which the target 100 is a human being whose legs are not spread open, for example, it is possible to detect the edge noise between the legs.
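The judgment of step S 4 reduces to a size filter over the second regions; a sketch under the same assumptions as above (regions as lists of (y, x) points, `max_size` standing in for the second threshold value):

```python
def detect_edge_noise(second_regions, max_size):
    """Sketch of step S 4: a second region with at most max_size points
    (the second threshold value) is judged to consist of edge noise points;
    larger regions are kept as the target."""
    noise_points = []
    for group in second_regions:
        if len(group) <= max_size:
            noise_points.extend(group)
    return noise_points
```

The intuition from the text: the target produces one large group of close-distance points, while edge noise produces small stray groups at larger distances along the contour.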
- a noise reduction process of step S 5 may be performed after step S 4 .
- the processor 41 performs the noise reduction process to reduce the edge noise detected in step S 4 , from the current range image 102 illustrated in FIG. 9 .
- a result of the noise reduction process of step S 5 may be displayed on the display device 44 illustrated in FIG. 2 , for example, or may be output to an external apparatus (not illustrated), such as another computer or the like, via the interface 45 illustrated in FIG. 2 , for example.
- the current range image 102 illustrated in FIG. 9 is the same as the current range image 102 illustrated in FIG. 4 .
- By performing the noise reduction process, it is possible to generate a range image 104 from which the edge noise has been reduced or eliminated, as illustrated in FIG. 10 . Accordingly, the accuracy of the range image is improved.
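One way to sketch the noise reduction of step S 5 is to overwrite each detected edge noise point with the corresponding background distance value. This replacement policy is an assumption on my part; the patent text only states that the detected edge noise is reduced or eliminated from the range image.

```python
def reduce_edge_noise(current, background, noise_points):
    """Noise reduction sketch (step S 5): overwrite each detected edge noise
    point in the current range image with the background distance value,
    assuming that is an acceptable stand-in for the removed measurement."""
    cleaned = [row[:] for row in current]  # copy so the input is not mutated
    for y, x in noise_points:
        cleaned[y][x] = background[y][x]
    return cleaned
```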
- FIG. 9 is a diagram for explaining an example of a 3-dimensional image generation of a target based on a current range image that is not subjected to the noise reduction process.
- FIG. 10 is a diagram for explaining an example of a 3-dimensional image generation of the target based on a range image that is subjected to the noise reduction process.
- the whiter the pixel the closer the distance is to the corresponding range measurement point, and the darker the halftone of the pixel, the farther the distance is to the corresponding range measurement point.
- the black pixel indicates the distance to the corresponding range measurement point that is outside a measurable range.
- the computer 4 may form an example of a device or means that performs the process of step S 1 , to generate the difference binary image 103 from the background range image 101 and the current range image 102 .
- the background range image 101 is an example of a first range image that is generated in a state in which the target 100 does not exist with respect to the background and represents the distances to each of the range measurement points.
- the current range image 102 is an example of a second range image that is generated in a state in which the target 100 exists with respect to the background and represents the distances to each of the range measurement points.
- the computer 4 may form an example of a device or means that performs the process of step S 2 , to extract the first region 100 A having the size that is greater than or equal to the first threshold value, from the non-background region of the difference binary image 103 made up of the non-background points.
- the computer 4 may form an example of a device or means that performs the process of step S 3 , to group adjacent points on the current range image 102 into groups of adjacent points having close distance values, for each of the points within the first region 100 A of the difference binary image 103 , and to extract the second regions 100 B respectively corresponding to each group of adjacent points having the close distance values.
- the computer 4 may form an example of a device or means that performs the process of step S 4 , to extract the third region 100 B- 1 having the size that is less than or equal to the second threshold value, from the second regions 100 B of the difference binary image 103 , and to output the judgment result which judges that each point 100 D within the third region 100 B- 1 is the edge noise.
- the computer 4 may form an example of a device or means that performs the process of step S 5 , to perform the noise reduction process to reduce the edge noise 100 D.
- the computer 4 may also form a device or means that performs the process to generate the range image that represents the distances to the range measurement points by the pixels having the colors according to the distances.
- FIGS. 11 and 12 are flow charts for explaining, in more detail, the example of the distance measuring process in the first embodiment.
- the processor 41 of the computer 4 illustrated in FIG. 2 may execute a program stored in the memory 42 , to perform the distance measuring process illustrated in FIGS. 11 and 12 .
- step S 11 illustrated in FIG. 11 the processor 41 generates the difference binary image 103 from the background range image 101 and the current range image 102 illustrated in FIG. 4 .
- the black background corresponds to a binary value “0”
- the grey non-background corresponds to a binary value “1”.
- the result of the process of step S 11 may correspond to the result of the process of step S 1 illustrated in FIG. 3 .
- step S 12 the processor 41 judges whether the difference binary image 103 includes an unprocessed target point having a binary value “1”.
- the process advances to step S 13 when the judgment result in step S 12 is YES, and the process advances to step S 16 which will be described later when the judgment result in step S 12 is NO.
- step S 13 the processor 41 judges whether the adjacent points adjacent to the target point include an unprocessed adjacent point.
- step S 14 the processor 41 judges whether the unprocessed adjacent point has a binary value “1”.
- step S 15 the processor 41 groups the target point and the adjacent point having the binary value “1” into the same group, and the process returns to step S 12 .
- step S 17 the processor 41 judges whether the number idx is less than or equal to the number of regions. The process advances to step S 18 when the judgment result in step S 17 is YES, and the process advances to step S 21 which will be described later when the judgment result in step S 17 is NO.
- step S 18 the processor 41 judges whether the size of the region assigned the number idx is less than or equal to a threshold size.
- step S 19 the processor 41 excludes the region assigned the number idx from candidates of the first region 100 A having the size greater than or equal to the first threshold value, and the process advances to step S 20 .
- the result of the process in a case in which the judgment result in step S 17 is NO corresponds to the result of the process of step S 2 illustrated in FIG. 3 , and the first region 100 A can be extracted.
- step S 22 the processor 41 judges whether the number idx1 is less than or equal to the number of first regions 100 A. The process advances to step S 23 when the judgment result in step S 22 is YES, and the process advances to step S 31 which will be described later in conjunction with FIG. 12 when the judgment result in step S 22 is NO.
- step S 23 the processor 41 judges whether an unprocessed target point exists in the first region 100 A assigned the number idx1.
- step S 25 the processor 41 judges whether the adjacent points adjacent to the target point include an unprocessed adjacent point.
- step S 26 the processor 41 judges whether a difference between the distance value of the target point and the distance value of the unprocessed adjacent point adjacent to the target point is less than or equal to a threshold value.
- the process advances to step S 27 when the judgment result in step S 26 is YES, and the process returns to step S 23 when the judgment result in step S 26 is NO.
- step S 27 the processor 41 groups the target point, and the unprocessed adjacent point that is adjacent to the target point and has the distance value with the difference from the distance value of the target point less than or equal to the threshold value, into the same group, and the process returns to step S 23 .
- the result of the process in a case in which the judgment result in step S 22 is NO corresponds to the result of the process of step S 3 illustrated in FIG. 3 , and the second region 100 B can be extracted.
- step S 32 the processor 41 judges whether the number idx2 is less than or equal to the number of second regions 100 B.
- the process advances to step S 33 when the judgment result in step S 32 is YES, and the process advances to step S 36 when the judgment result in step S 32 is NO.
- step S 33 the processor 41 judges whether the size of the second region 100 B assigned the number idx2 is less than or equal to the second threshold value.
- step S 34 the processor 41 detects each point within the second region 100 B having the size less than or equal to the second threshold value, as the point forming the edge noise, and the process advances to step S 35 .
- the result of the process in a case in which the judgment result in step S 32 is NO corresponds to the result of the process of step S 4 illustrated in FIG. 3 .
- in step S 36 , the processor 41 outputs the judgment result indicating that each point within each third region 100 B- 1 is judged as being the edge noise, and the process ends.
- The judgment result that is output may be displayed on the display device 44 illustrated in FIG. 2, for example, or may be output to an external apparatus (not illustrated), such as another computer, via the interface 45 illustrated in FIG. 2.
- A noise reduction process similar to that of step S5 illustrated in FIG. 3 may be performed after step S36, to reduce the edge noise that is detected.
- The detected edge noise may be reduced or eliminated from the range image by performing the noise reduction process, to improve the accuracy of the range image.
- FIG. 13 is a flow chart for explaining an example of the distance measuring process in a second embodiment.
- The processor 41 of the computer 4 illustrated in FIG. 2 may execute a program stored in the memory 42 to perform the distance measuring process illustrated in FIG. 13.
- Steps S1 to S4 are the same as the corresponding steps of the first embodiment illustrated in FIG. 3.
- The points that are detected as forming the edge noise by steps S1 to S4 of the first embodiment described above are regarded as edge noise candidate points.
- Amongst the edge noise candidate points, a first point in contact with the background, or a second point in contact with an outer side of a contour line of the target having a predetermined length or longer, is judged as being a point forming the edge noise.
- The edge noise candidate points other than the first and second points are excluded from the points forming the edge noise, to further improve the edge noise detection accuracy.
- In step S41 illustrated in FIG. 13, the processor 41 regards the points that are detected as forming the edge noise by the distance measuring process of the first embodiment described above as the edge noise candidate points, and judges the edge noise candidate points in contact with the background.
- In step S42, the processor 41 judges the edge noise candidate points in contact with the outer side of the contour line of the target having the predetermined length or longer.
- In step S43, the processor 41 excludes the edge noise candidate points other than those judged in steps S41 and S42 from the points forming the edge noise. Further, in step S43, the processor 41 outputs a judgment result indicating that the edge noise candidate points judged in steps S41 and S42 are the points forming the edge noise.
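The narrowing of steps S41 to S43 keeps only the candidate points that touch the background or lie on the outer side of a sufficiently long contour line. A minimal sketch, assuming the background and outer-side positions have already been computed as boolean grids (the names are illustrative):

```python
def narrow_candidates(candidates, background, outer_of_long_contour):
    """Keep a candidate point only when it touches the background (step S41)
    or lies on the outer side of a contour of at least the predetermined
    length (step S42); all other candidates are excluded (step S43)."""
    h, w = len(candidates), len(candidates[0])

    def touches(y, x, mask):
        # True when any 8-neighbour of (y, x) is set in `mask`.
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                ny, nx = y + dy, x + dx
                if (dy or dx) and 0 <= ny < h and 0 <= nx < w and mask[ny][nx]:
                    return True
        return False

    return [[candidates[y][x] and (touches(y, x, background)
                                   or outer_of_long_contour[y][x])
             for x in range(w)] for y in range(h)]
```

Candidates that survive this filter are the points judged to form the edge noise; the rest correspond to the excluded points.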
- The judgment result output in step S43 may be displayed on the display device 44 illustrated in FIG. 2, for example, or may be output to an external apparatus (not illustrated), such as another computer, via the interface 45 illustrated in FIG. 2.
- A noise reduction process similar to that of step S5 illustrated in FIG. 3 may be performed after step S43, to reduce the edge noise that is detected.
- FIG. 14 is a diagram for explaining an example of the processes of steps S41 to S43.
- Each point (corresponding to one or more pixels) is illustrated by an enlarged square.
- A point 200C indicated by a grey halftone corresponds to the target 200,
- a white point 200D corresponds to the edge noise, and
- a black point corresponds to the background.
- In this example, the target 200 is a human being.
- A point 200D-1 is detected as the edge noise due to the effects of clothing or accessories worn by the target 200 and having a relatively low reflectivity.
- The point 200D-1 deteriorates the accuracy of a recognition process or the like that is subsequently performed using a difference binary image 203, because the point group density at that part of the target 200 decreases. For this reason, in order to prevent the point 200D-1 from being detected as the edge noise or eliminated, it is desirable to replace the point 200D-1 by a point that does not form the edge noise, or by a point that is at the same distance as the target 200. On the other hand, a point 200D-2 is located between the legs of the target 200 and is detected as the edge noise. If the point 200D-2 between the legs of the target 200 were replaced by a point at the same distance as the target 200, the legs would appear connected in the range image. For this reason, it is desirable to judge the point 200D-2 as the edge noise, or to eliminate the point 200D-2 as the edge noise.
- The processes of steps S41 to S43 obtain, from the difference binary image 203, a difference binary image 210 in which the edge noise is narrowed down.
- Points 200E indicated by leftwardly declining hatching patterns are in contact with the background, and points 200F indicated by lattice patterns are in contact with the outer side of the contour line of the target 200 having the predetermined length or longer.
- FIG. 15 is a diagram for explaining an example of the process of step S41.
- The point 200C indicated by the grey halftone corresponds to a point forming the target 200,
- the white point 200D corresponds to a point forming the edge noise, and
- the black point corresponds to a point forming the background.
- The processor 41 regards each point 200D that is detected from the difference binary image 203 as forming the edge noise by the distance measuring process as an edge noise candidate point, and generates the difference binary image 204 in which the points 200E are judged as the edge noise candidate points in contact with the background.
- The edge noise candidate points 200E in contact with the background are indicated by a symbol "x" in FIG. 15.
- FIGS. 16 and 17 are diagrams for explaining an example of judging an inner side and an outer side of a contour line.
- The target point is indicated by a black square, and numbers "0" to "7" are assigned clockwise to the adjacent points adjacent to the target point.
- The Nth point on the contour line is taken as a reference. When viewed from this Nth reference point, the (N−1)th point is assigned the number "7", and the (N+1)th point is assigned the number "3".
- The points assigned the numbers "0" to "3" clockwise correspond to the outer side of the contour line, and
- the points assigned the numbers "4" to "6" clockwise correspond to the inner side of the contour line.
- Hence, these points adjacent to the outer side of the contour line of the target 200 can be judged as being the edge noise candidate points.
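Assuming a clockwise chain-code traversal of the contour, the renumbering of FIGS. 16 and 17 can be sketched as follows. The starting offset of the clockwise neighbour list is an assumption, since the patent fixes only the relative numbering (previous point is "7", next point is "3").

```python
# Clockwise 8-neighbour offsets (dy, dx); the first entry is an arbitrary
# choice, since only the clockwise order matters for the renumbering.
NEIGHBOURS = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
              (1, 1), (1, 0), (1, -1), (0, -1)]

def outer_side_offsets(prev_point, point):
    """Renumber the neighbourhood of `point` clockwise so that the previous
    contour point (the (N-1)th point) receives the number 7, and return the
    offsets numbered 0 to 3, which lie on the outer side of the contour."""
    dy, dx = prev_point[0] - point[0], prev_point[1] - point[1]
    k = NEIGHBOURS.index((dy, dx))   # current index of the previous point
    start = (k + 1) % 8              # offset that receives the number 0
    return [NEIGHBOURS[(start + i) % 8] for i in range(4)]
```

For example, when the contour arrives from the left, the outer-side neighbours are the three points above the target point and the point to its right.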
- FIG. 18 is a diagram for explaining an example of the process of step S42.
- The processor 41 judges, in step S42, the edge noise candidate points 200F in contact with the outer side of the contour line of the target 200 having the predetermined length or longer, by the method described above in conjunction with FIGS. 16 and 17.
- The edge noise candidate points 200F in contact with the outer side of the contour line of the target 200 having the predetermined length or longer are also indicated by a symbol "x" in FIG. 18.
- FIG. 19 is a diagram for explaining an example of the process of step S43.
- The processor 41 excludes, in step S43, the edge noise candidate points 200G other than the edge noise candidate points 200E and 200F judged in steps S41 and S42 from the points 200D forming the edge noise.
- The processor 41 outputs a judgment result including a difference binary image 206 in which the edge noise candidate points 200E and 200F judged in steps S41 and S42 are judged as being the points 200D forming the edge noise.
- The edge noise candidate points 200G excluded from the points forming the edge noise are indicated by a symbol "o" in FIG. 19.
- FIG. 20 is a diagram for explaining an example of the noise reduction process.
- The process may end after step S43.
- Alternatively, the noise reduction process of step S5 may be performed after step S43.
- In this case, the processor 41 performs the noise reduction process to reduce the edge noise detected in step S43. More particularly, the processor 41 obtains a difference binary image 207 illustrated in FIG. 20 by eliminating the points 200D forming the edge noise from the difference binary image 206 generated in step S43, and outputs the result of the noise reduction process.
- The eliminated points 200D forming the edge noise correspond to the edge noise candidate points 200E and 200F judged in steps S41 and S42, that is, the edge noise candidate points excluding the points 200G.
- The result of the noise reduction process of step S5 may be displayed on the display device 44 illustrated in FIG. 2, for example, or may be output to an external apparatus (not illustrated), such as another computer, via the interface 45 illustrated in FIG. 2.
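The elimination described above can be sketched as a masking operation over the difference binary image, where 1 marks a foreground point and 0 a background point; this representation is an assumption for illustration, not prescribed by the patent.

```python
def reduce_edge_noise(image, noise):
    """Noise reduction of step S5: eliminate the points judged as edge
    noise from the difference binary image (1 = point present, 0 = absent).
    `noise` is a boolean grid of the same shape marking edge noise points."""
    return [[0 if noise[y][x] else image[y][x]
             for x in range(len(image[0]))]
            for y in range(len(image))]
```

Applying this to the difference binary image 206 with the narrowed edge noise mask yields the cleaned image corresponding to 207.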
- The points 200G indicated by the symbol "o" in FIG. 19 are the points detected as the edge noise due to the effects of the clothing or the accessories worn by the target 200 and having the relatively low reflectivity.
- The points 200G deteriorate the accuracy of the recognition process or the like that is subsequently performed using the difference binary image 206, because the point group density at that part of the target 200 decreases. For this reason, in order to prevent the points 200G from being detected as the edge noise or eliminated, this example replaces the points 200G by points that do not form the edge noise, or by points that are at the same distance as the target 200.
- On the other hand, the points 200E are located between the legs of the target 200 and are detected as the edge noise.
- This example judges the points 200E as being the edge noise, or eliminates the points 200E as the edge noise.
- The computer 4 may form an example of a device or means that performs the processes of steps S41 to S43. That is, the computer 4 regards the points forming the edge noise 200D obtained by the processes of steps S1 to S4 as the edge noise candidate points, judges, amongst the edge noise candidate points, the first points 200E in contact with the background or the second points 200F in contact with the outer side of the contour line of the target 200 having the predetermined length or longer as the points forming the edge noise, and narrows down the edge noise 200D by excluding the points 200G other than the first and second points 200E and 200F.
- The computer 4 may form an example of a device or means that performs a process to replace each excluded edge noise candidate point 200G by a point having the same distance value as the target 200.
- The computer 4 may form an example of a device or means that performs a noise reduction process to reduce or eliminate the edge noise 200D from the current range image 102.
- FIG. 21 is a flow chart for explaining, in more detail, the example of the distance measuring process in the second embodiment.
- The processor 41 of the computer 4 illustrated in FIG. 2 may execute a program stored in the memory 42 to perform the distance measuring process illustrated in FIG. 21.
- In step S51, the processor 41 regards the points that are detected as forming the edge noise by steps S1 to S4 of the distance measuring process of the first embodiment described above as the edge noise candidate points.
- In step S52, the processor 41 judges whether an unprocessed edge noise candidate point exists. The process advances to step S53 when the judgment result in step S52 is YES, and the process advances to step S55, which will be described later, when the judgment result in step S52 is NO.
- In step S53, the processor 41 judges whether an adjacent point adjacent to the unprocessed edge noise candidate point is included in the background. The process advances to step S54 when the judgment result in step S53 is YES, and the process returns to step S52 when the judgment result in step S53 is NO. In step S54, the processor 41 judges the edge noise candidate point as being an edge noise point, and the process returns to step S52.
- The result of the process in the case in which the judgment result in step S52 is NO corresponds to the result of the process of step S41 illustrated in FIG. 13, and the difference binary image 204 illustrated in FIG. 15 can be generated.
- In step S55, the processor 41 extracts the contour line of the target 200, from amongst the points 200C forming the target 200, from the difference binary image 203 that includes the points forming the edge noise detected by the processes of steps S1 to S4 of the first embodiment.
- In step S57, the processor 41 judges whether the number idx3 is less than or equal to the number of contour lines. The process advances to step S58 when the judgment result in step S57 is YES, and the process advances to step S64, which will be described later, when the judgment result in step S57 is NO.
- In step S58, the processor 41 judges whether the length of the contour line assigned the number idx3 is greater than or equal to the predetermined length.
- The process advances to step S59 when the judgment result in step S58 is YES, and the process advances to step S61, which will be described later, when the judgment result in step S58 is NO.
- In step S60, the processor 41 judges whether the number idx4 of the point on the contour line is less than or equal to the length of the contour line assigned the number idx3.
- The process advances to step S62 when the judgment result in step S60 is YES, and the process advances to step S61 when the judgment result in step S60 is NO.
- In step S62, the processor 41 judges whether the adjacent points on the outer side of the point that is assigned the number idx4 and is on the contour line assigned the number idx3 include an edge noise candidate point.
- The process advances to step S63 when the judgment result in step S62 is YES, and the process advances to step S64 when the judgment result in step S62 is NO.
- In step S63, the processor 41 judges that the edge noise candidate point judged in step S62 forms the edge noise, that is, is an edge noise point, and the process advances to step S64.
- The result of the process in the case in which the judgment result in step S57 is NO corresponds to the result of the process of step S42 illustrated in FIG. 13, and the difference binary image 205 illustrated in FIG. 18 can be generated.
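The loop of steps S57 to S63 can be sketched as follows, assuming the contour lines have already been extracted as ordered point lists. The helper `outer_offsets(prev, cur)` is assumed to return the outer-side neighbour offsets in the manner of FIGS. 16 and 17; all names are illustrative.

```python
def mark_outer_side_candidates(contours, candidates, min_length, outer_offsets):
    """Steps S57 to S63: for every contour line of at least `min_length`
    points, judge each edge noise candidate point lying on the outer side
    of a contour point as an edge noise point.  `contours` holds (y, x)
    point lists in traversal order and `candidates` is a set of (y, x)
    tuples."""
    noise = set()
    for contour in contours:
        if len(contour) < min_length:
            continue  # shorter contours are skipped (judgment NO in step S58)
        for i, (y, x) in enumerate(contour):
            prev = contour[i - 1]  # wraps to the last point of a closed contour
            for dy, dx in outer_offsets(prev, (y, x)):
                if (y + dy, x + dx) in candidates:
                    noise.add((y + dy, x + dx))
    return noise
```

The returned set corresponds to the candidate points 200F judged to be edge noise points in contact with the outer side of a sufficiently long contour line.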
- In step S65, the processor 41 judges that all remaining edge noise candidate points are not edge noise points, and the process ends.
- The result of the process of step S65 corresponds to the result of the process of step S43 illustrated in FIG. 13, and the difference binary image 206 illustrated in FIG. 19 can be generated.
- A noise reduction process similar to that of step S5 illustrated in FIG. 13 may be performed after step S65, to reduce the edge noise that is detected.
- The detected edge noise may be reduced or eliminated from the range image by performing the noise reduction process, to improve the accuracy of the range image.
- As described above, amongst the edge noise candidate points, the point in contact with the background, or the point in contact with the outer side of the contour line of the target having the predetermined length or longer, is judged as being the point forming the edge noise.
- The other points amongst the edge noise candidate points are excluded from the points forming the edge noise, to further improve the edge noise detection accuracy.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2016202719A JP6862750B2 (en) | 2016-10-14 | 2016-10-14 | Distance measuring device, distance measuring method and program |
| JP2016-202719 | 2016-10-14 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20180106598A1 US20180106598A1 (en) | 2018-04-19 |
| US10648795B2 true US10648795B2 (en) | 2020-05-12 |
Family
ID=59858632
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US15/700,346 Expired - Fee Related US10648795B2 (en) | 2016-10-14 | 2017-09-11 | Distance measuring apparatus and distance measuring method |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US10648795B2 (en) |
| EP (1) | EP3309583B1 (en) |
| JP (1) | JP6862750B2 (en) |
Families Citing this family (11)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP6814053B2 (en) * | 2017-01-19 | 2021-01-13 | 株式会社日立エルジーデータストレージ | Object position detector |
| US10515463B2 (en) * | 2018-04-20 | 2019-12-24 | Sony Corporation | Object segmentation in a sequence of color image frames by background image and background depth correction |
| CN108828613B (en) * | 2018-05-24 | 2022-07-08 | 北京石油化工学院 | Method for removing noise and hazardous chemical storage laser scanning device |
| CN112771534A (en) * | 2018-06-29 | 2021-05-07 | 物流及供应链多元技术研发中心有限公司 | Electronic device and object counting method |
| JP7167708B2 (en) * | 2018-12-28 | 2022-11-09 | 株式会社アイシン | Distance information generator |
| CN111562592B (en) * | 2019-01-29 | 2024-04-05 | 北京京东乾石科技有限公司 | Object edge detection method, device and storage medium |
| CN113611029B (en) * | 2021-07-22 | 2023-09-26 | 北京京东乾石科技有限公司 | Monitoring method, device, access control system and readable storage medium |
| CN113640771B (en) * | 2021-08-19 | 2024-05-28 | 深圳市中联讯科技有限公司 | Noise filtering method and terminal |
| CN114545437B (en) * | 2022-01-27 | 2025-10-21 | 华南师范大学 | Human intrusion detection method and security system based on laser radar |
| CN116358430B (en) * | 2023-03-15 | 2025-08-15 | 南京极目机器人科技有限公司 | Plant spacing measurement method and system |
| CN118392056B (en) * | 2024-04-24 | 2025-03-28 | 醴陵市浦口电瓷制造有限公司 | Fixed pull rope detection system for insulating rigid ladders installed with power supply network lines |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP4543904B2 (en) * | 2004-11-30 | 2010-09-15 | パナソニック電工株式会社 | Distance image sensor |
| JP4727388B2 (en) * | 2005-10-28 | 2011-07-20 | セコム株式会社 | Intrusion detection device |
| JP5904069B2 (en) * | 2012-09-13 | 2016-04-13 | オムロン株式会社 | Image processing apparatus, object detection method, and object detection program |
| JP6212400B2 (en) * | 2014-01-29 | 2017-10-11 | セコム株式会社 | Object detection sensor and program |
- 2016-10-14 JP JP2016202719A patent/JP6862750B2/en not_active Expired - Fee Related
- 2017-09-11 US US15/700,346 patent/US10648795B2/en not_active Expired - Fee Related
- 2017-09-13 EP EP17190812.2A patent/EP3309583B1/en not_active Not-in-force
Patent Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2001014474A (en) * | 1999-07-01 | 2001-01-19 | Matsushita Electric Ind Co Ltd | People extraction method |
| JP2005242488A (en) | 2004-02-24 | 2005-09-08 | Matsushita Electric Works Ltd | Object detecting device, object detecting method and program |
| US20120263353A1 (en) * | 2009-12-25 | 2012-10-18 | Honda Motor Co., Ltd. | Image processing apparatus, image processing method, computer program, and movable body |
| JP2014035302A (en) | 2012-08-09 | 2014-02-24 | Panasonic Corp | Object detection device, object detection method and program |
| JP2014137762A (en) * | 2013-01-18 | 2014-07-28 | Sanyo Electric Co Ltd | Object detector |
| GB2528810A (en) | 2013-03-29 | 2016-02-03 | Denso Corp | Device and method for monitoring moving objects in detection area |
| US20160040979A1 (en) * | 2013-03-29 | 2016-02-11 | Denso Wave Incorporated | Apparatus and method for monitoring moving objects in sensing area |
| US20180172830A1 (en) * | 2015-06-24 | 2018-06-21 | Konica Minolta, Inc. | Distance image processing device, distance image processing method, distance image processing program, and recording medium |
Non-Patent Citations (1)
| Title |
|---|
| Extended European Search Report dated Mar. 1, 2018 for corresponding European Patent Application No. 17190812.2, 8 pages. |
Also Published As
| Publication number | Publication date |
|---|---|
| JP6862750B2 (en) | 2021-04-21 |
| US20180106598A1 (en) | 2018-04-19 |
| EP3309583B1 (en) | 2021-10-13 |
| JP2018063221A (en) | 2018-04-19 |
| EP3309583A1 (en) | 2018-04-18 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US10648795B2 (en) | Distance measuring apparatus and distance measuring method | |
| US10140722B2 (en) | Distance measurement apparatus, distance measurement method, and non-transitory computer-readable storage medium | |
| EP2927710B1 (en) | Ranging system, information processing method and program thereof | |
| US20180172830A1 (en) | Distance image processing device, distance image processing method, distance image processing program, and recording medium | |
| US12175699B2 (en) | Confidence determination of 3D point cloud data acquired by LIDAR sensor | |
| JP6772639B2 (en) | Parallax calculation system, mobiles and programs | |
| EP4310549A1 (en) | Sensing system | |
| US12306302B2 (en) | Image processing device, control program, and image processing method | |
| JP2021043838A (en) | Information processing equipment, control methods, programs and storage media | |
| WO2020187677A1 (en) | Lidar device for a vehicle and method for increasing the detection range of a corresponding lidar device | |
| US20240288555A1 (en) | Lidar data processing method | |
| JP2019007744A (en) | Object sensing device, program, and object sensing system | |
| CN119758301A (en) | Radar control method and device, terminal equipment and storage medium | |
| CN114578316B (en) | Method, device and equipment for determining ghost points in point cloud and storage medium | |
| KR101896477B1 (en) | Method and Apparatus for Scanning LiDAR | |
| US20240230399A9 (en) | Information processing apparatus, information processing method, and sensing system | |
| KR20220165678A (en) | Apparatus for LIDAR | |
| JP2023094057A (en) | Information processor, control method, program, and storage medium | |
| EP4400870A2 (en) | Image processing apparatus and image processing system | |
| CN111316119A (en) | Radar simulation method and device | |
| EP4455723A1 (en) | Information processing device, control method, program, and storage medium | |
| US20250208269A1 (en) | Information processing device, information processing method, and program | |
| CN110441783B (en) | Method and device for optical distance measurement | |
| CN116848430A (en) | Distance measurement correction device, distance measurement correction method, distance measurement correction program and distance measurement device | |
| JP2025164090A (en) | Distance image device, distance image correction method, and image processing program |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: FUJITSU LIMITED, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:USHIJIMA, SATORU;REEL/FRAME:043811/0911 Effective date: 20170906 |
|
| FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
| ZAAA | Notice of allowance and fees due |
Free format text: ORIGINAL CODE: NOA |
|
| ZAAB | Notice of allowance mailed |
Free format text: ORIGINAL CODE: MN/=. |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
| FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
| FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20240512 |