WO2020019345A1 - Coherent-light-based obstacle avoidance device and method - Google Patents

Coherent-light-based obstacle avoidance device and method

Info

Publication number
WO2020019345A1
Authority
WO
WIPO (PCT)
Prior art keywords
detected object
category
images
speckle
processing device
Prior art date
Application number
PCT/CN2018/097659
Other languages
English (en)
Chinese (zh)
Inventor
王星泽
舒远
Original Assignee
合刃科技(深圳)有限公司
Priority date
Filing date
Publication date
Application filed by 合刃科技(深圳)有限公司 filed Critical 合刃科技(深圳)有限公司
Priority to CN201880067096.XA priority Critical patent/CN111213069B/zh
Priority to PCT/CN2018/097659 priority patent/WO2020019345A1/fr
Publication of WO2020019345A1 publication Critical patent/WO2020019345A1/fr

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes

Definitions

  • the present application relates to the field of electronic technology, and in particular, to an obstacle avoidance device and method based on coherent light.
  • existing auxiliary guide devices for ensuring the safety of blind users are mainly based on a single type of sensor hardware, such as ultrasonic or infrared sensors, to detect obstacle information, and then prompt the user through sound or vibration to avoid collision danger.
  • the ultrasonic sensor emits ultrasonic waves; when the ultrasonic waves encounter obstacles in the air, they are reflected back and converted into electrical signals by the ultrasonic receiving probe. Such a sensor can only calculate the distance from the emission point to the obstacle by measuring the time difference between transmitting and receiving the sound wave and multiplying it by the propagation speed.
  • when laser and infrared sensors work, they emit laser pulses or infrared light aimed at obstacles.
  • the solution of judging the position and distance of obstacles from the time difference between the signals transmitted and received by the sensor has a single function and poor accuracy, and cannot comprehensively detect environmental information.
  • the embodiments of the present application provide an obstacle avoidance device and method based on coherent light.
  • the embodiments of the present application can comprehensively detect objects in the surrounding environment, improve the accuracy of object recognition, and further improve the accuracy of blind navigation and security monitoring systems based on the object recognition.
  • an embodiment of the present application provides an obstacle avoidance device based on coherent light, including:
  • an ultrasonic sensor, a coherent light sensor, a high-speed camera connected to the coherent light sensor, and a processing device connected to both the ultrasonic sensor and the high-speed camera;
  • the ultrasonic sensor is configured to obtain a distance d between the detected object and the obstacle avoidance device, and transmit the distance d to the processing device;
  • the coherent light sensor is configured to emit coherent light to the detected object, receive reflected coherent light, and transmit the reflected coherent light to the high-speed camera;
  • the high-speed camera is configured to obtain n vibration speckle images based on the reflected coherent light; a vibration speckle image is a speckle image of the detected object vibrating under the stimulation of the ultrasonic waves, and n is an integer greater than 1;
  • the processing device is configured to obtain a vibration waveform signal of the detected object according to the n speckle images of the vibration; and determine a category of the detected object according to the vibration waveform signal.
  • the processing device acquiring the vibration waveform signal of the detected object according to the n speckle images of the vibration includes:
  • the processing device acquires M speckle contrast images according to the n vibration speckle images, where M is an integer greater than 1 and less than or equal to n;
  • the processing device performs a clustering operation on the M speckle contrast images according to a K-means clustering algorithm to obtain k cluster images, where k is an integer greater than 1 and less than M;
  • the processing device acquires a vibration waveform signal of the detected object according to the k cluster images.
  • the processing device determining the type of the detected object according to the vibration waveform signal includes:
  • the processing device performs a fast Fourier transform on the vibration waveform signal to obtain a vibration spectrum of the detected object;
  • the processing device inputs the vibration frequency spectrum, the distance d, the ultrasonic frequency spectrum and the measurement environment information into an object recognition model and performs a neural network operation to obtain a calculation result;
  • the obstacle avoidance device further includes: an environmental information detection module and a reminder device connected to the processing device;
  • the environmental information detection module is configured to detect and obtain information of the measurement environment, and the information of the measurement environment includes a temperature value, a wind speed value, and a humidity value;
  • the reminding device is used to remind a user of the distance d to the detected object and the category of the detected object.
  • before determining the category of the detected object according to the vibration waveform signal, the processing device is further configured to:
  • obtain a correspondence table between calculation results and object categories, where the correspondence table includes a calculation result range and a corresponding object category, and the upper and lower limits of the calculation result range are respectively the maximum and minimum of a set of calculation results corresponding to that object category.
  • an embodiment of the present application provides a method for avoiding obstacles based on coherent light, including:
  • n vibration speckle images of the detected object under ultrasonic stimulation are acquired based on the coherent light; n is an integer greater than 1.
  • the user is reminded of the distance d from the detected object and the type of the detected object.
  • acquiring the vibration waveform signal of the detected object according to the n speckle images of the vibration includes:
  • M speckle contrast images are acquired according to the n vibration speckle images, where M is an integer greater than 1 and less than or equal to n;
  • k speckle contrast images are arbitrarily selected from the M speckle contrast images as k initial cluster centers, where k is an integer greater than 1 and less than M;
  • for each speckle contrast image p among the remaining M-k speckle contrast images (that is, the speckle contrast images other than the k serving as initial cluster centers), a distance value to each of the k initial cluster centers is calculated, yielding k distance values;
  • the initial cluster center corresponding to the smallest of the k distance values is selected as the cluster to which the speckle contrast image p belongs; in this way, k cluster images are obtained;
  • according to the k cluster images, the vibration waveform signal of the detected object is obtained.
  • determining the category of the detected object according to the vibration waveform signal includes:
  • the method further includes:
  • information of the measurement environment is detected and acquired, where the information of the measurement environment includes a temperature value, a wind speed value, and a humidity value;
  • before the determining of the category of the detected object according to the vibration waveform signal, the method further includes:
  • obtaining a correspondence table between calculation results and object categories, where the correspondence table includes a calculation result range and a corresponding object category, and the upper and lower limits of the calculation result range are respectively the maximum and minimum of a set of calculation results corresponding to that object category.
  • an embodiment of the present application further provides a computer storage medium, wherein the computer storage medium may store a program, and when the program is executed, it performs some or all of the steps of the method described in the second aspect above.
  • in the embodiments of the present application, the distance d between the detected object and the obstacle avoidance device is obtained by ultrasonic waves; n vibration speckle images of the detected object under ultrasonic stimulation are acquired based on coherent light; the vibration waveform signal of the detected object is obtained according to the n vibration speckle images; the category of the detected object is determined based on the vibration waveform signal; and the user is reminded of the distance d to the detected object and the category of the detected object.
  • the embodiments of the present application can comprehensively detect objects in the surrounding environment, improve the accuracy of object recognition, and further improve the accuracy of blind navigation, security monitoring level, and car navigation system based on the object recognition.
  • FIG. 1 is a schematic diagram of an application scenario of an obstacle avoidance device based on coherent light according to an embodiment of the present application
  • FIG. 2 is a vibration speckle image of the detected object
  • FIG. 3 is a schematic diagram of a vibration waveform obtained from a speckle image
  • FIG. 4 is a vibration waveform diagram and a corresponding spectrum diagram according to an embodiment of the present application.
  • FIG. 5A to FIG. 5E are vibration spectrum diagrams of different detected objects according to an embodiment of the present application.
  • FIG. 6 is a schematic diagram of an object recognition model according to an embodiment of the present application.
  • FIG. 7 is a schematic diagram of an obstacle avoidance process based on coherent light according to an embodiment of the present application.
  • FIG. 1 is a schematic diagram of an application scenario of an obstacle avoidance device based on coherent light according to an embodiment of the present application.
  • the application scenario includes: a detected object 10 and an obstacle avoidance device 20.
  • the detected object 10 may be a pedestrian, glass, tree, metal, plastic, or other object.
  • the obstacle avoidance device 20 includes an ultrasonic sensor 201, a coherent light sensor 202, a high-speed camera 203 connected to the coherent light sensor 202, a processing device 204 connected to both the ultrasonic sensor 201 and the high-speed camera 203, and a reminder device 205 connected to the processing device 204.
  • the above-mentioned ultrasonic sensor 201 includes an ultrasonic transmitter 2012, an ultrasonic receiver 2011, and a first processor 2013 connected to the ultrasonic transmitter 2012 and the ultrasonic receiver 2011.
  • the ultrasonic transmitter 2012 transmits ultrasonic waves to the object 10 to be detected, and the ultrasonic receiver 2011 receives ultrasonic waves reflected by the object 10 to be detected.
  • the first processor 2013 determines the flight time of the ultrasonic wave from the time at which the ultrasonic transmitter 2012 transmits the ultrasonic wave and the time at which the ultrasonic receiver 2011 receives the reflected ultrasonic wave, and then determines the distance d between the detected object 10 and the obstacle avoidance device 20 based on the ultrasonic flight time and the ultrasonic speed.
  • the first processor 2013 includes a communication unit, and the first processor 2013 sends the distance d to the processing device 204 through the communication unit.
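The time-of-flight computation performed by the first processor 2013 can be sketched as follows. This is a minimal illustration; the function and variable names and the fixed speed of sound are assumptions for the example, not names from the disclosure.

```python
# Minimal sketch of the ultrasonic time-of-flight distance estimate.
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C; varies with temperature

def distance_to_object(t_emit: float, t_receive: float,
                       speed: float = SPEED_OF_SOUND) -> float:
    """Distance d from the round-trip flight time of the ultrasonic pulse."""
    time_of_flight = t_receive - t_emit
    # The pulse travels to the detected object and back, so halve the path.
    return speed * time_of_flight / 2.0
```

For example, a 20 ms round trip corresponds to d = 343 × 0.02 / 2 = 3.43 m. In practice, the temperature value from the environmental information module could refine the assumed speed of sound.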
  • the above-mentioned coherent light sensor 202 includes a coherent light transmitter 2022, a coherent light receiver 2021, a first lens 2023, and a second lens 2024.
  • the coherent light emitter 2022 generates incident coherent light, and the incident coherent light is irradiated onto the object 10 to be detected through the second lens.
  • the coherent light receiver receives coherent light reflected by the detected object 10 and passing through the first lens 2023, that is, reflected coherent light.
  • the coherent light receiver 2021 transmits the reflected coherent light to the high-speed camera 203 after receiving the reflected coherent light.
  • the high-speed camera 203 obtains n speckle images of the detected object according to the received coherent light, and transmits the n speckle images to the processing device 204, where n is an integer greater than 1.
  • a vibration speckle image of the detected object 10 can be obtained from the reflected or scattered coherent light, the coherent light being emitted in the same direction as the above-mentioned ultrasonic emission direction; then, according to the n vibration speckle images of the detected object 10 acquired at equal intervals, the vibration waveform signal of the detected object 10 vibrating under the action of the ultrasonic wave is obtained.
  • because the vibration waveform signals of different detected objects differ, the category of the detected object can be determined according to its vibration waveform signal.
  • the above-mentioned ultrasonic sensor 201 and the coherent light sensor 202 start working at the same time, and their emission angles are the same; that is, while the ultrasonic sensor 201 transmits ultrasonic waves to the detected object 10 and receives the reflected ultrasonic waves, the coherent light sensor 202 emits coherent light to the detected object 10 and receives the reflected coherent light.
  • the time at which the coherent light transmitter emits coherent light lags behind the time at which the ultrasonic transmitter emits ultrasound: after the ultrasound emitted by the ultrasonic transmitter has been reflected by the detected object, the above-mentioned coherent light transmitter starts to emit coherent light.
  • the processing device 204 obtains n vibration speckle images of the detected object 10 according to the above method, where the time interval between any two adjacent speckle images among the n vibration speckle images is Δt.
  • the speckle image includes a plurality of spots, and the collection times of the four speckle images from left to right are t, t + ⁇ t, t + 2 ⁇ t, and t + 3 ⁇ t, respectively.
  • the speckle position in the speckle image acquired at different times will change.
  • the processing device 204 determines, from the n vibration speckle images, how the spots in the speckle images are displaced over time within the period nΔt, and further obtains the vibration waveform signal of the detected object based on this displacement information.
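One way to turn the spot displacements described above into waveform samples is frame-to-frame cross-correlation. The sketch below is an assumption about how such tracking could be implemented (the disclosure does not specify the algorithm), using integer-pixel FFT cross-correlation against the first frame:

```python
import numpy as np

def frame_shift(ref: np.ndarray, cur: np.ndarray) -> tuple:
    """Integer-pixel shift of `cur` relative to `ref` via FFT cross-correlation."""
    corr = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(cur))).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Unwrap circular peak positions into signed shifts.
    return tuple(int(p - s) if p > s // 2 else int(p)
                 for p, s in zip(peak, corr.shape))

def vibration_waveform(frames: list) -> np.ndarray:
    """One displacement magnitude per frame, sampled every Δt seconds."""
    return np.array([np.hypot(*frame_shift(frames[0], f)) for f in frames[1:]])
```

Sub-pixel refinement (e.g. peak interpolation) would likely be needed for real speckle data; this sketch only recovers whole-pixel motion.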
  • the processing device 204 acquires n speckle images according to the foregoing method, and the speckle images are speckle images of vibrations of the detected object.
  • the time interval between the collection times of any two adjacent speckle images in the above n speckle images is Δt, as shown in FIG. 3A.
  • the processing device 204 obtains M speckle contrast images according to the n speckle images, as shown in FIG. 3B, where M is an integer greater than 1 and less than or equal to n.
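The speckle contrast computation is not detailed in the text; a common definition, used here as an assumption, is the local ratio K = σ/μ of intensity standard deviation to mean. A minimal sketch over non-overlapping blocks:

```python
import numpy as np

def speckle_contrast_map(frame: np.ndarray, win: int = 4) -> np.ndarray:
    """Local speckle contrast K = std/mean over non-overlapping win x win blocks."""
    h = (frame.shape[0] // win) * win
    w = (frame.shape[1] // win) * win
    blocks = frame[:h, :w].reshape(h // win, win, w // win, win)
    mean = blocks.mean(axis=(1, 3))
    std = blocks.std(axis=(1, 3))
    # Avoid division by zero in dark regions of the frame.
    return np.divide(std, mean, out=np.zeros_like(std), where=mean > 0)
```

A uniform (non-speckled) region yields K = 0, while fully developed speckle approaches K = 1; motion of the detected object blurs the speckle and lowers the local contrast.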
  • the processing device 204 performs a clustering operation on the M speckle contrast images according to the K-means clustering algorithm to obtain k cluster images, where k is an integer greater than 1 and less than M.
  • the processing device 204 arbitrarily selects k speckle contrast images from the M speckle contrast images as k initial cluster centers; then, for each speckle contrast image p among the remaining M-k speckle contrast images, the processing device 204 calculates a distance value to each of the k initial cluster centers, obtaining k distance values, each corresponding to one initial cluster center.
  • the processing device 204 selects the initial cluster center corresponding to the smallest of the k distance values as the cluster to which the speckle contrast image p belongs.
  • the above processing device 204 obtains k cluster images, as shown in FIG. 3C.
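A minimal sketch of the K-means step described above, clustering the M speckle contrast images (flattened to vectors) around k of them chosen as initial centers. The Euclidean distance and the iteration count are assumptions, since the text only specifies the initialization and the nearest-center assignment:

```python
import numpy as np

def kmeans_cluster(maps, k, iters=10):
    """Cluster M speckle-contrast maps into k groups: the first k maps seed
    the initial cluster centers, and each remaining map joins the center
    with the smallest (Euclidean) distance value."""
    X = np.stack([m.ravel() for m in maps]).astype(float)
    centers = X[:k].copy()  # k images taken as initial cluster centers
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)  # nearest-center assignment
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers
```

Each resulting cluster image (center) averages the contrast maps assigned to it, which is one plausible reading of the "k cluster images" above.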
  • the processing device 204 obtains a vibration waveform signal of the detected object based on the k cluster images, as shown in FIG. 4A.
  • after obtaining the vibration waveform signal of the detected object 10, the processing device 204 performs a fast Fourier transform on the vibration waveform signal to obtain the vibration spectrum of the detected object, as shown in FIG. 4B.
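The FFT step can be sketched directly with NumPy; the sampling interval dt corresponds to the inter-frame interval Δt above, and the names here are illustrative:

```python
import numpy as np

def vibration_spectrum(waveform: np.ndarray, dt: float):
    """Amplitude spectrum of a real waveform sampled every dt seconds."""
    spec = np.abs(np.fft.rfft(waveform)) / len(waveform)
    freqs = np.fft.rfftfreq(len(waveform), d=dt)
    return freqs, spec
```

For instance, a waveform dominated by a 50 Hz component sampled at 1 kHz produces a spectrum peaking at 50 Hz; the usable band is limited to half the frame rate of the high-speed camera.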
  • the vibration spectrum contains rich information, such as the ultrasonic signals emitted by the ultrasonic transmitter, the structure and material properties of the object to be detected, and the movement of the obstacle avoidance device itself.
  • FIG. 5A is a vibration spectrum diagram when the detected object is a tree
  • FIG. 5B is a vibration spectrum diagram when the detected object is a pedestrian
  • FIG. 5C is the vibration spectrum diagram when the detected object is glass
  • FIG. 5D is the vibration spectrum diagram when the detected object is metal
  • FIG. 5E is the vibration spectrum diagram when the detected object is plastic.
  • the obstacle avoidance device 20 further includes an environmental information detection module, which is configured to detect information of a current measurement environment. After the environmental information detection module acquires the information of the measurement environment, the measurement environment information is transmitted to the processing device 204.
  • the environmental information measurement module includes sensors such as a temperature sensor, a wind speed sensor, and a humidity sensor, and the measurement environment information includes temperature, wind speed, and humidity values.
  • after the processing device 204 obtains the vibration spectrum of the detected object 10, it inputs the vibration spectrum of the detected object, the distance d between the detected object and the obstacle avoidance device 20, the ultrasonic spectrum corresponding to the ultrasonic wave generated by the ultrasonic transmitter, and the information of the above-mentioned measurement environment into an object recognition model for a neural network operation.
  • the object recognition model is a neural network model.
  • the object recognition model is used to perform a neural network operation on the information of the vibration spectrum, the ultrasonic spectrum, the distance d, and the measurement environment to obtain at least one calculation result, and each calculation result corresponds to an object type.
  • the processing device 204 can then determine the category of the detected object according to the calculation result.
  • as shown in FIG. 6, the above object recognition model includes an input layer, an intermediate layer, and an output layer; after the vibration spectrum, the ultrasonic spectrum, the distance d, and the measurement environment information are input through the input layer and processed by the intermediate layer, the output layer can output five kinds of calculation results.
  • the five kinds of calculation results include a first calculation result, a second calculation result, a third calculation result, a fourth calculation result, and a fifth calculation result, and the corresponding object categories are trees, pedestrians, glass, metal, and plastic.
  • the output layer of the object recognition model may output any one or any combination of the five calculation results; that is, the object category output by the object recognition model may be any one, or any combination, of trees, pedestrians, glass, metal, and plastic.
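The layer structure in FIG. 6 can be sketched as a single forward pass. The layer sizes, ReLU/softmax choices, and weights below are assumptions for illustration; the disclosure does not fix the network architecture:

```python
import numpy as np

CATEGORIES = ["tree", "pedestrian", "glass", "metal", "plastic"]

def classify(features, W1, b1, W2, b2):
    """Input layer -> intermediate (hidden) layer -> output layer."""
    hidden = np.maximum(0.0, features @ W1 + b1)   # ReLU intermediate layer
    logits = hidden @ W2 + b2                      # one logit per category
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()                           # softmax over the 5 outputs
    return CATEGORIES[int(probs.argmax())]
```

The feature vector would concatenate the vibration spectrum, the ultrasonic spectrum, the distance d, and the measurement environment values; trained weights would come from the training procedure described below.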
  • processing device 204 determining the type of the detected object according to the calculation result includes:
  • the processing device 204 determines an object type corresponding to the calculation result according to a correspondence table between the calculation result and the object type.
  • when the calculation result is greater than a1 and less than or equal to a2, the processing device 204 determines that the category of the detected object is a tree; when the calculation result is greater than a2 and less than or equal to a3, the processing device 204 determines that the category of the detected object is a pedestrian; when the calculation result is greater than a3 and less than or equal to a4, the processing device 204 determines that the category of the detected object is glass; when the calculation result is greater than a4 and less than or equal to a5, the processing device 204 determines that the category of the detected object is metal; when the calculation result is greater than a5 and less than or equal to a6, the processing device 204 determines that the category of the detected object is plastic.
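The range lookup can be sketched as below. The numeric thresholds stand in for a1..a6, which the disclosure leaves unspecified, so they are purely illustrative:

```python
# Illustrative stand-ins for the thresholds a1..a6 in the correspondence table.
RANGES = [
    (0.0, 0.2, "tree"),        # a1 < result <= a2
    (0.2, 0.4, "pedestrian"),  # a2 < result <= a3
    (0.4, 0.6, "glass"),       # a3 < result <= a4
    (0.6, 0.8, "metal"),       # a4 < result <= a5
    (0.8, 1.0, "plastic"),     # a5 < result <= a6
]

def category_from_result(result: float) -> str:
    """Map a calculation result to the object category whose range contains it."""
    for lower, upper, category in RANGES:
        if lower < result <= upper:
            return category
    return "unknown"  # result falls outside every calibrated range
```

In practice, each (lower, upper] pair would be the minimum and maximum of the training calculation results for that category, as described below.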
  • the processing device 204 extracts a frequency intensity distribution corresponding to a feature vector characterizing the detected object from the vibration spectrum.
  • the feature vector characterizes the material and internal structure of the detected object.
  • the processing device 204 inputs the extracted frequency intensity distribution (that is, a part of the vibration spectrum) into the object recognition model.
  • before the processing device 204 inputs the information of the vibration spectrum, the ultrasonic spectrum, the distance d, and the measurement environment into the object recognition model, the processing device 204 obtains multiple sets of training data, and each set of training data in the multiple sets corresponds to one object category.
  • for example, for training data i, the corresponding object is object O; the training data i includes the vibration spectrum and ultrasonic spectrum of the object O, the distance between the obstacle avoidance device 20 and the object O, and the measurement environment information. The processing device 204 performs a neural network operation on the multiple sets of training data to obtain the object recognition model.
  • specifically, when the processing device 204 performs the neural network operation on the multiple sets of training data to obtain the object recognition model, the multiple sets of training data are input into the object recognition model to obtain multiple sets of calculation results.
  • each set of calculation results corresponds to an object category, and each set contains at least two calculation results. From these multiple sets of calculation results, a correspondence table between calculation results and object categories is obtained; the table includes a calculation result range and the corresponding object category, where the upper and lower limits of the calculation result range are respectively the maximum and minimum of the set of calculation results corresponding to that object category.
  • alternatively, before the processing device 204 inputs the information of the vibration spectrum, the ultrasonic spectrum, the distance d, and the measurement environment into the object recognition model, the processing device 204, which further includes a communication module, sends a request message to a third-party server through the communication module; the request message is used to request the object recognition model and the correspondence table between the calculation results and the object categories.
  • the communication module of the processing device 204 receives a response message sent by the third-party server for responding to the request message, and the response message carries the object recognition model and a correspondence table between the calculation result and the object type.
  • the processing device 204 is further configured to retrain the object recognition model and update the correspondence table between the calculation result and the object category to ensure the accuracy of the object recognition model, as follows:
  • the above-mentioned retraining of the object recognition model and the update of the correspondence table between the calculation result and the object category may be performed by the third-party server.
  • after performing object recognition N times using the object recognition model and the correspondence table between the calculation results and the object categories, the processing device 204 re-sends a request message to the third-party server to request the retrained object recognition model and the updated correspondence table between the calculation results and the object categories.
  • after the processing device 204 determines the category of the detected object 10, it transmits the distance d and the category of the detected object to the reminder device 205, and the reminder device 205 sends out a voice message to inform the user in advance of the object information, including the category of the object and the distance between the object and the user.
  • alternatively, after the processing device obtains the vibration speckle images of the detected object 10, the processing device sends them to a third-party device, and the third-party device determines the category of the detected object according to the vibration speckle images; for the specific process, refer to the related description of the processing device 204 above.
  • the third-party device then sends the category of the detected object to the processing device 204.
  • the third-party device may be a smart phone, a smart watch, a smart bracelet, a notebook computer, a desktop computer, or other devices.
  • according to the above related description, the obstacle avoidance device 20 can obtain relevant information about the user's surrounding environment, including relative position information of surrounding pedestrians, trees, buildings, and vehicles, so that the user can avoid the above obstacles more flexibly.
  • the obstacle avoidance device 20 includes a rotation structure that can rotate the obstacle avoidance device through 360 degrees, thereby achieving classification and recognition of objects in the entire scene, which is suitable for panoramic security monitoring.
  • the rotation structure is fixedly connected to the ultrasonic sensor 201 and the coherent light sensor 202 in the obstacle avoidance device 20 to achieve synchronous rotation of the ultrasonic sensor 201 and the coherent light sensor 202; the rotation angle ranges from 0 to 360 degrees, achieving surveillance within a panoramic range.
  • the obstacle avoidance device 20 may be used in combination with a security system. For example, when the obstacle avoidance device 20 detects a pedestrian in a preset detection area during a preset period, such as 23:00 to 5:00, the security system sends alarm information to the security personnel to inform them that there is an abnormality in the preset detection area; the alarm information carries the position information of the preset detection area, according to which the security personnel can further check the area.
  • the obstacle avoidance device 20 may be applied to a car navigation system.
  • the obstacle avoidance device 20 may obtain the location information of surrounding objects (including pedestrians, trees, buildings, vehicles, etc.), and can plan the route in real time according to the user's destination information, the location information of the objects around the current location, and the road condition information between the current location and the destination.
  • in the embodiments of the present application, the distance d between the detected object and the obstacle avoidance device is obtained by ultrasonic waves; n vibration speckle images of the detected object under ultrasonic stimulation are acquired based on coherent light; the vibration waveform signal of the detected object is obtained according to the n vibration speckle images; the category of the detected object is determined based on the vibration waveform signal; and the user is reminded of the distance d to the detected object and the category of the detected object.
  • the embodiments of the present application have the following advantages: 1. the frequency spectrum of the vibration signal carries information about the objects themselves, enabling objects to be classified and identified.
  • 2. a non-imaging detection method is adopted: the structure and material information of the object is reflected in the speckle image. Compared with the traditional optical imaging detection method, there is no need to design very complicated illumination and imaging optics, which is especially suitable for scene recognition in dark environments, as well as for transparent or highly reflective scenes where imaging is difficult.
  • the embodiments of the present application can comprehensively detect obstacles in the surrounding environment, and improve the accuracy of blind navigation.
  • this type of scene object recognition is required in many industries, such as scene reconstruction and recognition in autonomous or assisted driving, scene monitoring in the security field, and equipment operating status monitoring in industrial production. This method addresses applications such as recognition with no or poor lighting and detection of transparent glass.
  • FIG. 7 is a schematic flowchart of a method for avoiding obstacles based on coherent light according to an embodiment of the present application. As shown in Figure 7,
  • the obstacle avoidance device obtains a distance d between the detected object and the obstacle avoidance device through ultrasonic waves.
  • the obstacle avoidance device acquires n speckle images of the vibration of the detected object under the ultrasound stimulation based on coherent light; the n is an integer greater than 1.
  • the obstacle avoidance device acquires a vibration waveform signal of the detected object according to the n speckle images of the vibration; and determines a category of the detected object according to the vibration waveform signal.
  • acquiring the vibration waveform signal of the detected object according to the n speckle images of the vibration includes:
  • obtaining M speckle contrast images from the n vibration speckle images, where M is an integer greater than 1 and less than or equal to n, and selecting k of the M speckle contrast images as initial cluster centers, where k is an integer greater than 1 and less than M;
  • for each speckle contrast image p among the remaining M−k speckle contrast images, that is, the M speckle contrast images other than the k serving as initial cluster centers, calculating a distance value to each of the k initial cluster centers to obtain k distance values;
  • assigning the speckle contrast image p to the cluster of the initial cluster center corresponding to the smallest of the k distance values; proceeding in this way yields k clusters of images;
  • from the k clusters of images, a vibration waveform signal of the detected object is obtained.
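The clustering steps above amount to a single k-means-style assignment pass. A minimal sketch in Python; the Euclidean distance metric, the random choice of the k initial centers, and all names are assumptions for illustration, since the source fixes neither the metric nor how the initial centers are chosen:

```python
import numpy as np

def cluster_speckle_contrast(images, k, seed=0):
    """Assign M speckle contrast images to k clusters in one pass.

    k images are taken as initial cluster centers; each of the remaining
    M-k images p gets k distance values (one per center) and joins the
    cluster whose center is nearest, mirroring the steps above.
    """
    M = len(images)
    rng = np.random.default_rng(seed)
    center_idx = rng.choice(M, size=k, replace=False)  # k initial centers
    centers = images[center_idx]
    clusters = {i: [int(c)] for i, c in enumerate(center_idx)}
    center_set = set(int(c) for c in center_idx)
    for p in range(M):
        if p in center_set:
            continue  # skip the k images serving as initial centers
        # k distance values for image p, one per initial cluster center
        dists = [np.linalg.norm(images[p] - c) for c in centers]
        clusters[int(np.argmin(dists))].append(p)
    return clusters
```

A full k-means would then recompute the centers and iterate; the description only states the single assignment pass, so the sketch stops there.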
  • determining the category of the detected object according to the vibration waveform signal includes:
  • the method further includes:
  • detecting and acquiring information of the measurement environment, where the information of the measurement environment includes a temperature value, a wind speed value, and a humidity value;
  • before determining the category of the detected object according to the vibration waveform signal, the processing device is further configured to:
  • obtain a correspondence table between calculation results and object categories, where the table includes calculation result ranges and their corresponding object categories, and the upper and lower limits of each calculation result range are the maximum and minimum of the set of calculation results corresponding to that object category.
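The range-based table lookup described above can be sketched as follows; the category names and numeric ranges are invented placeholders, since the source does not disclose any actual calculation results:

```python
def classify_by_range(result, table):
    """Return the object category whose calculation-result range
    contains `result`, or None if no range matches.

    Each range's upper/lower limit is the max/min of the calculation
    results previously recorded for that category."""
    for (lo, hi), category in table.items():
        if lo <= result <= hi:
            return category
    return None  # result falls outside every known range

# Illustrative placeholder table: (lower, upper) -> category.
example_table = {
    (0.0, 0.3): "vegetation",
    (0.3, 0.7): "glass",
    (0.7, 1.0): "metal",
}
```

With these placeholders, `classify_by_range(0.5, example_table)` returns the "glass" entry, and a result outside every range returns None.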
  • the obstacle avoidance device reminds the user of the distance d to the detected object and the category of the detected object.
  • An embodiment of the present application further provides a computer storage medium, where the computer storage medium may store a program, and the program, when executed, includes some or all of the steps of any one of the obstacle avoidance methods described in the foregoing method embodiments.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)
  • Traffic Control Systems (AREA)

Abstract

A coherent-light-based obstacle avoidance device and method. The method comprises: obtaining, by means of ultrasound, the distance d between a detected object (10) and an obstacle avoidance device (20) (S701); obtaining, based on coherent light, n vibration speckle images of the detected object (10) under ultrasonic stimulation (S702); obtaining, from the n vibration speckle images, a vibration waveform signal of the detected object (10), and determining, from the vibration waveform signal, the category of the detected object (10) (S703); and informing a user of the distance d to the detected object (10) and its category (S704). The method can comprehensively detect objects in the surrounding environment and improves the accuracy of object recognition, thereby improving the accuracy of object-recognition-based navigation for the blind, the level of security monitoring, and automotive navigation systems.
PCT/CN2018/097659 2018-07-27 2018-07-27 Dispositif et procédé d'évitement d'obstacle à base de lumière cohérente WO2020019345A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201880067096.XA CN111213069B (zh) 2018-07-27 2018-07-27 Coherent-light-based obstacle avoidance device and method
PCT/CN2018/097659 WO2020019345A1 (fr) 2018-07-27 2018-07-27 Dispositif et procédé d'évitement d'obstacle à base de lumière cohérente

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/097659 WO2020019345A1 (fr) 2018-07-27 2018-07-27 Dispositif et procédé d'évitement d'obstacle à base de lumière cohérente

Publications (1)

Publication Number Publication Date
WO2020019345A1 true WO2020019345A1 (fr) 2020-01-30

Family

ID=69181222

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/097659 WO2020019345A1 (fr) 2018-07-27 2018-07-27 Dispositif et procédé d'évitement d'obstacle à base de lumière cohérente

Country Status (2)

Country Link
CN (1) CN111213069B (fr)
WO (1) WO2020019345A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113970552A (zh) * 2021-09-26 2022-01-25 北京京仪仪器仪表研究总院有限公司 Non-destructive apple detection method combining laser speckle and a K-means clustering algorithm

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100259371A1 (en) * 2009-04-10 2010-10-14 Jui-Hung Wu Bird-View Parking Aid Apparatus with Ultrasonic Obstacle Marking and Method of Maneuvering the same
CN205910594U (zh) * 2016-07-07 2017-01-25 南方电网科学研究院有限责任公司 一种无人机避障装置
CN206147345U (zh) * 2016-10-18 2017-05-03 山东农业大学 一种多旋翼无人机实时测距和视觉避障系统
CN106817577A (zh) * 2016-11-23 2017-06-09 杭州视氪科技有限公司 一种基于rgb‑d相机和立体声的视障人士障碍物预警眼镜
CN107907483A (zh) * 2017-08-14 2018-04-13 西安电子科技大学 一种基于散射介质的超分辨光谱成像系统及方法

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6885968B2 (en) * 2000-05-08 2005-04-26 Automotive Technologies International, Inc. Vehicular exterior identification and monitoring system-agricultural product distribution
US6159149A (en) * 1996-03-22 2000-12-12 Lockheed Martin Corporation Ultrasonic camera
US7852462B2 (en) * 2000-05-08 2010-12-14 Automotive Technologies International, Inc. Vehicular component control methods based on blind spot monitoring
JP5672104B2 (ja) * 2011-03-28 2015-02-18 コニカミノルタ株式会社 Ultrasound-modulated light measurement device and ultrasound-modulated light measurement method
CN205607927U (zh) * 2016-05-11 2016-09-28 西安科技大学 Optical holographic measurement system for ultrasonic fields
CN106214437B (zh) * 2016-07-22 2018-05-29 杭州视氪科技有限公司 Smart assistive glasses for the blind

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100259371A1 (en) * 2009-04-10 2010-10-14 Jui-Hung Wu Bird-View Parking Aid Apparatus with Ultrasonic Obstacle Marking and Method of Maneuvering the same
CN205910594U (zh) * 2016-07-07 2017-01-25 南方电网科学研究院有限责任公司 一种无人机避障装置
CN206147345U (zh) * 2016-10-18 2017-05-03 山东农业大学 一种多旋翼无人机实时测距和视觉避障系统
CN106817577A (zh) * 2016-11-23 2017-06-09 杭州视氪科技有限公司 一种基于rgb‑d相机和立体声的视障人士障碍物预警眼镜
CN107907483A (zh) * 2017-08-14 2018-04-13 西安电子科技大学 一种基于散射介质的超分辨光谱成像系统及方法

Also Published As

Publication number Publication date
CN111213069A (zh) 2020-05-29
CN111213069B (zh) 2023-09-12

Similar Documents

Publication Publication Date Title
US10452923B2 (en) Method and apparatus for integration of detected object identifiers and semantic scene graph networks for captured visual scene behavior estimation
US10740658B2 (en) Object recognition and classification using multiple sensor modalities
CN100527167C Information recognition device, information recognition method, and alarm system
US7801332B2 (en) Controlling a system based on user behavioral signals detected from a 3D captured image stream
US20200167954A1 (en) Lidar-based multi-person pose estimation
JP2018163096A (ja) 情報処理方法および情報処理装置
US20190138268A1 (en) Sensor Fusion Service to Enhance Human Computer Interactions
Sarabia et al. Accurate estimation of airborne ultrasonic time-of-flight for overlapping echoes
Amin et al. Quality of obstacle distance measurement using ultrasonic sensor and precision of two computer vision-based obstacle detection approaches
CN110136186A Detection target matching method for mobile robot target ranging
CN107618036B Control device, control system, and control method
WO2020019345A1 (fr) Dispositif et procédé d'évitement d'obstacle à base de lumière cohérente
US20190187253A1 (en) Systems and methods for improving lidar output
CN109568093A (zh) 一种步行安全综合管理系统及方法
Jin et al. Acoussist: An acoustic assisting tool for people with visual impairments to cross uncontrolled streets
Devnath et al. A systematic study on object recognition using millimeter-wave radar
Saha et al. Visual, navigation and communication aid for visually impaired person
CN115205806A Method and apparatus for generating a target detection model, and autonomous driving vehicle
US11675878B1 (en) Auto-labeling method for multimodal safety systems
CN115131756A Target detection method and device
Gupta et al. The Architectural Design of Smart Embedded Blind Stick by Using IOT
US11995766B2 (en) Centralized tracking system with distributed fixed sensors
US20220130109A1 (en) Centralized tracking system with distributed fixed sensors
WO2022217522A1 (fr) Procédé et dispositif de détection de cible, système de détection, plateforme mobile et support de stockage
Avenash A Heterogeneous Sensor Fusion Framework for Obstacle Detection in Piloted UAVs

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18927582

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 14.06.2021)

122 Ep: pct application non-entry in european phase

Ref document number: 18927582

Country of ref document: EP

Kind code of ref document: A1