WO2018103024A1 - Intelligent blind-guide method and device - Google Patents
- Publication number
- WO2018103024A1 (PCT/CN2016/108928)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- confidence
- sensor information
- guide
- information
- intelligent
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F9/00—Methods or devices for treatment of the eyes; Devices for putting-in contact lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
- A61F9/08—Devices or methods enabling eye-patients to replace direct visual perception by another kind of perception
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H3/00—Appliances for aiding patients or disabled persons to walk about
- A61H3/06—Walking aids for blind persons
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B21/00—Teaching, or communicating with, the blind, deaf or mute
- G09B21/001—Teaching or communicating with blind persons
- G09B21/006—Teaching or communicating with blind persons using audible presentation of the information
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
Definitions
- The invention relates to the field of artificial intelligence, and in particular to an intelligent blind-guide method and device.
- The embodiments of the invention provide an intelligent blind-guide method and device, mainly intended to solve the problem that intelligent blind-guide systems require full-time manual customer-service intervention, which causes a heavy workload.
- An intelligent blind-guide method, comprising:
- obtaining a confidence level of the intelligent blind guide according to sensor information, where the confidence level indicates the reliability of the guide information generated by processing the sensor information with an artificial intelligence algorithm;
- when the confidence level of the intelligent blind guide is greater than or equal to a preset threshold, processing the sensor information with an artificial intelligence algorithm to generate guide information; when the confidence level is lower than the preset threshold, triggering manual blind guidance.
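The threshold rule above can be sketched as follows; the function name and the example threshold of 0.8 are illustrative assumptions, not values fixed by the patent:

```python
def guide_decision(confidence: float, threshold: float = 0.8) -> str:
    """Route guidance based on the blind-guide confidence.

    Returns "intelligent" when the confidence meets the preset threshold
    (the AI algorithm processes the sensor information on its own), and
    "manual" when it falls below (manual guidance is triggered).
    """
    if confidence >= threshold:
        return "intelligent"  # AI handles the guidance step
    return "manual"           # escalate to a human operator
```

A confidence at or above the threshold keeps the system fully automatic; anything below hands the case to a person.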
- An intelligent blind-guide device, comprising:
- a calculation unit configured to obtain a confidence level of the intelligent blind guide according to sensor information, where the confidence level indicates the reliability of the guide information generated by processing the sensor information with an artificial intelligence algorithm;
- a decision unit configured to process the sensor information with an artificial intelligence algorithm to generate guide information when the confidence level of the intelligent blind guide is greater than or equal to a preset threshold, and to trigger manual blind guidance when the confidence level is lower than the preset threshold.
- A computer storage medium for storing computer software instructions for use by an intelligent blind-guide device, comprising program code designed to perform the intelligent blind-guide method of the first aspect.
- A computer program product which can be directly loaded into the internal memory of a computer and contains software code; after being loaded and executed by the computer, the program implements the intelligent blind-guide method of the first aspect.
- A fifth aspect provides a server comprising a memory, a communication interface, and a processor, where the memory stores computer-executable code and the processor executes that code to control performance of the intelligent blind-guide method of the first aspect; the communication interface is used for data transmission between the server and external devices.
- The intelligent blind-guide method and device disclosed in the embodiments of the present invention obtain the confidence of the intelligent blind guide according to sensor information; when the confidence is high, highly reliable guidance can be achieved by the intelligent system alone, without manual intervention.
- This solves the problem that intelligent blind-guide systems require full-time manual customer-service intervention, with its heavy workload, and at the same time reduces task failures caused by errors of the intelligent system.
- FIG. 1 is a schematic structural diagram of an intelligent blind-guide system according to an embodiment of the present invention.
- FIG. 2 is a schematic flowchart of an intelligent blind-guide method according to an embodiment of the present invention.
- FIG. 3 is a schematic diagram of the obstacle-avoidance success confidence according to an embodiment of the present invention.
- FIG. 4 is a schematic structural diagram of an intelligent blind-guide device according to an embodiment of the present invention.
- FIG. 5 is a schematic structural diagram of another intelligent blind-guide device according to an embodiment of the present invention.
- FIG. 6 is a schematic structural diagram of still another intelligent blind-guide device according to an embodiment of the present invention.
- The embodiment of the present invention provides an intelligent blind-guide system.
- Referring to FIG. 1, the system includes a server 1 and a corresponding display device 2 located in the cloud, and a terminal 3 located at the site.
- The server 1 includes an intelligent blind-guide device 11; the terminal 3 may be a smart device integrating information collection and presentation (for example, a mobile phone or a helmet) and, depending on the actual application scenario, may include an information collection device 31 and a blind-guide execution device 32.
- The information collection device 31 may be, for example, a visual sensor or an ultrasonic sensor, used to collect sensor information;
- the blind-guide execution device 32 may be a device that performs the guiding action, such as a sound player or a tactile feedback device.
- The information collection device 31 collects the sensor information and sends it to the server 1 through a wired (for example, cable or network cable) or wireless (for example, Wi-Fi or Bluetooth) connection, where it is shown on the display device 2; the intelligent blind-guide device 11 of the server 1 generates guide information based on the sensor information.
- The guide information is transmitted to the blind-guide execution device 32 of the terminal 3, and the blind-guide execution device 32 performs the guiding action based on the guide information, for example indicating obstacle avoidance, a left turn, or a stop.
- The intelligent blind-guide method and device provided by the embodiments of the present invention evaluate the reliability of non-human-intervention intelligent guidance according to the acquired sensor information and use it when the reliability is high, thereby solving the problem that intelligent blind-guide systems require manual customer-service intervention throughout the process, with the resulting heavy workload.
- The embodiment of the invention provides an intelligent blind-guide method, as shown in FIG. 2, comprising:
- S101: Obtain a confidence level T of the intelligent blind guide according to the sensor information; the confidence level indicates the reliability of the guide information generated by processing the sensor information with an artificial intelligence algorithm.
- The sensor information includes, but is not limited to, visual, auditory, distance, and illumination information applied to the intelligent blind guide.
- When the sensors include a visual sensor and an ultrasonic sensor, visual sensor information can be acquired through the visual sensor and ultrasonic sensor information through the ultrasonic sensor.
- Confidence is a kind of probability, and different evaluation methods, such as similarity or classification probability, can be adopted according to the application scenario.
- The intelligent blind-guide process can be divided into multiple specific modules according to function.
- For example, the modules can include a positioning module, an obstacle-avoidance module, and a recognition module.
- The positioning module obtains a localization-accuracy confidence T_L according to the sensor information, the obstacle-avoidance module obtains an obstacle-avoidance success confidence T_O according to the sensor information, and the recognition module obtains an object-recognition confidence T_R according to the sensor information; that is, at least two of the following confidence levels are obtained according to the sensor information: the localization-accuracy confidence, the obstacle-avoidance success confidence, and the object-recognition confidence.
- The obtained confidence levels are then fused to obtain the confidence of the intelligent blind guide.
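One simple way to fuse the module confidences T_L, T_O, and T_R into a single value T is a weighted average; the patent does not fix a fusion formula, so the function name and the weights below are illustrative assumptions:

```python
def fuse_confidences(t_l: float, t_o: float, t_r: float,
                     weights=(0.4, 0.4, 0.2)) -> float:
    """Fuse localization (T_L), obstacle-avoidance (T_O) and
    object-recognition (T_R) confidences into one blind-guide
    confidence T via a weighted average (weights should sum to 1)."""
    w_l, w_o, w_r = weights
    return w_l * t_l + w_o * t_o + w_r * t_r
```

For example, `fuse_confidences(0.9, 0.8, 0.7)` yields 0.82, which the decision step then compares against the preset threshold.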
- Other functions performed as part of the intelligent blind guidance may have corresponding confidence levels for evaluating the reliability of the corresponding function.
- Texture quality: this indicator describes whether the scene's features are rich, whether the lighting is insufficient, whether the view is occluded, and so on.
- The specific method is to perform feature extraction on the image, such as feature-point extraction or edge extraction, to describe the quality of the image texture.
- Edge extraction yields a binary image: pixels with a gradient response take the value 1 and the remaining pixels are 0. If the number of pixels with value 1 is n and the total number of pixels (image width times height) is N, the ratio n/N describes the texture quality.
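A numpy-only sketch of this edge-density measure; the gradient threshold of 30 grey levels is an assumed parameter, not a value from the patent:

```python
import numpy as np

def texture_confidence(gray: np.ndarray, grad_thresh: float = 30.0) -> float:
    """Texture quality as the edge density n / N.

    A pixel counts toward n when its gradient magnitude exceeds the
    threshold (the binary edge image); N is width * height.
    """
    gy, gx = np.gradient(gray.astype(float))
    binary = np.hypot(gx, gy) > grad_thresh  # 1 where the gradient responds
    n = int(binary.sum())                    # edge pixels
    N = gray.size                            # total pixels (width * height)
    return n / N
```

A featureless frame scores 0 while a noisy, texture-rich frame scores close to 1; a low score suggests the vSLAM front end has little to track.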
- Tracking quantity: this indicator describes the positioning quality of the vSLAM module.
- Each vSLAM positioning result is computed from a certain number of tracked feature points, so the more feature points are tracked, the more reliable the vSLAM result is.
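The "more tracked points, more reliable" relation can be captured by a saturating map; the linear form and the saturation count of 100 points are illustrative assumptions:

```python
def tracking_confidence(n_tracked: int, n_saturate: int = 100) -> float:
    """Map the number of feature points currently tracked by vSLAM to a
    confidence in [0, 1], saturating once enough points are tracked."""
    return min(n_tracked, n_saturate) / n_saturate
```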
- Motion quality: this indicator describes the speed of the camera movement; movement that is too fast causes image blur.
- The obstacle-avoidance algorithm is based on the depth reconstruction result and analyzes the proportion of the passable area in the scene view.
- The specific steps are as follows: first, depth information is obtained by depth reconstruction; then the obstacle regions are segmented; finally, the width of the passable area relative to the entire field of view (FOV) is calculated as the obstacle-avoidance confidence, as shown in FIG. 3, where X, Y, and L are known from the result of the depth-based obstacle segmentation.
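The passable-width ratio can be sketched over a single row of the reconstructed depth map; thresholding depth directly stands in for the obstacle-segmentation step, and the clearance value is an assumed parameter:

```python
import numpy as np

def obstacle_avoidance_confidence(depth_row: np.ndarray,
                                  min_clearance: float = 2.0) -> float:
    """Fraction of the field of view that is passable.

    Columns whose reconstructed depth falls below `min_clearance`
    (metres, an assumed unit and value) are treated as obstacle; the
    remaining width over the full FOV width is the confidence.
    """
    passable = depth_row >= min_clearance
    return float(passable.sum()) / depth_row.size
```

For a depth row `[5.0, 4.5, 0.8, 0.6, 3.2, 6.0, 7.1, 2.5]`, the two near columns are obstacle, giving a confidence of 0.75.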
- Object-recognition confidence describes the result of object recognition; it belongs to the similarity indicators in pattern recognition.
- A common implementation is as follows: the camera captures image information and, after feature extraction, the feature vector x is matched against a trained feature library y.
- The matching metric can be described by the distance between x and y, such as the Euclidean, Manhattan, Chebyshev, Minkowski, standardized Euclidean, Mahalanobis, cosine, or Hamming distance.
- A deep learning module can also be used to compute the recognition similarity, with the probability values of the output nodes combined to give the confidence level.
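Turning a feature-space distance into a similarity score can be sketched as below; the 1/(1+d) mapping is one common choice, used here as an assumption rather than a formula prescribed by the patent:

```python
import numpy as np

def recognition_confidence(x: np.ndarray, y: np.ndarray) -> float:
    """Similarity between a query feature x and a library feature y.

    Uses the Euclidean distance d(x, y) and maps it into (0, 1] via
    1 / (1 + d): identical features score 1, distant features near 0.
    Any of the listed distances could be substituted for the norm.
    """
    d = float(np.linalg.norm(np.asarray(x) - np.asarray(y)))
    return 1.0 / (1.0 + d)
```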
- When the modules performing the intelligent blind guidance include a positioning module, an obstacle-avoidance module, and a recognition module, the guide information generated according to the sensor information may include any one or more of the following: the current position coordinates generated by the positioning module according to the sensor information, the current travelable direction generated by the obstacle-avoidance module according to the sensor information, and the object-recognition label generated by the recognition module according to the sensor information.
- The above guide information can be fed back to the guide helmet by wire or wirelessly to perform the guidance.
- The intelligent blind-guide method provided by the embodiment of the invention obtains the confidence of the intelligent blind guide according to the sensor information; when the confidence is high, reliable guidance can be achieved by the intelligent system alone, without manual intervention.
- This solves the problem that intelligent blind-guide systems require manual intervention throughout the process, with its heavy workload, and at the same time reduces task failures caused by errors of the intelligent system.
- In the embodiment of the present invention, the intelligent blind-guide device may be divided into functional modules according to the above method example.
- Each functional module may correspond to one function, or two or more functions may be integrated into one processing module.
- the above integrated modules can be implemented in the form of hardware or in the form of software functional modules. It should be noted that the division of the module in the embodiment of the present invention is schematic, and is only a logical function division, and the actual implementation may have another division manner.
- FIG. 4 shows a possible structural diagram of the intelligent blind-guide device involved in the foregoing embodiment.
- The intelligent blind-guide device 11 includes a calculation unit 1111 and a decision unit 1112.
- The calculation unit 1111 supports the device in performing process S101 in FIG. 2, and the decision unit 1112 supports it in performing process S102 in FIG. 2. For the details of the steps involved in the foregoing method embodiment, refer to the functional descriptions of the corresponding functional modules; they are not repeated here.
- FIG. 5 shows a possible structural diagram of the intelligent blind-guide device involved in the above embodiment.
- The intelligent blind-guide device 11 includes a processing module 1122 and a communication module 1123.
- The processing module 1122 is configured to control and manage the actions of the intelligent blind-guide device.
- For example, the processing module 1122 supports the device in performing processes S101-S102 in FIG. 2.
- The communication module 1123 supports communication of the intelligent blind-guide device with other entities, for example with the functional modules or network entities shown in the figures.
- The intelligent blind-guide device 11 may further include a storage module 1121 for storing the program code and data of the device.
- The processing module 1122 may be a processor or a controller, for example a central processing unit (CPU), a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof, and can implement or execute the various illustrative logical blocks, modules, and circuits described in connection with the present disclosure.
- the processor may also be a combination of computing functions, for example, including one or more microprocessor combinations, a combination of a DSP and a microprocessor, and the like.
- the communication module 1123 can be a transceiver, a transceiver circuit, a communication interface, or the like.
- the storage module 1121 can be a memory.
- The intelligent blind-guide device may be implemented as the server shown in FIG. 6.
- the server 1 includes a processor 1132, a transceiver 1133, a memory 1131, and a bus 1134.
- the transceiver 1133, the processor 1132, and the memory 1131 are connected to each other through a bus 1134;
- The bus 1134 may be a peripheral component interconnect (PCI) bus, an extended industry standard architecture (EISA) bus, or the like.
- the bus can be divided into an address bus, a data bus, a control bus, and the like. For ease of representation, only one thick line is shown in Figure 6, but it does not mean that there is only one bus or one type of bus.
- The steps of a method or algorithm described in connection with the present disclosure may be implemented in hardware, or may be implemented by a processor executing software instructions.
- The embodiment of the present invention further provides a storage medium, which may include the memory 1131, for storing computer software instructions for use by the intelligent blind-guide device, including the program code designed to execute the intelligent blind-guide method described above.
- The software instructions may be composed of corresponding software modules, and the software modules may be stored in a random access memory (RAM), a flash memory, a read-only memory (ROM), an erasable programmable read-only memory (EPROM), or an electrically erasable programmable read-only memory (EEPROM).
- An exemplary storage medium is coupled to the processor to enable the processor to read information from, and write information to, the storage medium.
- the storage medium can also be an integral part of the processor.
- The processor and the storage medium can be located in an ASIC; additionally, the ASIC can be located in the intelligent blind-guide device.
- The processor and the storage medium can also exist as discrete components in the intelligent blind-guide device.
- The embodiment of the present invention further provides a computer program, which can be directly loaded into the memory 1131 and contains software code; after being loaded and executed by the computer, the program implements the intelligent blind-guide method described above.
Claims (15)
- An intelligent blind-guide method, characterized by comprising: obtaining a confidence level of the intelligent blind guide according to sensor information, where the confidence level indicates the reliability of the guide information generated by processing the sensor information with an artificial intelligence algorithm; when the confidence level of the intelligent blind guide is greater than or equal to a preset threshold, processing the sensor information with an artificial intelligence algorithm to generate guide information; and when the confidence level of the intelligent blind guide is lower than the preset threshold, triggering manual blind guidance.
- The method according to claim 1, characterized in that the sensor information comprises ultrasonic sensor information and visual sensor information.
- The method according to claim 1 or 2, characterized in that generating guide information from the sensor information with an artificial intelligence algorithm comprises: generating, according to the sensor information, any one or more of the following as guide information: the current position coordinates, the current travelable direction, and the object-recognition label.
- The method according to any one of claims 1 to 3, characterized in that the confidence level of the intelligent blind guide comprises any one of the following: a localization-accuracy confidence, an obstacle-avoidance success confidence, or an object-recognition confidence.
- The method according to any one of claims 1 to 3, characterized in that obtaining the confidence level of the intelligent blind guide according to the sensor information comprises: obtaining, according to the sensor information, at least two of the following confidence levels: a localization-accuracy confidence, an obstacle-avoidance success confidence, and an object-recognition confidence; and fusing the obtained confidence levels to obtain the confidence level of the intelligent blind guide.
- The method according to claim 4 or 5, characterized in that obtaining the localization-accuracy confidence comprises: obtaining it according to one or more of the following parameters: texture quality, tracking quantity, and motion quality; obtaining the obstacle-avoidance success confidence comprises: analyzing, based on the depth reconstruction result, the proportion of the passable area in the scene view, and determining the obstacle-avoidance success confidence according to the obtained proportion; and obtaining the object-recognition confidence comprises: extracting features from the image, matching them against a pre-trained feature library, and determining the object-recognition confidence according to the degree of matching.
- An intelligent blind-guide device, characterized by comprising: a calculation unit configured to obtain a confidence level of the intelligent blind guide according to sensor information, where the confidence level indicates the reliability of the guide information generated by processing the sensor information with an artificial intelligence algorithm; and a decision unit configured to process the sensor information with an artificial intelligence algorithm to generate guide information when the confidence level of the intelligent blind guide is greater than or equal to a preset threshold, and to trigger manual blind guidance when the confidence level is lower than the preset threshold.
- The device according to claim 7, characterized in that the sensor information comprises ultrasonic sensor information and visual sensor information.
- The device according to claim 7 or 8, characterized in that the decision unit is specifically configured to generate, according to the sensor information, any one or more of the following as guide information: the current position coordinates, the current travelable direction, and the object-recognition label.
- The device according to any one of claims 7 to 9, characterized in that the confidence level of the intelligent blind guide comprises any one of the following: a localization-accuracy confidence, an obstacle-avoidance success confidence, or an object-recognition confidence.
- The device according to any one of claims 7 to 9, characterized in that the calculation unit is specifically configured to: obtain, according to the sensor information, at least two of the following confidence levels: a localization-accuracy confidence, an obstacle-avoidance success confidence, and an object-recognition confidence; and fuse the obtained confidence levels to obtain the confidence level of the intelligent blind guide.
- The device according to claim 10 or 11, characterized in that the calculation unit is specifically configured to: obtain the localization-accuracy confidence according to one or more of the following parameters: texture quality, tracking quantity, and motion quality; analyze, based on the depth reconstruction result, the proportion of the passable area in the scene view, and determine the obstacle-avoidance success confidence according to the obtained proportion; and extract features from the image, match them against a pre-trained feature library, and determine the object-recognition confidence according to the degree of matching.
- A computer storage medium, characterized by storing computer software instructions for use by an intelligent blind-guide device, comprising program code designed to perform the intelligent blind-guide method according to any one of claims 1 to 6.
- A computer program product, characterized in that it can be directly loaded into the internal memory of a computer and contains software code, and after being loaded and executed by the computer, it implements the intelligent blind-guide method according to any one of claims 1 to 6.
- A server, characterized by comprising a memory, a communication interface, and a processor, where the memory is configured to store computer-executable code, the processor is configured to execute the computer-executable code to control performance of the intelligent blind-guide method according to any one of claims 1 to 6, and the communication interface is used for data transmission between the server and an external device.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201680006916.5A CN107223046A (zh) | 2016-12-07 | 2016-12-07 | 智能导盲方法和装置 |
JP2019530696A JP2020513627A (ja) | 2016-12-07 | 2016-12-07 | インテリジェント盲導方法および装置 |
PCT/CN2016/108928 WO2018103024A1 (zh) | 2016-12-07 | 2016-12-07 | 智能导盲方法和装置 |
US16/435,255 US10945888B2 (en) | 2016-12-07 | 2019-06-07 | Intelligent blind guide method and apparatus |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2016/108928 WO2018103024A1 (zh) | 2016-12-07 | 2016-12-07 | 智能导盲方法和装置 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/435,255 Continuation US10945888B2 (en) | 2016-12-07 | 2019-06-07 | Intelligent blind guide method and apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018103024A1 true WO2018103024A1 (zh) | 2018-06-14 |
Family
ID=59927628
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2016/108928 WO2018103024A1 (zh) | 2016-12-07 | 2016-12-07 | 智能导盲方法和装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US10945888B2 (zh) |
JP (1) | JP2020513627A (zh) |
CN (1) | CN107223046A (zh) |
WO (1) | WO2018103024A1 (zh) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10281983B2 (en) | 2017-09-20 | 2019-05-07 | Alex Hamid Mani | Haptic feedback device and method for providing haptic sensation based on video |
US10275083B2 (en) | 2017-09-20 | 2019-04-30 | Alex Hamid Mani | Assistive device with a refreshable haptic feedback interface |
US10503310B2 (en) * | 2017-09-20 | 2019-12-10 | Alex Hamid Mani | Assistive device for non-visually discerning a three-dimensional (3D) real-world area surrounding a user |
CN108478398B (zh) * | 2018-05-17 | 2023-12-29 | 中兴健康科技有限公司 | 人工导盲系统及导盲方法 |
CN112515928A (zh) * | 2020-11-26 | 2021-03-19 | 苏州中科先进技术研究院有限公司 | 一种智能助盲系统、方法、计算机设备及存储介质 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101514903A (zh) * | 2009-03-17 | 2009-08-26 | 北京航空航天大学 | 一种基于高精度定位的智能导盲系统 |
CN103312899A (zh) * | 2013-06-20 | 2013-09-18 | 张家港保税区润桐电子技术研发有限公司 | 一种带有导盲功能的智能手机 |
CN105591882A (zh) * | 2015-12-10 | 2016-05-18 | 北京中科汇联科技股份有限公司 | 一种智能机器人与人混合客服的方法及系统 |
CN106021403A (zh) * | 2016-05-12 | 2016-10-12 | 北京奔影网络科技有限公司 | 客服方法及装置 |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2715831B1 (fr) * | 1994-02-08 | 1996-05-31 | Farcy Rene A | Système autonome de perception tridimensionnelle de l'espace en "noir et blanc" pour non-voyants sur base de profilométrie optique active interfacée tactilement en temps réel. |
US6847892B2 (en) * | 2001-10-29 | 2005-01-25 | Digital Angel Corporation | System for localizing and sensing objects and providing alerts |
CN101518482B (zh) * | 2009-03-18 | 2011-04-27 | 东南大学 | 一种触觉图文显示装置及显示方法 |
US8180146B2 (en) * | 2009-12-22 | 2012-05-15 | The Chinese University Of Hong Kong | Method and apparatus for recognizing and localizing landmarks from an image onto a map |
CN102293709B (zh) * | 2011-06-10 | 2013-02-27 | 深圳典邦科技有限公司 | 一种智能导盲装置 |
KR20120140486A (ko) * | 2011-06-21 | 2012-12-31 | 삼성전자주식회사 | 휴대용 단말기에서 보행 안내 서비스를 제공하기 위한 장치 및 방법 |
CN102385698A (zh) * | 2011-10-26 | 2012-03-21 | 潘承志 | 基于置信度传播的图像处理方法 |
CN203328997U (zh) * | 2013-06-08 | 2013-12-11 | 王辉 | 智能导盲机器人 |
JP2015169505A (ja) * | 2014-03-06 | 2015-09-28 | 三菱電機株式会社 | 歩行者用誘導装置 |
CN105078717A (zh) * | 2014-05-19 | 2015-11-25 | 中兴通讯股份有限公司 | 一种智能导盲方法及设备 |
US9348336B2 (en) * | 2014-09-12 | 2016-05-24 | Toyota Jidosha Kabushiki Kaisha | Robot assistance for detecting, managing, and mitigating risk |
CN105686936B (zh) * | 2016-01-12 | 2017-12-29 | 浙江大学 | 一种基于rgb‐ir相机的声音编码交互系统 |
US9817395B2 (en) * | 2016-03-31 | 2017-11-14 | Toyota Jidosha Kabushiki Kaisha | Autonomous navigation of people using a robot network |
-
2016
- 2016-12-07 JP JP2019530696A patent/JP2020513627A/ja active Pending
- 2016-12-07 WO PCT/CN2016/108928 patent/WO2018103024A1/zh active Application Filing
- 2016-12-07 CN CN201680006916.5A patent/CN107223046A/zh active Pending
-
2019
- 2019-06-07 US US16/435,255 patent/US10945888B2/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101514903A (zh) * | 2009-03-17 | 2009-08-26 | 北京航空航天大学 | 一种基于高精度定位的智能导盲系统 |
CN103312899A (zh) * | 2013-06-20 | 2013-09-18 | 张家港保税区润桐电子技术研发有限公司 | 一种带有导盲功能的智能手机 |
CN105591882A (zh) * | 2015-12-10 | 2016-05-18 | 北京中科汇联科技股份有限公司 | 一种智能机器人与人混合客服的方法及系统 |
CN106021403A (zh) * | 2016-05-12 | 2016-10-12 | 北京奔影网络科技有限公司 | 客服方法及装置 |
Also Published As
Publication number | Publication date |
---|---|
US10945888B2 (en) | 2021-03-16 |
JP2020513627A (ja) | 2020-05-14 |
CN107223046A (zh) | 2017-09-29 |
US20190290493A1 (en) | 2019-09-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3627180B1 (en) | Sensor calibration method and device, computer device, medium, and vehicle | |
US10945888B2 (en) | Intelligent blind guide method and apparatus | |
CN108845574B (zh) | 目标识别与追踪方法、装置、设备及介质 | |
WO2022083402A1 (zh) | 障碍物检测方法、装置、计算机设备和存储介质 | |
CN108960163B (zh) | 手势识别方法、装置、设备和存储介质 | |
JP6794436B2 (ja) | 非障害物エリア検出のためのシステムおよび方法 | |
CN106952303B (zh) | 车距检测方法、装置和系统 | |
JP2023508590A (ja) | モバイルの拡張現実におけるきめ細かいレベルの視覚認識 | |
WO2021051601A1 (zh) | 利用Mask R-CNN选择检测框的方法及系统、电子装置及存储介质 | |
US20200209880A1 (en) | Obstacle detection method and apparatus and robot using the same | |
WO2018103023A1 (zh) | 人机混合决策方法和装置 | |
WO2021031954A1 (zh) | 对象数量确定方法、装置、存储介质与电子设备 | |
EP3121791A1 (en) | Method and system for tracking objects | |
CN111126209B (zh) | 车道线检测方法及相关设备 | |
CN113420682A (zh) | 车路协同中目标检测方法、装置和路侧设备 | |
CN111382637A (zh) | 行人检测跟踪方法、装置、终端设备及介质 | |
WO2021098573A1 (zh) | 手部姿态估计方法、装置、设备以及计算机存储介质 | |
KR20210061839A (ko) | 전자 장치 및 그 제어 방법 | |
US11080562B1 (en) | Key point recognition with uncertainty measurement | |
CN114519853A (zh) | 一种基于多模态融合的三维目标检测方法及系统 | |
CN114972492A (zh) | 一种基于鸟瞰图的位姿确定方法、设备和计算机存储介质 | |
CN117058421A (zh) | 基于多头模型的图像检测关键点方法、系统、平台及介质 | |
CN110689556A (zh) | 跟踪方法、装置及智能设备 | |
US11314968B2 (en) | Information processing apparatus, control method, and program | |
CN113516013B (zh) | 目标检测方法、装置、电子设备、路侧设备和云控平台 |
Legal Events
- 121: EP: the EPO has been informed by WIPO that EP was designated in this application (ref document number 16923611; country of ref document: EP; kind code: A1)
- ENP: Entry into the national phase (ref document number 2019530696; country of ref document: JP; kind code: A)
- NENP: Non-entry into the national phase (ref country code: DE)
- 32PN: EP: public notification in the EP bulletin as the address of the addressee cannot be established (free format text: noting of loss of rights pursuant to Rule 112(1) EPC (EPO Form 1205A dated 28/10/2019))
- 122: EP: PCT application non-entry in European phase (ref document number 16923611; country of ref document: EP; kind code: A1)