WO2020014832A1 - Map loading method and device, electronic device, and readable storage medium

Map loading method and device, electronic device, and readable storage medium

Info

Publication number
WO2020014832A1
WO2020014832A1 (PCT/CN2018/095824)
Authority
WO
WIPO (PCT)
Prior art keywords
environment
map
image data
period
loading
Prior art date
Application number
PCT/CN2018/095824
Other languages
English (en)
Chinese (zh)
Inventor
易万鑫
廉士国
林义闽
Original Assignee
深圳前海达闼云端智能科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳前海达闼云端智能科技有限公司 filed Critical 深圳前海达闼云端智能科技有限公司
Priority to CN201880001292.7A priority Critical patent/CN109074408B/zh
Priority to PCT/CN2018/095824 priority patent/WO2020014832A1/fr
Publication of WO2020014832A1 publication Critical patent/WO2020014832A1/fr

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804Creation or updating of map data
    • G01C21/3833Creation or updating of map data characterised by the source of data
    • G01C21/3837Data obtained from a single source

Definitions

  • the present application relates to the field of computer vision, and in particular, to a method, an apparatus, an electronic device, and a readable storage medium for loading a map.
  • When an intelligent robot or a driverless vehicle is to complete simple or complex tasks in an unknown environment, it needs to know the map information of that environment. By acquiring information about the unknown environment, a map of the unknown environment is established so that the intelligent robot or the unmanned vehicle can localize itself. Only successful mapping and positioning can guarantee the robot's navigation and other functions.
  • VSLAM (Visual Simultaneous Localization and Mapping) is such a technique: it builds a map of the environment from camera images while simultaneously localizing within it.
  • A technical problem to be solved in some embodiments of the present application is how to accurately select a positioning map when the current ambient light changes, thereby improving the success rate and accuracy of positioning based on images captured in real time.
  • An embodiment of the present application provides a method for loading a map, including: obtaining current image data of an environment; calculating first lighting information currently corresponding to the environment according to the current image data of the environment; determining, according to the first lighting information, the time period in which the environment is currently located; and determining the positioning map to be loaded according to the current time period of the environment and the correspondence between time periods and positioning maps, and loading the positioning map.
  • An embodiment of the present application further provides a device for loading a map, including: an obtaining module, a first determining module, a second determining module, and a map loading module. The obtaining module is used to obtain the current image data of the environment; the first determining module is used to calculate the first lighting information currently corresponding to the environment according to the current image data of the environment; the second determining module is used to determine the time period in which the environment is currently located according to the first lighting information; and the map loading module is used to determine the positioning map to be loaded according to the current time period of the environment and the correspondence between time periods and positioning maps, and to load the positioning map.
  • An embodiment of the present application further provides an electronic device, including: at least one processor; and a memory communicatively connected to the at least one processor; wherein the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to execute the above-mentioned method for loading a map.
  • An embodiment of the present application further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the foregoing method for loading a map.
  • In the embodiments of the present application, the current first lighting information of the environment is used to determine the time period in which the environment is currently located, so that the positioning map corresponding to that time period can be determined. Because changes in the environment's lighting correspond to the time period the environment is in, there is also a correspondence between positioning maps and time periods. Therefore, the current first lighting information of the environment can accurately determine the positioning map corresponding to it, so that even if the environment's lighting changes, the positioning map can still be determined accurately. This ensures that the difference between the lighting of the loaded positioning map and the lighting of the environment in the current period is small or even zero, which improves the success rate and accuracy of positioning with images captured in real time.
  • FIG. 1 is a specific flowchart of a map loading method in a first embodiment of the present application
  • FIG. 2 is a schematic flowchart of a specific process of constructing a positioning map of an environment, determining a mapping relationship between the positioning map of the environment and a time period, and determining a correspondence relationship between lighting information of the environment and a time period in the first embodiment of the present application;
  • FIG. 3 is a specific flowchart of a map loading method in a second embodiment of the present application.
  • FIG. 4 is a schematic flowchart of a specific process of constructing a positioning map of an environment and determining a correspondence between the positioning map of the environment and a time period in the second embodiment of the present application;
  • FIG. 5 is a schematic structural diagram of a map loading device in a third embodiment of the present application.
  • FIG. 6 is a schematic structural diagram of an electronic device in a fourth embodiment of the present application.
  • the first embodiment of the present application relates to a method for loading a map.
  • the method for loading a map may be applicable to electronic devices that use VSLAM technology to build a map, such as unmanned vehicles, intelligent robots, and the like.
  • The specific process of the map loading method is shown in Figure 1:
  • Step 101 Acquire current image data of the environment.
  • the image data of the environment at the current moment can be obtained by using a sensor.
  • the image data of the environment at the current moment can be obtained by a camera.
  • The image data includes M images, where M is an integer greater than 1; shooting multiple images makes the captured content cover a wider area of the environment. For example, if the environment is an outdoor gymnasium, an image of the gymnasium can be taken every 1 meter; if 5 images of the gymnasium are obtained in the current period (assuming 5 images are captured per period), those 5 images constitute the current image data of the gymnasium.
  • A wide-angle lens may be adopted to increase the shooting range of the camera so that each captured image contains richer content; the type of camera is not limited in this embodiment.
  • Step 2011 Image data of the environment is collected in each period of N cycles, where N is an integer greater than 0.
  • A cycle may be, for example, one day, and a cycle can be divided into m periods, where m is an integer greater than 1. The N cycles may be N consecutive cycles or N non-consecutive cycles; this embodiment is described taking N consecutive cycles as an example.
  • Image data of the environment is collected in the same period of each of N consecutive cycles, where N is an integer greater than 0; in each period, M images of the environment are collected at preset distance intervals, where M is an integer greater than 1.
  • the light intensity of the environment changes with time. For example, if the environment to be collected is a road in a park, the light intensity of the road during the day is greater than the light intensity of the road at night.
  • A day can be divided into m periods, where m is an integer greater than 1; for example, when m is 2, a day is divided into two periods, a daytime period and an evening period. The image data of the environment collected in the same period of N consecutive cycles may be, for example, image data of the park road collected during the daytime period and image data of the park road collected during the night period.
  • Collecting image data of the environment in the same period of N consecutive cycles improves the accuracy of the correspondence between the period and the positioning map of the environment, and the accuracy of the correspondence between the period and the lighting information.
  • M images of the environment are collected at preset distance intervals in each period. For example, when collecting image data of a stadium during the daytime period, an image of the stadium can be collected every 1 meter. M can be determined according to the size of the stadium and the field of view of the camera. Of course, to improve the accuracy of the map to be constructed, the preset collection distance can be reduced.
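  • For illustration only (this code is not part of the original disclosure), the collection scheme just described can be sketched as follows, assuming a camera object with a hypothetical capture() method, a day divided into m named periods, and M images collected per period in each of N cycles; the container layout and names are assumptions.

```python
from collections import defaultdict

def collect_dataset(camera, periods, n_cycles, images_per_period):
    """Collect M images of the environment in every period of each of N cycles.

    periods: ordered names of the m periods of a cycle, e.g. ["daytime", "evening"].
    Returns {period_name: [[M images of cycle 1], ..., [M images of cycle N]]}.
    """
    dataset = defaultdict(list)
    for cycle in range(n_cycles):            # N consecutive cycles (e.g. days)
        for period in periods:               # m periods per cycle
            # e.g. one image captured every 1 meter of travel
            images = [camera.capture() for _ in range(images_per_period)]
            dataset[period].append(images)
    return dataset
```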
  • Step 2012 According to the image data collected at each time period, a positioning map of the environment is constructed and the corresponding relationship between the time period and the positioning map of the environment is determined.
  • the VSLAM technology is used to construct a positioning map of the environment while collecting the image data of the environment, and when the positioning map of the environment is constructed, the correspondence between the positioning map and the time period is established.
  • For example, if the image data of the gymnasium is collected in period t1, the positioning map constructed from it using VSLAM technology uses "t1" as the identifier of the constructed positioning map, thereby establishing the correspondence between the time period and the positioning map.
  • Other ways can also be used to establish the correspondence between the positioning map of the environment and the time period, and the form of this correspondence can also vary; the possibilities are not enumerated one by one in this embodiment.
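  • For illustration only (not part of the original disclosure), the correspondence between time periods and positioning maps described above could be represented as a dictionary keyed by the period identifier (for example "t1"); the VSLAM map construction itself is delegated to a hypothetical build_vslam_map function.

```python
def build_maps_by_period(dataset, build_vslam_map):
    """Build one positioning map per period and key it by the period identifier."""
    maps_by_period = {}
    for period, cycles in dataset.items():
        # flatten the N cycles of M images collected for this period
        all_images = [img for cycle_images in cycles for img in cycle_images]
        maps_by_period[period] = build_vslam_map(all_images)  # map construction is delegated
    return maps_by_period
```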
  • Step 2013 The second illumination information corresponding to the environment in different periods is calculated according to the image data collected at each period.
  • the illumination information may be a brightness value or a grayscale value of the acquired image.
  • the brightness value is used as the illumination information.
  • The process of calculating the second illumination information corresponding to the environment in a period is: according to the image data collected in that period of each of the N cycles, determine the third illumination information corresponding to the environment in that period of each cycle; calculate the average of the N values of third illumination information for the same period across the N cycles; and use this average as the second illumination information corresponding to the environment in that period.
  • The image data collected in a given period of one cycle is processed as follows: the average of the fourth illumination information corresponding to each of the M images in that period is calculated, and this average is used as the third illumination information corresponding to the environment in that period of that cycle.
  • The fourth illumination information corresponding to an image is calculated from the total number of pixels contained in the image and the illumination information of each pixel. In Formula 2.1, f(x, y) represents the illumination information of the pixel at (x, y), that is, its pixel value; B is the total number of pixels contained in the image; and δ represents a non-zero positive number approaching zero (for example 0.001), used to prevent the logarithmic term from tending to negative infinity. A form of Formula 2.1 consistent with these definitions is the log-average h = (1/B) Σ_{x,y} log(δ + f(x, y)), with which the fourth illumination information corresponding to the image P_ij can be calculated.
  • h_ij represents the illumination information of the environment in period j of the cycle (day) with serial number i, that is, the third illumination information corresponding to the environment in one period of one cycle.
  • H_j represents the second illumination information corresponding to the environment in the same period j; consistent with the averaging described above, Formula 2.2 can be written as H_j = (1/N) Σ_{i=1}^{N} h_ij.
  • the second illumination information corresponding to the environment in other periods can be calculated and obtained.
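  • For illustration only (this code does not appear in the original disclosure), the following sketch computes the fourth, third, and second illumination information described in step 2013, assuming grayscale images given as NumPy arrays and the log-average form of Formula 2.1 reconstructed above; all function names are hypothetical.

```python
import numpy as np

def fourth_illumination(image: np.ndarray, delta: float = 0.001) -> float:
    """Formula 2.1 (as reconstructed): log-average of the pixel values of one image."""
    b = image.size                                           # B: total number of pixels
    return float(np.sum(np.log(delta + image.astype(np.float64))) / b)

def third_illumination(images_in_period: list[np.ndarray]) -> float:
    """Average of the fourth illumination information over the M images of one period."""
    return float(np.mean([fourth_illumination(img) for img in images_in_period]))

def second_illumination(period_images_per_cycle: list[list[np.ndarray]]) -> float:
    """Formula 2.2: average of the third illumination information over the N cycles (same period j)."""
    h = [third_illumination(imgs) for imgs in period_images_per_cycle]  # h_ij, i = 1..N
    return float(np.mean(h))                                            # H_j
```

  • With this sketch, H_j for, say, the daytime period would be obtained by passing the daytime image lists of all N cycles to second_illumination.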
  • Step 102 Calculate first illumination information currently corresponding to the environment according to the current image data of the environment.
  • The current image data of the environment includes M images; the fourth lighting information corresponding to each image in the current image data of the environment is calculated separately, and the first lighting information currently corresponding to the environment is determined according to the fourth lighting information corresponding to each image.
  • The process of calculating the fourth illumination information corresponding to each image in the current image data of the environment is substantially the same as in step 2013; that is, the fourth illumination information of each image in the current image data of the environment can be calculated by Formula 2.1, and the details are not repeated here.
  • The first illumination information currently corresponding to the environment can then be obtained by substituting the fourth illumination information of each image in the current image data of the environment into Formula 2.2.
  • Step 103 Determine, according to the first illumination information, a time period in which the environment is currently located.
  • The first illumination information is differenced against the second illumination information corresponding to the environment in each of the different periods, and the period in which the environment is currently located is determined according to the difference results.
  • Specifically, the period whose difference result is the smallest is taken as the period in which the environment is currently located.
  • The manner of differencing the first illumination information against the second illumination information corresponding to the environment in a period is as follows: writing the current first illumination information of the environment as H_cur, a difference consistent with the above description is ΔH_j = |H_cur − H_j|, where H_j represents the second illumination information corresponding to the environment in period j; the period j with the smallest ΔH_j is selected.
  • Step 104 Determine the positioning map to be loaded and load the positioning map according to the current time period of the environment and the corresponding relationship between the time period and the positioning map.
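  • As an illustrative continuation of the same sketch (again not part of the original disclosure, and reusing the hypothetical fourth_illumination helper above), steps 102 to 104 can be expressed as: compute the current first illumination information, choose the period whose second illumination information is closest, and load the positioning map keyed by that period.

```python
def select_and_load_map(current_images, second_illum_by_period, maps_by_period):
    # Step 102: first illumination information of the current environment
    # (average of the per-image fourth illumination information).
    h_current = sum(fourth_illumination(img) for img in current_images) / len(current_images)

    # Step 103: the period j whose second illumination information H_j is closest to h_current.
    current_period = min(second_illum_by_period,
                         key=lambda j: abs(h_current - second_illum_by_period[j]))

    # Step 104: load the positioning map corresponding to the determined period.
    return current_period, maps_by_period[current_period]
```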
  • In this embodiment, the current first lighting information of the environment is used to determine the time period in which the environment is currently located, so that the positioning map corresponding to that time period can be determined. Because changes in the environment's lighting correspond to the time period the environment is in, there is also a correspondence between positioning maps and time periods. Therefore, the current first lighting information of the environment can accurately determine the positioning map corresponding to it, so that even if the environment's lighting changes, the positioning map can still be determined accurately. This ensures that the difference between the lighting of the loaded positioning map and the lighting of the environment in the current period is small or even zero, which improves the success rate and accuracy of positioning with images captured in real time.
  • the second embodiment of the present application relates to a method for loading a map.
  • The second embodiment is substantially the same as the first embodiment; the main difference is that, before acquiring the current image data of the environment, this embodiment does not need to collect image data of the environment for different time periods and calculate the second illumination information corresponding to the environment in different periods.
  • This embodiment is applied to an environment with a lighting device. The specific process is shown in FIG. 3:
  • Step 301 Acquire current image data of the environment.
  • this step is substantially the same as step 101 in the first embodiment, and details are not described herein again.
  • Step 4011 Collect image data of the environment in each period of N cycles, where N is an integer greater than 0.
  • Step 4012 According to the image data collected at each time period, a positioning map of the environment is constructed and the corresponding relationship between the time period and the positioning map of the environment is determined.
  • Steps 4011 and 4012 are substantially the same as steps 2011 and 2012 in the first embodiment, and details are not described herein again.
  • Step 302 Identify the lighting devices in the current M images of the environment.
  • the current image data of the environment includes M images, where M is an integer greater than 1.
  • the deep learning method is used to obtain feature points in the M images and identify the lighting device in each image.
  • the lighting device includes various types of lamps.
  • Step 303 Calculate the fifth lighting information of the lighting device, or of the area within a preset range of the lighting device, in any one of the current M images of the environment, and use the calculated fifth lighting information as the first lighting information currently corresponding to the environment.
  • In one way, the fifth lighting information of the lighting device in any one of the current M images of the environment is calculated; the calculation of the fifth lighting information h1 can refer to Formula 2.1, for example h1 = (1/W) Σ log(δ + f(x, y)) taken over the pixels of the lighting device, where W is the total number of pixel points belonging to the lighting device in the image and f(x, y) represents the pixel value of those points. It can be understood that other ways of calculating the lighting information may also be adopted.
  • the fifth lighting information in the preset range of the lighting device in any of the current M images of the environment is calculated.
  • The preset range of the lighting device may be an area centered on the centroid of the lighting device in the image with a preset number of pixels as the radius, or it may be another area around the lighting device. For example, a circular area centered on the centroid of a fluorescent lamp with a radius of 8 pixels can be taken as the preset range of the fluorescent lamp.
  • the calculation method of the fifth lighting information in the preset range of the lighting device is substantially the same as the calculation method of the fifth lighting information of the lighting device, and details are not described herein again.
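  • As an illustration only (not part of the original disclosure), the fifth lighting information within a circular preset range around the lighting device's centroid could be computed as in the following sketch, assuming a grayscale NumPy image, a centroid known from the detection step, and the log-average form used above; the function name is hypothetical and the default radius of 8 pixels follows the example in the text.

```python
import numpy as np

def fifth_illumination_around_device(image: np.ndarray,
                                     centroid: tuple[float, float],
                                     radius: int = 8,
                                     delta: float = 0.001) -> float:
    """Log-average pixel value inside a circular preset range around the device centroid."""
    cy, cx = centroid
    ys, xs = np.ogrid[:image.shape[0], :image.shape[1]]
    mask = (ys - cy) ** 2 + (xs - cx) ** 2 <= radius ** 2   # circular region of the given radius
    pixels = image.astype(np.float64)[mask]
    return float(np.mean(np.log(delta + pixels)))
```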
  • Step 304 Determine whether the fifth lighting information of the lighting device, or of the preset range of the lighting device, exceeds a preset lighting threshold. If yes, go to step 305; otherwise, go to step 306.
  • The different periods include a daytime period and an evening period, and the preset lighting threshold may be determined according to the lighting information of the lighting device during the daytime period and the evening period. If the fifth lighting information of the lighting device, or of the preset range of the lighting device, exceeds the preset lighting threshold, the lighting device is turned on, the lighting in that area is bright and the brightness value is high, and it can be determined that the environment is currently in the evening period. If the fifth lighting information does not exceed the preset lighting threshold, the lighting device is not turned on, the lighting in that area is dark and the brightness value is low, and it can be determined that the environment is currently in the daytime period.
  • Step 305 Determine that the environment is currently at night.
  • Step 306 Determine that the environment is currently in the daytime period.
  • Step 307 Determine the positioning map to be loaded and load the positioning map according to the current time period of the environment and the corresponding relationship between the time period and the positioning map.
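  • Continuing the illustration (not part of the original disclosure, and reusing the hypothetical helper above), steps 303 to 307 for an environment with a lighting device reduce to a single threshold comparison followed by a map lookup.

```python
def determine_period_by_lighting(image, device_centroid, lighting_threshold, maps_by_period):
    # Steps 303-304: fifth lighting information within the preset range of the device.
    h5 = fifth_illumination_around_device(image, device_centroid)

    # Steps 305-306: device on -> evening period; device off -> daytime period.
    period = "evening" if h5 > lighting_threshold else "daytime"

    # Step 307: load the positioning map corresponding to that period.
    return period, maps_by_period[period]
```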
  • In this map loading method, a lighting device is identified in the current image data of the environment, and whether the lighting device is turned on is determined according to the fifth lighting information of the identified lighting device or of the preset range around the lighting device; from this, the time period in which the environment is currently located is determined, and then the positioning map currently corresponding to the environment is determined.
  • This method only needs to calculate the fifth lighting information of the lighting device or of its preset range; the calculation and judgment are simple and the processing speed is fast, so that the positioning map of the environment can be determined quickly and accurately.
  • The third embodiment of the present application relates to a device 50 for loading a map, including: an obtaining module 501, a first determining module 502, a second determining module 503, and a map loading module 504; the specific structure of the map loading device is shown in FIG. 5:
  • The obtaining module 501 is configured to acquire current image data of the environment; the first determining module 502 is configured to calculate the first lighting information currently corresponding to the environment according to the current image data of the environment; the second determining module 503 is configured to determine, according to the first lighting information, the time period in which the environment is currently located; and the map loading module 504 is configured to determine the positioning map to be loaded and load it according to the current time period of the environment and the correspondence between time periods and positioning maps.
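  • For illustration only (not part of the original disclosure), the four modules of the device 50 could be sketched as methods of one class, reusing the hypothetical helpers above; the camera interface and names are assumptions.

```python
class MapLoadingDevice:
    """Sketch of the device of FIG. 5: the four modules expressed as methods (hypothetical)."""

    def __init__(self, maps_by_period, second_illum_by_period):
        self.maps_by_period = maps_by_period
        self.second_illum_by_period = second_illum_by_period

    def obtain(self, camera):                       # obtaining module 501
        return camera.capture_images()

    def first_illumination(self, images):           # first determining module 502
        return sum(fourth_illumination(img) for img in images) / len(images)

    def current_period(self, h_current):            # second determining module 503
        return min(self.second_illum_by_period,
                   key=lambda j: abs(h_current - self.second_illum_by_period[j]))

    def load_map(self, period):                      # map loading module 504
        return self.maps_by_period[period]
```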
  • This embodiment is an embodiment of a virtual device corresponding to the above-mentioned map loading method.
  • the technical details in the above-mentioned method embodiment are still applicable in this embodiment, and details are not described herein again.
  • a fourth embodiment of the present application relates to an electronic device, whose structure is shown in FIG. 6. It includes: at least one processor 601; and a memory 602 communicatively connected to the at least one processor 601.
  • the memory 602 stores instructions executable by at least one processor 601. The instructions are executed by the at least one processor 601, so that the at least one processor 601 can execute the above-mentioned map loading method.
  • In FIG. 6, a central processing unit (CPU) is taken as an example of the processor, and a readable and writable random access memory (RAM) is taken as an example of the memory.
  • the processor and the memory may be connected through a bus or other methods. In FIG. 6, the connection through the bus is taken as an example.
  • the memory as a non-volatile computer-readable storage medium, can be used to store non-volatile software programs, non-volatile computer executable programs, and modules.
  • For example, the positioning maps in the embodiments of the present application are stored in the memory.
  • The processor executes various functional applications and data processing of the device by running the non-volatile software programs, instructions, and modules stored in the memory, thereby implementing the method for loading a map described above.
  • the memory may include a storage program area and a storage data area, wherein the storage program area may store an operating system and an application program required for at least one function; the storage data area may store a list of options and the like.
  • the memory may include a high-speed random access memory, and may further include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage device.
  • the memory may optionally include a memory remotely set with respect to the processor, and these remote memories may be connected to an external device through a network. Examples of the above network include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
  • One or more modules are stored in the memory, and when executed by one or more processors, the method for loading a map in any of the foregoing method embodiments is executed.
  • The above product can execute the map loading method provided in the embodiments of the present application, and has the functional modules and beneficial effects corresponding to executing the method. For technical details not described in detail in this embodiment, refer to the map loading method provided in the embodiments of the present application.
  • a fifth embodiment of the present application relates to a computer-readable storage medium.
  • The readable storage medium is a computer-readable storage medium storing computer instructions that enable a computer to execute the method for loading a map involved in the first or second method embodiment above.
  • The program is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods described in the embodiments of the present application.
  • The foregoing storage media include: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Circuit Arrangement For Electric Light Sources In General (AREA)
  • Navigation (AREA)

Abstract

A map loading method and device, an electronic device, and a readable storage medium. The map loading method includes: acquiring current image data of an environment (101); calculating, according to the current image data of the environment, first lighting information currently corresponding to the environment (102); determining, according to the first lighting information, the time period in which the environment is currently located (103); and determining the positioning map to be loaded according to the time period and the correspondence between time periods and positioning maps, and loading the positioning map (104). The map loading method enables accurate selection of a positioning map when the current lighting of an environment changes, thereby improving the success rate and accuracy of positioning with images captured in real time.
PCT/CN2018/095824 2018-07-16 2018-07-16 Procédé et dispositif de chargement de carte, appareil électronique et support d'informations lisible WO2020014832A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201880001292.7A CN109074408B (zh) 2018-07-16 2018-07-16 一种地图加载的方法、装置、电子设备和可读存储介质
PCT/CN2018/095824 WO2020014832A1 (fr) 2018-07-16 2018-07-16 Procédé et dispositif de chargement de carte, appareil électronique et support d'informations lisible

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/095824 WO2020014832A1 (fr) 2018-07-16 2018-07-16 Procédé et dispositif de chargement de carte, appareil électronique et support d'informations lisible

Publications (1)

Publication Number Publication Date
WO2020014832A1 true WO2020014832A1 (fr) 2020-01-23

Family

ID=64789328

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/095824 WO2020014832A1 (fr) 2018-07-16 2018-07-16 Procédé et dispositif de chargement de carte, appareil électronique et support d'informations lisible

Country Status (2)

Country Link
CN (1) CN109074408B (fr)
WO (1) WO2020014832A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112697156A (zh) * 2020-12-04 2021-04-23 深圳市优必选科技股份有限公司 地图库建立方法、机器人、计算机设备及存储介质

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110211027A (zh) * 2019-04-30 2019-09-06 北京云迹科技有限公司 用于机器人的地图数据处理方法及装置
CN111652934B (zh) * 2020-05-12 2023-04-18 Oppo广东移动通信有限公司 定位方法及地图构建方法、装置、设备、存储介质
WO2022116156A1 (fr) * 2020-12-04 2022-06-09 深圳市优必选科技股份有限公司 Procédé de positionnement visuel, robot et support de stockage
CN112488007B (zh) * 2020-12-04 2023-10-13 深圳市优必选科技股份有限公司 视觉定位方法、装置、机器人及存储介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102596517A (zh) * 2009-07-28 2012-07-18 悠进机器人股份公司 移动机器人定位和导航控制方法及使用该方法的移动机器人
US20130010066A1 (en) * 2011-07-05 2013-01-10 Microsoft Corporation Night vision
US20160253806A1 (en) * 2015-02-27 2016-09-01 Hitachi, Ltd. Self-Localization Device and Movable Body
CN106125730A (zh) * 2016-07-10 2016-11-16 北京工业大学 一种基于鼠脑海马空间细胞的机器人导航地图构建方法
CN106537186A (zh) * 2014-11-26 2017-03-22 艾罗伯特公司 用于使用机器视觉系统执行同时定位和映射的系统和方法
CN106767750A (zh) * 2016-11-18 2017-05-31 北京光年无限科技有限公司 一种用于智能机器人的导航方法及系统

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6288721B1 (en) * 1999-07-07 2001-09-11 Litton Systems, Inc. Rendering process and method for digital map illumination intensity shading
TWI391874B (zh) * 2009-11-24 2013-04-01 Ind Tech Res Inst 地圖建置方法與裝置以及利用該地圖的定位方法
US20150187127A1 (en) * 2012-07-19 2015-07-02 Google Inc. Varying map content and styles based on time
US20150193971A1 (en) * 2014-01-03 2015-07-09 Motorola Mobility Llc Methods and Systems for Generating a Map including Sparse and Dense Mapping Information
US9971320B2 (en) * 2014-07-03 2018-05-15 Google Llc Methods and systems for adaptive triggering of data collection
WO2017022401A1 (fr) * 2015-08-04 2017-02-09 ヤマハ発動機株式会社 Système de fourniture d'informations
JP6849330B2 (ja) * 2015-08-28 2021-03-24 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America 地図生成方法、自己位置推定方法、ロボットシステム、およびロボット
US20170287196A1 (en) * 2016-04-01 2017-10-05 Microsoft Technology Licensing, Llc Generating photorealistic sky in computer generated animation
WO2018098811A1 (fr) * 2016-12-02 2018-06-07 深圳前海达闼云端智能科技有限公司 Procédé et dispositif de localisation
CN107945224A (zh) * 2017-11-07 2018-04-20 北京中科慧眼科技有限公司 基于图像检测光照条件的方法与装置

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102596517A (zh) * 2009-07-28 2012-07-18 悠进机器人股份公司 移动机器人定位和导航控制方法及使用该方法的移动机器人
US20130010066A1 (en) * 2011-07-05 2013-01-10 Microsoft Corporation Night vision
CN106537186A (zh) * 2014-11-26 2017-03-22 艾罗伯特公司 用于使用机器视觉系统执行同时定位和映射的系统和方法
US20160253806A1 (en) * 2015-02-27 2016-09-01 Hitachi, Ltd. Self-Localization Device and Movable Body
CN106125730A (zh) * 2016-07-10 2016-11-16 北京工业大学 一种基于鼠脑海马空间细胞的机器人导航地图构建方法
CN106767750A (zh) * 2016-11-18 2017-05-31 北京光年无限科技有限公司 一种用于智能机器人的导航方法及系统

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112697156A (zh) * 2020-12-04 2021-04-23 深圳市优必选科技股份有限公司 地图库建立方法、机器人、计算机设备及存储介质

Also Published As

Publication number Publication date
CN109074408B (zh) 2022-04-08
CN109074408A (zh) 2018-12-21

Similar Documents

Publication Publication Date Title
WO2020014832A1 (fr) Procédé et dispositif de chargement de carte, appareil électronique et support d'informations lisible
Toft et al. Semantic match consistency for long-term visual localization
CN109509230B (zh) 一种应用于多镜头组合式全景相机的slam方法
CN109087349B (zh) 一种单目深度估计方法、装置、终端和存储介质
WO2020052530A1 (fr) Procédé et dispositif de traitement d'image et appareil associé
CN109520500B (zh) 一种基于终端拍摄图像匹配的精确定位及街景库采集方法
CN113674416B (zh) 三维地图的构建方法、装置、电子设备及存储介质
WO2020103108A1 (fr) Procédé et dispositif de génération de sémantique, drone et support d'informations
CN110853085B (zh) 基于语义slam的建图方法和装置及电子设备
WO2021017211A1 (fr) Procédé et dispositif de positionnement de véhicule utilisant la détection visuelle, et terminal monté sur un véhicule
CN113935428A (zh) 基于图像识别的三维点云聚类识别方法及系统
US20220074743A1 (en) Aerial survey method, aircraft, and storage medium
CN111582255A (zh) 车辆超限检测方法、装置、计算机设备和存储介质
CN115512251A (zh) 基于双分支渐进式特征增强的无人机低照度目标跟踪方法
WO2023284358A1 (fr) Procédé et appareil d'étalonnage de caméra, dispositif électronique et support de stockage
CN115546681A (zh) 一种基于事件和帧的异步特征跟踪方法和系统
CN112418031A (zh) 图像识别方法和装置、存储介质及电子设备
WO2020019239A1 (fr) Procédé et dispositif de positionnement, terminal et support de stockage lisible
CN110880003B (zh) 一种图像匹配方法、装置、存储介质及汽车
CN116777966A (zh) 一种农田路面环境下车辆航向角的计算方法
CN111596090A (zh) 车辆行驶速度的测量方法、装置、车辆和介质
CN111738085A (zh) 实现自动驾驶同时定位与建图的系统构建方法及装置
JP2023523364A (ja) 視覚測位方法、装置、機器及び可読記憶媒体
CN114820748B (zh) 基于全景数据的全视角特征的智能取证方法
CN112101177A (zh) 地图构建方法、装置及运载工具

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18926831

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 14/05/2021)

122 Ep: pct application non-entry in european phase

Ref document number: 18926831

Country of ref document: EP

Kind code of ref document: A1