WO2022028110A1 - Map creation method, apparatus, device, and storage medium for a self-mobile device
- Publication number
- WO2022028110A1 (application PCT/CN2021/099723)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- plane
- area
- self
- feature
- mobile device
- Prior art date
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
- G05D1/0274—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/29—Geographical information databases
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3804—Creation or updating of map data
- G01C21/3807—Creation or updating of map data characterised by the type of data
- G01C21/383—Indoor data
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0238—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/53—Querying
- G06F16/538—Presentation of query results
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30261—Obstacle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/07—Target detection
Definitions
- One method of creating the area map for the self-mobile device includes: during movement of the self-mobile device, creating the area map of the work area by controlling the self-mobile device to move along the edge of the work area.
- a second plane connected to the first plane and a plane intersection of the first plane and the second plane are determined.
- a target map is generated based on an area map corresponding to each of the one or more work areas.
- A computer-readable storage medium is provided, in which a program is stored; the program is loaded and executed by a processor to implement the map creation method for the self-mobile device according to the first aspect.
- FIG. 2 is a flowchart of a method for creating a map from a mobile device provided by an embodiment of the present application
- FIG. 4 is a block diagram of an apparatus for creating a map from a mobile device provided by an embodiment of the present application.
- the first image capturing component 110 may be implemented as a camera, a video camera, etc., and the number of the first image capturing components 110 may be one or more, and the type and quantity of the first image capturing components 110 are not limited in this embodiment.
- the target feature is used to indicate the first plane directly above the mobile device.
- the target features include: line features and/or corner features.
- the target feature may also include the object feature of the target object, and the target object is an object disposed above the work area.
- the target object may be a chandelier, a ceiling lamp, a hanging cabinet, and/or a ceiling fan, etc. This embodiment does not limit the type of the target object.
- the object feature of the target object may be a feature vector and/or attribute information obtained through an image recognition algorithm, and this embodiment does not limit the content of the object feature.
- An area map is a map of the work area.
- The area map may be a two-dimensional map or a three-dimensional map; the type of the area map is not limited in this embodiment.
- The control component 120 may be configured to receive information related to obstacles in the work area and perform corresponding processing according to that information; the obstacle-related information includes, but is not limited to, information from the proximity sensor 130 and/or image information collected by the second image collection component 140.
- The control component 120 determines, according to the obstacle-related information, whether a first obstacle exists in the second plane; if a first obstacle exists in the second plane, it obtains edge information of the second plane and adjusts the area outline of the work area according to that edge information.
- The control component 120 identifies a second obstacle in the work area according to the obstacle-related information, obtains second position information of the second obstacle in the work area, and marks the second obstacle on the area map according to the second position information.
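As a rough illustration of the contour-adjustment step above (a hypothetical sketch, not the patent's implementation): if the area contour is simplified to an axis-aligned bounding box and the first obstacle stands against one wall, the contour edge on that side can be moved inward by the obstacle's depth obtained from the edge information.

```python
def adjust_contour(bbox, wall, inset):
    """Move one wall of an axis-aligned area contour inward by `inset`
    metres, accounting for an obstacle (e.g. a cabinet) standing against
    that wall. bbox is (xmin, ymin, xmax, ymax); `wall` is one of
    "left", "right", "bottom", "top"."""
    xmin, ymin, xmax, ymax = bbox
    if wall == "left":
        xmin += inset
    elif wall == "right":
        xmax -= inset
    elif wall == "bottom":
        ymin += inset
    elif wall == "top":
        ymax -= inset
    else:
        raise ValueError(f"unknown wall: {wall}")
    return (xmin, ymin, xmax, ymax)

# A 4 m x 3 m room with a 0.6 m deep cabinet along the left wall.
adjusted = adjust_contour((0.0, 0.0, 4.0, 3.0), "left", 0.6)
```

The bounding-box representation and wall names are assumptions made for illustration; a real area contour would be a polygon or occupancy grid.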
- the acquisition range of the second image acquisition component 140 may include a plane on which the mobile device is moved, such as the ground of the work area.
- In this embodiment, the control component 120 installed in the self-mobile device is taken as an example for description.
- The control component 120 may also be implemented in other devices, such as user terminals including a mobile phone, a tablet computer, or a computer; the implementation manner of the control component 120 is not limited in this embodiment.
- The target feature is extracted from the first image, the house outline is determined according to the target feature, and an area map of the work area is generated according to the house outline and the first position information of the self-mobile device in the work area. There is no need to move the self-mobile device along the edge of the work area to obtain the house outline, which improves the efficiency of obtaining the house outline and thereby the efficiency of map generation.
- FIG. 2 is a flowchart of a method for creating a map from a self-mobile device provided by an embodiment of the present application.
- The method is applied to the self-mobile device shown in FIG. 1; the execution subject of each step is the self-mobile device, and the control component 120 of FIG. 1 is taken as an example for description. The method includes at least the following steps:
- Step 201: acquire a first image collected by the self-mobile device in the work area in which it moves.
- the target features include line features and/or corner features.
- The self-mobile device may process the first image using an image recognition algorithm to determine whether the first image includes straight-line features and/or wall-corner features. Since a straight-line feature on the ceiling of the work area is usually the intersection between the first plane and a second plane, and a corner feature on the ceiling is usually the corner formed by the first plane and at least one second plane, extracting straight-line and/or wall-corner features from the first image makes it possible to determine the first plane and a second plane connected to the first plane.
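The geometric fact this step relies on, namely that a ceiling line feature is the intersection of the ceiling plane (first plane) and a wall plane (second plane), can be sketched with a small self-contained calculation. This is an illustrative sketch, not code from the patent; the plane representation n·x = d is an assumption.

```python
def cross(a, b):
    """Cross product of two 3-vectors."""
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

def det3(m):
    """Determinant of a 3x3 matrix given as a list of rows."""
    return (m[0][0]*(m[1][1]*m[2][2] - m[1][2]*m[2][1])
          - m[0][1]*(m[1][0]*m[2][2] - m[1][2]*m[2][0])
          + m[0][2]*(m[1][0]*m[2][1] - m[1][1]*m[2][0]))

def plane_intersection(n1, d1, n2, d2):
    """Return (point, direction) of the line where the planes
    n1·x = d1 and n2·x = d2 meet. Assumes the planes are not parallel."""
    direction = cross(n1, n2)
    # A point on the line: solve the two plane equations plus the
    # constraint direction·x = 0 (picks the point closest to the origin),
    # using Cramer's rule on the resulting 3x3 system.
    A = [n1, n2, direction]
    b = [d1, d2, 0.0]
    det = det3(A)
    point = []
    for col in range(3):
        Ac = [row[:] for row in A]
        for r in range(3):
            Ac[r][col] = b[r]
        point.append(det3(Ac) / det)
    return point, direction

# Ceiling plane z = 2.5 and wall plane x = 0 meet in the line x = 0, z = 2.5.
point, direction = plane_intersection([0.0, 0.0, 1.0], 2.5, [1.0, 0.0, 0.0], 0.0)
```

Projected into the image, such an intersection line is exactly the straight-line feature the passage describes extracting.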
- In another example, the target features include both the line and/or corner features and object features of a target object.
- the target object is the object set above the work area.
- The target object may be a chandelier, a ceiling lamp, a hanging cabinet, and/or a ceiling fan; this embodiment does not limit the type of the target object. Since the target object is usually mounted on the ceiling of the work area, the first plane of the work area can be determined from the object feature of the target object; then, based on the first plane combined with the straight-line feature and/or the corner feature, the second plane and the plane intersection line between the first plane and the second plane can be determined.
- For a first obstacle placed against a wall, such as a cabinet, a sofa, or a bed, the edge information of the side facing away from the wall can be obtained, and the area contour can be adjusted according to that edge information.
- Step 204: generate an area map of the work area based on the area outline and the first position information of the self-mobile device in the work area.
- A positioning component is installed on the self-mobile device and is used to locate the position of the self-mobile device in the work area. When the self-mobile device collects the first image, the positioning information obtained by the positioning component is read, yielding the first position information of the self-mobile device in the work area.
- The method further includes: identifying a second obstacle in the work area; acquiring second position information of the second obstacle in the work area; and marking the second obstacle on the area map according to the second position information.
- Manners of acquiring the second position information of the second obstacle in the work area include, but are not limited to, the following:
- In the first manner, a proximity sensor is installed on the self-mobile device; the proximity sensor senses an object approaching the self-mobile device within a preset range.
- The self-mobile device obtains, according to the proximity signal, the proximity distance between itself and the second obstacle, and determines the second position information of the second obstacle based on the first position information of the self-mobile device and the proximity distance.
- A second image acquisition component is also installed on the self-mobile device; when the proximity signal sent by the proximity sensor is received, the second image acquisition component is controlled to collect an image of the obstacle. Image processing is performed on the obstacle image to obtain a processing result, which may include the proximity distance between the self-mobile device and the second obstacle.
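The position computation in the first manner reduces to simple geometry: given the device's first position information (pose) and the sensed proximity distance, the obstacle's second position information follows. A minimal sketch, under the assumption (not stated in the patent) that the obstacle lies straight ahead along the device's heading:

```python
import math

def obstacle_position(robot_x, robot_y, heading_rad, proximity_dist):
    """Estimate the obstacle's position (second position information) from
    the robot's position (first position information), its heading, and
    the proximity distance reported by the sensor, assuming the obstacle
    lies directly ahead of the robot."""
    return (robot_x + proximity_dist * math.cos(heading_rad),
            robot_y + proximity_dist * math.sin(heading_rad))

# Robot at (2.0, 3.0) facing the +x axis, obstacle sensed 0.5 m ahead.
pos = obstacle_position(2.0, 3.0, 0.0, 0.5)
```

A sensor mounted off-axis would add a fixed offset to this calculation; the single-beam straight-ahead model is the simplest case.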
- In the second manner, a second image acquisition component is installed on the self-mobile device. The self-mobile device collects an environment image through the second image acquisition component and performs image processing on it; if the environment image includes an image of an obstacle, a processing result of the obstacle image is obtained, and the processing result may include the proximity distance between the self-mobile device and the second obstacle.
- By identifying the obstacles in the work area and determining the second position information of the obstacles in the area map, the self-mobile device does not need to identify the obstacles again in subsequent work and can adaptively adopt a corresponding work strategy according to the obstacle type, improving work efficiency.
- The method further includes: determining, according to the first position information, the already-worked area of the self-mobile device within the work area.
- The self-mobile device may be communicatively connected to a user terminal, and may send one or more of the area map, the target map, the obstacle identification result, the worked-area determination result, and the like to the user terminal for display.
- an acquisition module 310 configured to acquire a first image collected from the mobile device in the work area where it moves, where the first image is an image above the work area;
- a determination module 330 configured to determine the area outline of the working area based on the target feature
- The target features include straight-line features and/or wall-corner features; the determining module 330 is further configured to: determine the area outline of the work area based on the plane intersection line; and adjust the area contour of the work area according to the edge information of the second plane.
- the target feature further includes the object feature of the target object, and the target object is an object set above the work area; the determining module 330 is further configured to:
- first position information of the self-mobile device in the work area is obtained.
- a target map is generated based on an area map corresponding to each of the one or more work areas.
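The target-map generation step merges the per-work-area maps (e.g. one per room) into a single map. As a hypothetical sketch only: if each area map is modeled as a set of occupied grid cells in a shared coordinate frame (the patent does not specify the map representation), the merge is a set union.

```python
def merge_area_maps(area_maps):
    """Merge the area map of each work area into one target map.

    Each area map is modeled as a set of occupied (x, y) grid cells in a
    shared coordinate frame -- a simplifying assumption for illustration."""
    target = set()
    for cells in area_maps:
        target |= cells  # union: a cell occupied in any room stays occupied
    return target

# Two hypothetical work areas already expressed in the same frame.
living_room = {(0, 0), (0, 1), (1, 0)}
bedroom = {(5, 5), (5, 6)}
target_map = merge_area_maps([living_room, bedroom])
```

In practice each room's map would first be transformed into the common frame using the device's position information before the union.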
- FIG. 4 is a block diagram of an apparatus for creating a map from a mobile device provided by an embodiment of the present application, and the apparatus may be the self-mobile device shown in FIG. 1 .
- the apparatus includes at least a processor 401 and a memory 402 .
- Memory 402 may include one or more computer-readable storage media, which may be non-transitory. Memory 402 may also include high-speed random access memory as well as non-volatile memory, such as one or more disk storage devices or flash storage devices. In some embodiments, a non-transitory computer-readable storage medium in the memory 402 stores at least one instruction, which is executed by the processor 401 to implement the map creation method for the self-mobile device provided by the method embodiments of this application.
- the apparatus for creating a map from a mobile device may further optionally include: a peripheral device interface and at least one peripheral device.
- the processor 401, the memory 402 and the peripheral device interface can be connected through a bus or a signal line.
- Each peripheral device can be connected to the peripheral device interface through bus, signal line or circuit board.
- peripheral devices include, but are not limited to, radio frequency circuits, touch display screens, audio circuits, and power supplies.
- An embodiment of the present application further provides a computer-readable storage medium in which a program is stored; the program is loaded and executed by a processor to implement the map creation method for the self-mobile device according to the above method embodiments.
- An embodiment of the present application further provides a computer product; the computer product includes a computer-readable storage medium in which a program is stored, and the program is loaded and executed by a processor to implement the map creation method for the self-mobile device according to the above method embodiments.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Automation & Control Theory (AREA)
- Aviation & Aerospace Engineering (AREA)
- Databases & Information Systems (AREA)
- Electromagnetism (AREA)
- Data Mining & Analysis (AREA)
- General Engineering & Computer Science (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
Description
Claims (19)
- A map creation method for a self-mobile device, characterized in that the method comprises: acquiring a first image collected by the self-mobile device in a work area in which it moves, the first image being an image above the work area; extracting a target feature according to the first image, the target feature being used to indicate a first plane directly above the self-mobile device; determining an area outline of the work area based on the target feature; and generating an area map of the work area based on the area outline and first position information of the self-mobile device in the work area.
- The method according to claim 1, characterized in that the target feature comprises a straight-line feature and/or a wall-corner feature; and determining the area outline of the work area based on the target feature comprises: determining, based on the straight-line feature and/or the wall-corner feature, the first plane, a second plane connected to the first plane, and a plane intersection line of the first plane and the second plane; and determining the area outline of the work area based on the plane intersection line.
- The method according to claim 2, characterized in that the method further comprises: when it is determined that a first obstacle exists in the second plane, acquiring edge information of the second plane, the first obstacle being an object close to the second plane and close to the surface of the work area; and adjusting the area outline of the work area according to the edge information of the second plane.
- The method according to claim 3, characterized in that acquiring the edge information of the second plane comprises: acquiring the edge information of the second plane while the self-mobile device travels along the edge of the second plane; or receiving the edge information of the second plane input by a user.
- The method according to claim 2, characterized in that the target feature further comprises an object feature of a target object, the target object being an object disposed above the work area; and determining, based on the straight-line feature and/or the wall-corner feature, the first plane, the second plane connected to the first plane, and the plane intersection line of the first plane and the second plane comprises: determining the first plane based on the object feature; and determining, based on the first plane in combination with the straight-line feature and/or the wall-corner feature, the second plane connected to the first plane and the plane intersection line of the first plane and the second plane.
- The method according to claim 5, characterized in that before generating the area map of the work area based on the area outline and the first position information of the self-mobile device in the work area, the method further comprises: determining a relative positional relationship between the self-mobile device and the target object based on the first image; and obtaining the first position information of the self-mobile device in the work area based on the relative positional relationship between the self-mobile device and the target object.
- The method according to claim 6, characterized in that the relative positional relationship comprises a distance and an angle between the self-mobile device and the target object.
- The method according to any one of claims 1 to 7, characterized in that the method further comprises: generating a target map based on an area map corresponding to each of one or more work areas.
- The method according to any one of claims 1 to 7, characterized in that the method further comprises: identifying a second obstacle in the work area; acquiring second position information of the second obstacle in the work area; and marking the second obstacle in the area map according to the second position information.
- The method according to any one of claims 1 to 7, characterized in that the method further comprises: determining, according to the first position information, an already-worked area of the self-mobile device within the work area.
- A map creation apparatus for a self-mobile device, characterized in that the apparatus comprises: an acquisition module configured to acquire a first image collected by the self-mobile device in a work area in which it moves, the first image being an image above the work area; an extraction module configured to extract a target feature according to the first image, the target feature being used to indicate a first plane directly above the self-mobile device; a determining module configured to determine an area outline of the work area based on the target feature; and a map generation module configured to generate an area map of the work area based on the area outline and first position information of the self-mobile device in the work area.
- The apparatus according to claim 11, characterized in that the target feature comprises a straight-line feature and/or a wall-corner feature; and the determining module is further configured to: determine, based on the straight-line feature and/or the wall-corner feature, the first plane, a second plane connected to the first plane, and a plane intersection line of the first plane and the second plane; and determine the area outline of the work area based on the plane intersection line.
- The apparatus according to claim 12, characterized in that the map generation module is further configured to: when it is determined that a first obstacle exists in the second plane, acquire edge information of the second plane, the first obstacle being an object close to the second plane and close to the surface of the work area; and adjust the area outline of the work area according to the edge information of the second plane.
- The apparatus according to claim 12, characterized in that the target feature further comprises an object feature of a target object, the target object being an object disposed above the work area; and the determining module is further configured to: determine the first plane based on the object feature; and determine, based on the first plane in combination with the straight-line feature and/or the wall-corner feature, the second plane connected to the first plane and the plane intersection line of the first plane and the second plane.
- The apparatus according to claim 11, characterized in that the apparatus further comprises a positioning module preceding the map generation module, the positioning module being configured to: determine a relative positional relationship between the self-mobile device and the target object based on the first image; and obtain the first position information of the self-mobile device in the work area based on the relative positional relationship between the self-mobile device and the target object.
- The apparatus according to any one of claims 11 to 15, characterized in that the map generation module is further configured to: generate a target map based on an area map corresponding to each of one or more work areas.
- The apparatus according to any one of claims 11 to 15, characterized in that the apparatus further comprises a marking module configured to: identify a second obstacle in the work area; acquire second position information of the second obstacle in the work area; and mark the second obstacle in the area map according to the second position information.
- A self-mobile device, characterized in that the self-mobile device comprises: an image acquisition component configured to collect images; and a processing component configured to: acquire a first image collected by the image acquisition component in a work area, the work area being an area in which the self-mobile device moves, and the first image being an image of the area above the self-mobile device; extract a target feature according to the first image, the target feature being used to indicate a first plane of the area above, the first plane being a plane above the self-mobile device; determine an area outline of the work area based on the target feature; and generate an area map of the work area based on the area outline and first position information of the self-mobile device in the work area.
- A computer-readable storage medium, characterized in that a program is stored in the storage medium, and the program, when executed by a processor, implements the map creation method for a self-mobile device according to any one of claims 1 to 7.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020237004203A KR20230035363A (ko) | 2020-08-03 | 2021-06-11 | 자율 이동 디바이스를 위한 맵을 생성하기 위한 방법, 장치, 및 디바이스 |
JP2023505459A JP2023535782A (ja) | 2020-08-03 | 2021-06-11 | 自律移動機器の地図確立方法、装置、機器及び記憶媒体 |
EP21852338.9A EP4177790A4 (en) | 2020-08-03 | 2021-06-11 | METHOD AND APPARATUS FOR CREATING CARD FOR SELF-PROPELLED DEVICE AND DEVICE AND STORAGE MEDIUM |
US18/019,778 US20230297120A1 (en) | 2020-08-03 | 2021-06-11 | Method, apparatus, and device for creating map for self-moving device with improved map generation efficiency |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010766273.2A CN111898557B (zh) | 2020-08-03 | 2020-08-03 | 自移动设备的地图创建方法、装置、设备及存储介质 |
CN202010766273.2 | 2020-08-03 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022028110A1 true WO2022028110A1 (zh) | 2022-02-10 |
Family
ID=73183201
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/099723 WO2022028110A1 (zh) | 2020-08-03 | 2021-06-11 | 自移动设备的地图创建方法、装置、设备及存储介质 |
Country Status (6)
Country | Link |
---|---|
US (1) | US20230297120A1 (zh) |
EP (1) | EP4177790A4 (zh) |
JP (1) | JP2023535782A (zh) |
KR (1) | KR20230035363A (zh) |
CN (1) | CN111898557B (zh) |
WO (1) | WO2022028110A1 (zh) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111898557B (zh) * | 2020-08-03 | 2024-04-09 | 追觅创新科技(苏州)有限公司 | 自移动设备的地图创建方法、装置、设备及存储介质 |
CN116091607B (zh) * | 2023-04-07 | 2023-09-26 | 科大讯飞股份有限公司 | 辅助用户寻找物体的方法、装置、设备及可读存储介质 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103886107A (zh) * | 2014-04-14 | 2014-06-25 | 苏州市华天雄信息科技有限公司 | 基于天花板图像信息的机器人定位与地图构建系统 |
US20170131721A1 (en) * | 2015-11-06 | 2017-05-11 | Samsung Electronics Co., Ltd. | Robot cleaner and method for controlling the same |
CN109813285A (zh) * | 2019-01-31 | 2019-05-28 | 莱克电气股份有限公司 | 基于视觉的清洁机器人环境识别方法、存储介质、一种清洁机器人 |
CN109871013A (zh) * | 2019-01-31 | 2019-06-11 | 莱克电气股份有限公司 | 清洁机器人路径规划方法及系统、存储介质、电子设备 |
CN111898557A (zh) * | 2020-08-03 | 2020-11-06 | 追创科技(苏州)有限公司 | 自移动设备的地图创建方法、装置、设备及存储介质 |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102403504B1 (ko) * | 2015-11-26 | 2022-05-31 | 삼성전자주식회사 | 이동 로봇 및 그 제어 방법 |
JP2020533720A (ja) * | 2017-09-12 | 2020-11-19 | ロブアート ゲーエムベーハーROBART GmbH | 自律移動ロボットによる未知の環境の探索 |
US11614746B2 (en) * | 2018-01-05 | 2023-03-28 | Irobot Corporation | Mobile cleaning robot teaming and persistent mapping |
CN110353583A (zh) * | 2019-08-21 | 2019-10-22 | 追创科技(苏州)有限公司 | 扫地机器人及扫地机器人的自动控制方法 |
CN110956690A (zh) * | 2019-11-19 | 2020-04-03 | 广东博智林机器人有限公司 | 一种建筑信息模型生成方法和系统 |
- 2020
- 2020-08-03 CN CN202010766273.2A patent/CN111898557B/zh active Active
- 2021
- 2021-06-11 WO PCT/CN2021/099723 patent/WO2022028110A1/zh unknown
- 2021-06-11 JP JP2023505459A patent/JP2023535782A/ja active Pending
- 2021-06-11 KR KR1020237004203A patent/KR20230035363A/ko active Search and Examination
- 2021-06-11 EP EP21852338.9A patent/EP4177790A4/en not_active Withdrawn
- 2021-06-11 US US18/019,778 patent/US20230297120A1/en active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103886107A (zh) * | 2014-04-14 | 2014-06-25 | 苏州市华天雄信息科技有限公司 | 基于天花板图像信息的机器人定位与地图构建系统 |
US20170131721A1 (en) * | 2015-11-06 | 2017-05-11 | Samsung Electronics Co., Ltd. | Robot cleaner and method for controlling the same |
CN109813285A (zh) * | 2019-01-31 | 2019-05-28 | 莱克电气股份有限公司 | 基于视觉的清洁机器人环境识别方法、存储介质、一种清洁机器人 |
CN109871013A (zh) * | 2019-01-31 | 2019-06-11 | 莱克电气股份有限公司 | 清洁机器人路径规划方法及系统、存储介质、电子设备 |
CN111898557A (zh) * | 2020-08-03 | 2020-11-06 | 追创科技(苏州)有限公司 | 自移动设备的地图创建方法、装置、设备及存储介质 |
Non-Patent Citations (1)
Title |
---|
See also references of EP4177790A4 * |
Also Published As
Publication number | Publication date |
---|---|
JP2023535782A (ja) | 2023-08-21 |
US20230297120A1 (en) | 2023-09-21 |
CN111898557B (zh) | 2024-04-09 |
EP4177790A1 (en) | 2023-05-10 |
EP4177790A4 (en) | 2023-09-06 |
KR20230035363A (ko) | 2023-03-13 |
CN111898557A (zh) | 2020-11-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111989537B (zh) | 用于在无约束环境中检测人类视线和手势的系统和方法 | |
Song et al. | Sun rgb-d: A rgb-d scene understanding benchmark suite | |
CN108885459B (zh) | 导航方法、导航系统、移动控制系统及移动机器人 | |
Borrmann et al. | A mobile robot based system for fully automated thermal 3D mapping | |
CN109074083B (zh) | 移动控制方法、移动机器人及计算机存储介质 | |
AU2016228125B2 (en) | Structure modelling | |
WO2022078467A1 (zh) | 机器人自动回充方法、装置、机器人和存储介质 | |
Kragic et al. | Vision for robotic object manipulation in domestic settings | |
WO2022028110A1 (zh) | 自移动设备的地图创建方法、装置、设备及存储介质 | |
JP2019003621A (ja) | 建築物のレイアウトの決定 | |
WO2020248458A1 (zh) | 一种信息处理方法、装置及存储介质 | |
CN109551476A (zh) | 结合云服务系统的机器人系统 | |
CN104781849A (zh) | 单眼视觉同时定位与建图(slam)的快速初始化 | |
CN104732587A (zh) | 一种基于深度传感器的室内3d语义地图构建方法 | |
CN109933061A (zh) | 基于人工智能的机器人及控制方法 | |
Tomono | 3-d object map building using dense object models with sift-based recognition features | |
WO2020244121A1 (zh) | 一种地图信息处理方法、装置及移动设备 | |
CN108903816A (zh) | 一种清扫方法、控制器及智能清扫设备 | |
Chen et al. | Design and Implementation of AMR Robot Based on RGBD, VSLAM and SLAM | |
Xompero et al. | Multi-view shape estimation of transparent containers | |
CN112655021A (zh) | 图像处理方法、装置、电子设备和存储介质 | |
US20230184949A1 (en) | Learning-based system and method for estimating semantic maps from 2d lidar scans | |
Liu | Semantic mapping: a semantics-based approach to virtual content placement for immersive environments | |
CN105631938A (zh) | 一种图像处理方法及电子设备 | |
CN111419117B (zh) | 视觉扫地机器人返航控制方法及视觉扫地机器人 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21852338; Country of ref document: EP; Kind code of ref document: A1
| ENP | Entry into the national phase | Ref document number: 2023505459; Country of ref document: JP; Kind code of ref document: A
| ENP | Entry into the national phase | Ref document number: 20237004203; Country of ref document: KR; Kind code of ref document: A
| ENP | Entry into the national phase | Ref document number: 2021852338; Country of ref document: EP; Effective date: 20230202
| NENP | Non-entry into the national phase | Ref country code: DE