WO2021135658A1 - Map creation method, apparatus, device and storage medium for an autonomous robot - Google Patents
Map creation method, apparatus, device and storage medium for an autonomous robot
- Publication number
- WO2021135658A1 (PCT/CN2020/128027)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- boundary
- autonomous robot
- map
- initial
- detection
- Prior art date
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
- G05D1/0274—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0242—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/60—Intended control result
- G05D1/648—Performing a task within a working area or space, e.g. cleaning
- G05D1/6484—Performing a task within a working area or space, e.g. cleaning by taking into account parameters or characteristics of the working area or space, e.g. size or shape
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0214—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0257—Control of position or course in two dimensions specially adapted to land vehicles using a radar
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0259—Control of position or course in two dimensions specially adapted to land vehicles using magnetic or electromagnetic means
- G05D1/0263—Control of position or course in two dimensions specially adapted to land vehicles using magnetic or electromagnetic means using magnetic strips
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0276—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
- G05D1/0285—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using signals transmitted via a public communication network, e.g. GSM network
Definitions
- This specification relates to the technical field of autonomous robots, in particular to a map creation method, device, equipment and storage medium of an autonomous robot.
- Autonomous robots generally need to create a map in advance (hereinafter referred to as map creation) to move and perform tasks based on the map.
- At present, some autonomous robot maps are created by a user who operates a mapping device with a positioning function on site, traversing the boundary of the work area to delineate it, and then manually drawing the distribution of obstacles on that basis. Obviously, this manual map creation method is labor-intensive and inefficient.
- Some users also try to delineate the work area and the obstacle distribution with commercial map software such as Google Earth. This can greatly reduce the labor intensity of map creation and improve its efficiency, but because the accuracy of such commercial map software is limited, the accuracy of the resulting map may not meet the requirements. How to improve the map creation accuracy and efficiency of autonomous robots has therefore become a pressing technical problem.
- the purpose of the embodiments of this specification is to provide a map creation method, device, equipment and storage medium of an autonomous robot, so as to improve the map creation accuracy and map creation efficiency of the autonomous robot.
- To this end, an embodiment of this specification provides a method for creating a map of an autonomous robot, including: obtaining an initial map selected from a target database, the initial map including an initial boundary; enabling the autonomous robot to perform boundary detection to obtain a measured boundary, the positioning accuracy of the autonomous robot meeting a preset condition; and correcting the initial boundary according to the measured boundary.
- The obtaining of the initial map selected from the target database includes: sending a plot selection request carrying plot identification information to a server; and receiving a plot selection response carrying a plot map and using the plot map as the initial map, the plot map being matched by the server from a plot database according to the plot identification information. The plot identification information includes any one of the following: a plot identifier; a location address corresponding to the plot.
- The enabling of the autonomous robot to perform boundary detection includes: judging whether a boundary point is detected according to detection data collected by a boundary detection device of the autonomous robot; when a boundary point is detected, recording the heading angle of the autonomous robot at that moment and the first coordinate output by the positioning device of the autonomous robot; determining the second coordinate of the boundary point according to the first coordinate, the heading angle, the distance between the positioning device and the boundary detection device, and the detection data; and forming the measured boundary from the second coordinates of the detected boundary points.
- the boundary detection device includes a vision sensor;
- During the boundary detection, the autonomous robot is kept at a preset safe distance from the initial boundary. Keeping the autonomous robot at a preset safe distance from the initial boundary includes: keeping the autonomous robot at a first safe distance from dangerous boundary sections of the initial boundary, and at a second safe distance from non-hazardous boundary sections of the initial boundary, the second safe distance being smaller than the first safe distance.
- In the process of enabling the autonomous robot to perform boundary detection, it is confirmed whether the boundary section to be detected is a dangerous boundary section. When the boundary section to be detected is a dangerous boundary section, it is confirmed whether there is manual monitoring at the detection site; when manual monitoring is confirmed at the detection site, the autonomous robot is caused to start or continue boundary detection; when it is confirmed that there is no manual monitoring at the detection site, the autonomous robot is stopped. When the boundary section to be detected is a non-dangerous boundary section, the autonomous robot is caused to start or continue boundary detection.
- Confirming whether there is manual monitoring at the detection site includes any one or more of the following: confirming whether there is manual monitoring at the detection site based on whether the wireless communication module of the autonomous robot receives a wireless signal transmitted by a designated device, the designated device being carried by on-site monitoring personnel; and confirming whether there is manual monitoring at the detection site based on whether the imaging detector of the autonomous robot detects a human body signal within a designated detection radius.
- Correcting the initial boundary according to the measured boundary includes: replacing the initial boundary with the measured boundary.
- On the other hand, an embodiment of this specification also provides a map creation device for an autonomous robot, including:
- an initial boundary acquisition module, configured to acquire an initial map selected from the target database, the initial map including an initial boundary;
- a measured boundary acquisition module, configured to enable the autonomous robot to perform boundary detection to obtain a measured boundary, the positioning accuracy of the autonomous robot meeting a preset condition; and
- an initial boundary correction module, configured to correct the initial boundary according to the measured boundary.
- the embodiment of the present specification also provides an autonomous robot equipped with the above-mentioned map creation device.
- the embodiment of this specification also provides a computer storage medium on which a computer program is stored, and the computer program is executed by a processor to realize the above-mentioned map creation method.
- As can be seen from the technical solutions above, after obtaining the initial map selected from the target database, the embodiments of this specification can make the autonomous robot perform boundary detection to obtain the measured boundary; because the positioning accuracy of the autonomous robot satisfies the preset condition, the measured boundary has higher accuracy than the initial boundary. After the initial boundary is corrected according to the measured boundary, the obtained work area map therefore has higher accuracy.
- the embodiment of the present specification can directly obtain the initial map selected from the target database, and can automatically optimize the initial boundary according to the initial map, thereby also improving the map creation efficiency of the autonomous robot.
- FIG. 1 is a schematic diagram of an autonomous robot in some embodiments of this specification.
- FIG. 2 is a structural block diagram of an autonomous robot in some embodiments of this specification.
- FIG. 3 is a schematic diagram of selecting a working area in some embodiments of this specification.
- FIG. 4 is a schematic diagram of selecting a working area in some other embodiments of this specification.
- FIG. 5 is a schematic diagram of an initial map of a working area in an embodiment of this specification.
- FIG. 6 is a schematic diagram of boundary detection of an autonomous robot in some embodiments of this specification.
- FIG. 7 is a schematic diagram of determining boundary point coordinates in some embodiments of this specification.
- FIG. 8 is a schematic diagram of setting a safety distance in a working area in an embodiment of this specification.
- FIG. 9 is a schematic diagram of correcting the initial boundary according to the measured boundary in some embodiments of this specification.
- FIG. 10 is a schematic diagram of a corrected work area map in some embodiments of this specification.
- FIG. 11 is a flowchart of a boundary detection method of an autonomous robot in some embodiments of this specification.
- Referring to FIG. 1, the autonomous robot 100 (or self-moving robot) of some embodiments of this specification carries all necessary sensors and control devices in its body and can independently complete certain tasks without external human information input and control during operation; that is, the autonomous robot 100 can autonomously move within the work area 200 and perform work tasks.
- the autonomous robot 100 may include a smart lawn mower, an automatic cleaning device, an automatic watering device, or an automatic snow sweeper.
- the autonomous robot of the embodiment of the present specification may be equipped with a map creation device, which is intended to improve the map creation accuracy and map creation efficiency of the autonomous robot.
- As shown in FIG. 2, the map creation device may include an initial boundary acquisition module 21, a measured boundary acquisition module 22 and an initial boundary correction module 23.
- the initial boundary acquiring module 21 may be used to acquire an initial map selected from the target database, the initial map including the initial boundary.
- the actual measurement boundary acquisition module 22 may be used to enable the autonomous robot to perform boundary detection to obtain the actual measurement boundary; the positioning accuracy of the autonomous robot meets a preset condition.
- the initial boundary correction module 23 may be used to correct the initial boundary according to the measured boundary.
- It can thus be seen that, based on directly obtaining the initial map selected from the target database, the map creation device can make the autonomous robot perform boundary detection to obtain the measured boundary; because the positioning accuracy of the autonomous robot meets the preset condition, the measured boundary has higher accuracy than the initial boundary, and after the initial boundary is corrected according to the measured boundary, the obtained work area map has higher accuracy.
- the map creation device can directly obtain the initial map selected from the target database, and can automatically optimize the initial boundary according to the initial map, thereby improving the map creation efficiency of the autonomous robot and improving the user experience.
- the target database may be a parcel database (or called a parcel set).
- the initial boundary acquisition module 21 can acquire the initial map selected from the target database in any suitable manner, so as to reduce the labor intensity of the user and improve the efficiency of map creation.
- the initial maps of multiple plots are pre-stored in the plot database for the user to select.
- The initial map of each plot in the plot database can be generated by the manufacturer of the autonomous robot (or another party such as a service provider) based on a geographic information system (GIS) of an administrative agency (for example, a land resource database) or on electronic maps of enterprises or other organizations (such as Google Earth, Baidu Maps, or Google Maps).
- Usually, for reasons of demand and cost, the map accuracy of such geographic information systems and electronic maps is at the civilian level, so the accuracy of the initial map obtained by the initial boundary acquisition module 21 is generally not high.
- the autonomous robot can communicate with the server.
- For example, as shown in FIG. 3, the initial boundary acquisition module 21 of the autonomous robot may send a plot selection request carrying plot identification information to the server, receive a plot selection response carrying a plot map, and use the plot map as the initial map; the plot map may be matched by the server from the plot database according to the plot identification information.
- the plot identification information may be, for example, a plot identifier, or a location address corresponding to the plot.
- the initial boundary acquisition module 21 may obtain the plot identification information based on the user's input operation.
- the plot identifier may be a character string (for example, Sudi 2019-WG-8), which is used to uniquely identify a plot map.
- the location address corresponding to the plot may be a communication address of the plot.
- the autonomous robot can communicate with a third party.
- an autonomous robot may receive an initial map sent by a third party; the initial map may be selected by the third party from a land parcel database.
- As shown in FIG. 4, the third party can initiate a plot selection request carrying plot identification information to the server based on the user's input operation; the server can match the corresponding plot map from the plot database according to the plot identification information and return it to the third party, and the third party can then provide the plot map to the autonomous robot.
- Correspondingly, after the initial boundary acquisition module 21 of the autonomous robot obtains the plot map, it can use it as the initial map.
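- As an illustrative sketch only, the following Python snippet shows what the plot selection exchange described above could look like. The endpoint URL, field names, and the use of the `requests` library are assumptions made for the example and are not part of this disclosure.

```python
import requests  # assumed HTTP client; the actual transport is not specified in this disclosure

SERVER_URL = "https://example.com/api/plot-selection"  # hypothetical endpoint


def fetch_initial_map(plot_identifier=None, location_address=None):
    """Send a plot selection request carrying plot identification information
    and return the matched plot map to be used as the initial map."""
    # The plot identification information is either a plot identifier
    # (a unique string) or the location address corresponding to the plot.
    payload = {"plot_id": plot_identifier, "address": location_address}
    response = requests.post(SERVER_URL, json=payload, timeout=10)
    response.raise_for_status()
    # The server matches the plot map from its plot database and returns it in
    # the plot selection response; a JSON body with a "plot_map" field is assumed.
    return response.json()["plot_map"]


# Example: request by plot identifier
initial_map = fetch_initial_map(plot_identifier="Sudi 2019-WG-8")
```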
- the target database may also be an electronic map database (hereinafter referred to as electronic map).
- the third party can form a plot map selected by the user based on the user's area selection operation on an electronic map (such as Google Earth, Baidu map, Google map, etc.), and send it to the autonomous robot.
- Correspondingly, after the initial boundary acquisition module 21 of the autonomous robot receives the plot map, it can use it as the initial map.
- The area selection operation may be, for example, the user using a finger, a mouse, or a keyboard to encircle an area on the electronic map displayed by the third party, that area then serving as the initial map.
- the above-mentioned third party may be a desktop computer, a tablet computer, a notebook computer, a smart phone, a digital assistant, a smart wearable device, and the like.
- smart wearable devices may include smart bracelets, smart watches, smart glasses, smart helmets, and so on.
- Of course, the third party is not limited to the above-mentioned physical electronic equipment; it may also be software running on such electronic equipment.
- the server may be an electronic device with computing and network interaction functions; it may also be software that runs on the electronic device and provides business logic for data processing and network interaction.
- communication between the third party and the server, between the autonomous robot and the server, and between the third party and the autonomous robot can be through wired or wireless communication, so as to realize data interaction.
- the third party when it is a smart phone, it can communicate with the server through a mobile communication network, and can communicate with an autonomous robot through Bluetooth or other methods.
- That the positioning accuracy of the autonomous robot meets the preset condition may mean that the positioning device of the autonomous robot has high positioning accuracy, satisfying high-precision positioning requirements.
- the positioning accuracy of the positioning device of the autonomous robot can reach the decimeter level, centimeter level or even higher, for example.
- the positioning device of the autonomous robot may be a differential positioning device based on a differential positioning technology (for example, RTK (Real-time kinematic, real-time dynamic) carrier phase difference technology).
- The initial map of the work area is generally a planar map, which shows the initial boundary (i.e., the outline) of the work area, the distribution of obstacles in and around the work area, and so on. Of course, the initial map is also provided with coordinate information for each position point.
- For example, in the exemplary embodiment shown in FIG. 5, plot m is a roughly rectangular plot bounded by multiple roads, with obstacles such as houses and swimming pools distributed within it, and the south side of plot m adjoins a river.
- the accuracy of the initial map of each plot in the plot database is limited. Therefore, in order to improve the map creation accuracy of the autonomous robot, it is necessary to perform boundary detection on the initial boundary to optimize the initial boundary based on the measured boundary.
- In some embodiments of this specification, based on the data provided by the boundary detection device (for example, the white circle mark in FIG. 6) and the positioning device (for example, the black triangle mark in FIG. 6) configured on the autonomous robot, the measured boundary acquisition module 22 can make the autonomous robot perform boundary detection along the initial boundary of the work area (for example, as shown in FIG. 6) to obtain the measured boundary.
- the boundary detection device may include, but is not limited to, one or more of a vision sensor, a multispectral sensor, a capacitive proximity sensor, and a radar sensor.
- To make the autonomous robot suitable for boundary detection under various working conditions, a vision sensor (or a vision sensor combined with other boundary sensors) is a preferred choice.
- For ease of understanding, the following takes the autonomous robot performing boundary detection along the initial boundary as an example to introduce the boundary detection process of the measured boundary acquisition module 22.
- However, those skilled in the art will understand that, without departing from the spirit and principles of this specification, the autonomous robot can also use any other suitable method to perform boundary detection.
- Moving along the initial boundary in this specification may mean that the autonomous robot moves within the work area while staying close to the initial boundary, and that the overall trend of its moving trajectory follows the initial boundary.
- In addition, the autonomous robot can shut down its work execution mechanism during boundary detection to improve the safety of boundary detection; taking a smart lawn mower as an example, the cutter head can be turned off during the boundary detection process.
- In some embodiments of this specification, the measured boundary acquisition module 22 may determine whether a boundary point is detected based on the detection data collected by the boundary detection device of the autonomous robot. Taking a smart lawn mower as an example, its work area is usually grass, and the edge of the grass is generally a road, a river, a fence, or a wall. Therefore, by performing image recognition on images collected by the vision sensor that serves as the boundary detection device of the smart lawn mower, it can be determined whether a boundary is detected.
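- Purely as an illustration of that idea, the sketch below checks whether the ground ahead still looks like grass in a camera frame; the green-dominance heuristic and thresholds are assumptions made for the example, not the recognition method actually used.

```python
import numpy as np


def boundary_detected(frame_rgb, green_ratio_threshold=0.5):
    """Rudimentary sketch: decide whether the strip of the image ahead of the
    robot still looks like grass. frame_rgb is an (H, W, 3) uint8 array."""
    h = frame_rgb.shape[0]
    strip = frame_rgb[h // 2:, :, :].astype(np.int32)  # lower half = ground ahead
    r, g, b = strip[..., 0], strip[..., 1], strip[..., 2]
    # A pixel is treated as "grass-like" if green clearly dominates red and blue.
    grass_like = (g > r + 10) & (g > b + 10)
    green_ratio = grass_like.mean()
    # If too little of the strip looks like grass, assume a road, river, fence
    # or wall has entered the view, i.e. a boundary point is detected.
    return green_ratio < green_ratio_threshold
```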
- When a boundary point is detected, the measured boundary acquisition module 22 can record the heading angle of the autonomous robot and the first coordinate output by the positioning device of the autonomous robot at that moment; it can then determine the second coordinate of the boundary point from the first coordinate, the heading angle, the distance between the positioning device and the boundary detection device, and the detection data. In this way, the measured boundary can be formed from the second coordinates of the detected boundary points.
- For example, in the exemplary embodiment shown in FIG. 7, point A is the location of the positioning device of the autonomous robot and point B is the location of its boundary detection device, with the heading of the autonomous robot in the direction of the arrowed dotted line in FIG. 7. The heading angle recorded at this moment is θ, and the recorded coordinates of point A are (x0, y0). Since the positioning device and the boundary detection device are both fixed on the autonomous robot in advance, the distance AB (i.e., the length c in FIG. 7) is known, so the coordinates of point B, (x1, y1), can be calculated from the heading angle θ and the coordinates of point A. The boundary detection device can also measure its distance BC to the detected boundary point (point C in FIG. 7), i.e., the length a in FIG. 7: when the boundary detection device is a radar sensor, the distance BC can be measured by the radar sensor; when the boundary detection device is a single vision sensor, the distance BC can be measured based on the monocular vision positioning and ranging principle. Since the angle ABC is known to be a right angle, the distance AC (the length b in FIG. 7) can be calculated from the angle ABC and the side lengths a and c, and the coordinates of point C, (x2, y2), can then be calculated.
- Proceeding in this way, the second coordinate of each detected boundary point can be obtained, and the measured boundary can then be formed from the second coordinates of these boundary points.
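- A minimal numerical sketch of this computation is shown below. It assumes the heading angle θ is measured from the x-axis, that the boundary detection device lies a distance c ahead of the positioning device along the heading, and that the measured distance a to the boundary point is taken perpendicular to the heading; since the original formula is not reproduced in this text, the side (sign) convention is an assumption made for the example.

```python
import math


def boundary_point_coordinates(x0, y0, heading, c, a, side=+1.0):
    """Compute the second coordinate (x2, y2) of a detected boundary point.

    x0, y0  -- first coordinate output by the positioning device (point A)
    heading -- heading angle of the robot in radians, measured from the x-axis
    c       -- fixed distance from the positioning device to the boundary detector (A to B)
    a       -- measured distance from the boundary detector to the boundary point (B to C)
    side    -- +1.0 if the boundary point lies to the robot's right, -1.0 for the left
    """
    # Point B: the boundary detection device, offset from A along the heading.
    x1 = x0 + c * math.cos(heading)
    y1 = y0 + c * math.sin(heading)
    # Point C: offset from B perpendicular to the heading (angle ABC is a right angle).
    x2 = x1 + side * a * math.sin(heading)
    y2 = y1 - side * a * math.cos(heading)
    return x2, y2


# Example: robot at (10.0, 5.0), heading 30 degrees, c = 0.4 m, measured a = 1.2 m
print(boundary_point_coordinates(10.0, 5.0, math.radians(30.0), 0.4, 1.2))
```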
- The initial boundary correction module 23 can then correct the initial boundary according to the measured boundary; that is, it can replace the initial boundary with the measured boundary, thereby optimizing the initial map of the work area and obtaining a more accurate work map.
- For example, in the exemplary embodiment shown in FIG. 9, the autonomous robot performs boundary detection along the initial boundary of the work area (plot m), shown as the thick solid rectangle in FIG. 9, and obtains the measured boundary enclosed by the dashed line in FIG. 9.
- As shown in FIG. 10, replacing the initial boundary with the measured boundary yields a more accurate work area boundary (see the closed boundary enclosed by the thick dashed line in FIG. 10).
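- As a trivial sketch of this correction step, assuming the work area map is held as a dictionary with a "boundary" entry (a representation chosen only for the example):

```python
def correct_initial_boundary(work_area_map, measured_boundary):
    """Replace the initial boundary of the work area map with the measured boundary.

    work_area_map     -- dict with a "boundary" entry holding a list of (x, y) points
    measured_boundary -- list of (x, y) second coordinates of detected boundary points
    """
    corrected = dict(work_area_map)                    # keep obstacles and other map content
    corrected["boundary"] = list(measured_boundary)    # the measured boundary supersedes the initial one
    return corrected
```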
- To ensure the operational safety of the autonomous robot, the measured boundary acquisition module 22 may also keep the autonomous robot at a preset safe distance from the initial boundary while the autonomous robot performs boundary detection along the initial boundary.
- In some cases, part or all of the work area boundary may adjoin dangerous areas such as roads, rivers, swimming pools, or cliffs.
- To prevent the autonomous robot from entering a dangerous area, while the autonomous robot performs boundary detection along the initial boundary, the measured boundary acquisition module 22 may keep the autonomous robot at a slightly larger safe distance from the dangerous boundary sections of the initial boundary (for example, 30% larger than the default safe distance), as shown by d2 in FIG. 8.
- the dangerous boundary section may refer to the boundary section adjacent to the dangerous area in the initial boundary.
- In other cases, part or all of the work area boundary may adjoin non-hazardous areas such as fences and walls. Therefore, on the premise of ensuring the operational safety of the autonomous robot, and in order to maintain its work coverage, the measured boundary acquisition module 22 may keep the autonomous robot at a slightly smaller safe distance (for example, smaller than the default safe distance) from the non-hazardous boundary sections of the initial boundary while it performs boundary detection along the initial boundary, as shown by d1 in FIG. 8.
- the non-hazardous boundary section may refer to the boundary section adjacent to the non-hazardous area in the initial boundary, or the boundary section that is not adjacent to the hazardous area in the initial boundary.
- In some embodiments of this specification, the dangerous areas at the work area boundary can be identified automatically by the measured boundary acquisition module 22 through image recognition, or they can be specified by the user, as actual needs dictate; this specification does not limit this.
- On the basis of the determined hazardous and non-hazardous areas, the measured boundary acquisition module 22 can automatically set safe distances for the hazardous and non-hazardous areas.
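- A minimal sketch of this safe-distance selection is given below; the concrete default distance is an illustrative value, and only the 30% margin for dangerous sections comes from the example above.

```python
DEFAULT_SAFE_DISTANCE = 1.0  # metres; illustrative value, not specified in this text


def safe_distance_for(segment_is_dangerous,
                      default=DEFAULT_SAFE_DISTANCE,
                      dangerous_margin=0.30):
    """Return the safe distance to keep from a given initial-boundary segment.

    Dangerous boundary sections (adjacent to roads, rivers, pools, cliffs) get
    a first, larger safe distance, e.g. 30% above the default; non-hazardous
    sections (fences, walls) get a second, smaller safe distance so that the
    work coverage of the robot is not reduced more than necessary.
    """
    if segment_is_dangerous:
        return default * (1.0 + dangerous_margin)   # d2 in FIG. 8
    return default                                  # d1 in FIG. 8
```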
- In some embodiments of this specification, to further improve the safety of boundary detection, the measured boundary acquisition module 22 can also confirm, while the autonomous robot performs boundary detection along the initial boundary, whether the boundary section to be detected is a dangerous boundary section, so that on-site manual monitoring can be arranged for dangerous boundary sections. Since the dangerous and non-dangerous boundary sections of the initial boundary have already been divided before boundary detection is performed, the measured boundary acquisition module 22 can confirm, based on this division, whether the boundary section to be detected is a dangerous boundary section.
- When the boundary section to be detected is a dangerous boundary section, the measured boundary acquisition module 22 can further confirm whether there is manual monitoring at the detection site. When it is confirmed that there is manual monitoring at the detection site, the measured boundary acquisition module 22 may cause the autonomous robot to start or continue boundary detection. When it is confirmed that there is no manual monitoring at the detection site, the measured boundary acquisition module 22 stops the autonomous robot and may also issue an alarm, further improving the safety of the autonomous robot's boundary detection.
- The measured boundary acquisition module 22 can detect whether there is manual monitoring on site by any suitable method; this specification does not limit it, and the method can be selected as needed. For example, in some embodiments, whether there is manual monitoring at the detection site can be confirmed based on whether the wireless communication module of the autonomous robot receives a wireless signal transmitted by a designated device.
- the designated device is carried by on-site monitoring personnel and can continuously transmit wireless signals to the outside. Therefore, when the wireless signal transmitted by the designated device is received, it can be inferred that there is manual monitoring at the detection site.
- the wireless signal may be, for example, Bluetooth, WiFi, or the like.
- As another example, in some embodiments, whether there is manual monitoring at the detection site can also be confirmed based on whether the imaging detector of the autonomous robot detects a human body signal within a designated detection radius. The imaging detector may be, for example, an infrared thermal imaging detector or an imaging radar.
- the above-mentioned designated device may be any portable device with the above-mentioned functions, for example, it may include, but is not limited to, a smart phone, a tablet computer, a notebook computer, a digital assistant, or a smart wearable device.
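- The monitoring check and the resulting start/continue/stop decision could be sketched as follows; the `robot` object and its methods are hypothetical placeholders for the robot's actual control interface.

```python
def manual_monitoring_present(wireless_signal_received, human_detected_in_radius):
    """Confirm whether there is manual monitoring at the detection site.

    wireless_signal_received -- True if the robot's wireless communication module
                                receives the signal transmitted by the designated
                                device carried by the on-site monitoring personnel
    human_detected_in_radius -- True if the imaging detector (e.g. thermal imaging
                                or imaging radar) senses a human body signal within
                                the designated detection radius
    """
    return wireless_signal_received or human_detected_in_radius


def gate_boundary_detection(segment_is_dangerous, monitoring_present, robot):
    """Start or continue boundary detection only when it is safe to do so."""
    if not segment_is_dangerous or monitoring_present:
        robot.continue_boundary_detection()
    else:
        robot.stop()  # and optionally raise an alarm, as described above
```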
- Corresponding to the map creation device above, this specification also provides a map creation method for an autonomous robot. Referring to FIG. 11, the map creation method of an autonomous robot in some embodiments of this specification may include the following steps:
- S111: obtaining an initial map selected from a target database, the initial map including an initial boundary.
- S112: enabling the autonomous robot to perform boundary detection to obtain a measured boundary, the positioning accuracy of the autonomous robot meeting a preset condition.
- S113: correcting the initial boundary according to the measured boundary.
- In the map creation method of some embodiments of this specification, obtaining the initial map selected from the target database may include: sending a plot selection request carrying plot identification information to a server; and receiving a plot selection response carrying a plot map and using the plot map as the initial map, the plot map being matched by the server from a plot database according to the plot identification information.
- The plot identification information may include any one of the following: a plot identifier; a location address corresponding to the plot.
- In some embodiments, obtaining the initial map selected from the target database may instead include: receiving an initial map sent by a third party, the initial map being selected by the third party from a plot database (or, in other embodiments, from an electronic map).
- In the map creation method of some embodiments of this specification, enabling the autonomous robot to perform boundary detection may include: judging whether a boundary point is detected according to detection data collected by the boundary detection device of the autonomous robot; when a boundary point is detected, recording the heading angle of the autonomous robot and the first coordinate output by its positioning device at that moment; determining the second coordinate of the boundary point according to the first coordinate, the heading angle, the distance between the positioning device and the boundary detection device, and the detection data; and forming the measured boundary from the second coordinates of the detected boundary points.
- the boundary detection device includes a visual sensor.
- The map creation method may further include: in the process of enabling the autonomous robot to perform boundary detection, maintaining the autonomous robot at a preset safe distance from the initial boundary.
- the maintaining a preset safe distance between the autonomous robot and the initial boundary may include:
- the autonomous robot is kept at a first safe distance from the dangerous boundary segment in the initial boundary.
- the maintaining a preset safe distance between the autonomous robot and the initial boundary may further include:
- the autonomous robot maintains a second safety distance from the non-hazardous boundary segment in the initial boundary, where the second safety distance is smaller than the first safety distance.
- The map creation method may further include: in the process of enabling the autonomous robot to perform boundary detection, confirming whether the boundary section to be detected is a dangerous boundary section; when the boundary section to be detected is a dangerous boundary section, confirming whether there is manual monitoring at the detection site; and when it is confirmed that there is manual monitoring at the detection site, causing the autonomous robot to start or continue boundary detection.
- The map creation method may further include: when it is confirmed that there is no manual monitoring at the detection site, stopping the autonomous robot.
- The map creation method may further include: when the boundary section to be detected is a non-dangerous boundary section, causing the autonomous robot to start or continue boundary detection.
- the confirmation of whether there is manual monitoring at the detection site may include any one or more of the following:
- confirming whether there is manual monitoring at the detection site based on whether the wireless communication module of the autonomous robot receives the wireless signal transmitted by a designated device, the designated device being carried by on-site monitoring personnel;
- confirming whether there is manual monitoring at the detection site based on whether the imaging detector of the autonomous robot detects a human body signal within a designated detection radius.
- In the map creation method of some embodiments of this specification, correcting the initial boundary according to the measured boundary may include: replacing the initial boundary with the measured boundary.
- These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing equipment to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device that implements the functions specified in one or more processes of the flowchart and/or one or more blocks of the block diagram.
- These computer program instructions may also be loaded onto a computer or other programmable data processing equipment, so that a series of operational steps are executed on the computer or other programmable equipment to produce computer-implemented processing, whereby the instructions executed on the computer or other programmable equipment provide steps for implementing the functions specified in one or more processes of the flowchart and/or one or more blocks of the block diagram.
- the computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
- the memory may include non-permanent memory in a computer readable medium, random access memory (RAM) and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of computer readable media.
- Computer-readable media include permanent and non-permanent, removable and non-removable media, and information storage can be realized by any method or technology.
- the information can be computer-readable instructions, data structures, program modules, or other data.
- Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic disk storage or other magnetic storage devices, or any other non-transmission media that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media such as modulated data signals and carrier waves.
- this specification can be provided as a method, a system or a computer program product. Therefore, this specification may adopt the form of a complete hardware embodiment, a complete software embodiment, or an embodiment combining software and hardware. Moreover, this specification can take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) containing computer-usable program codes.
- This specification can be described in the general context of computer-executable instructions executed by a computer, such as program modules. Generally, program modules include routines, programs, objects, components, data structures, and so on that perform specific tasks or implement specific abstract data types.
- This specification can also be practiced in distributed computing environments where tasks are performed by remote processing devices connected through a communication network.
- program modules can be located in local and remote computer storage media including storage devices.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Aviation & Aerospace Engineering (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Electromagnetism (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
A map creation method, apparatus, device and storage medium for an autonomous robot (100). The map creation method of the autonomous robot (100) includes: obtaining an initial map selected from a target database, the initial map including an initial boundary (S111); enabling the autonomous robot (100) to perform boundary detection to obtain a measured boundary, the positioning accuracy of the autonomous robot (100) meeting a preset condition (S112); and correcting the initial boundary according to the measured boundary (S113). The map creation method of the autonomous robot (100) can improve the map creation accuracy and map creation efficiency of the autonomous robot (100).
Description
本申请要求了申请日为2020年01月02日,申请号为202010002699.0的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
本说明书涉及自主机器人技术领域,尤其是涉及一种自主机器人的地图创建方法、装置、设备及存储介质。
自主机器人一般需要预先创建地图(以下简称地图创建),以基于该地图移动并执行作业任务。目前,一些自主机器人的地图创建,是由用户通过探测现场操作带定位功能的绘图设备遍历工作区域边界,以圈定工作区域,然后在此基础上人工绘制出障碍物分布。显然,这种人工地图创建方式的劳动强度较大,效率低。
此外,也有一些用户尝试利用谷歌地球等商用地图软件,人工圈定工作区域和障碍物分布。相对而言,这种基于商用地图软件的地图创建方式可以大幅降低地图创建的劳动强度,并可以提高地图创建效率。但是,由于这些商用地图软件的精度有限,其所建地图的精度可能难以满足要求。
因此,如何提高自主机器人的地图创建精度及地图创建效率,已成为目前亟待解决的技术问题。
发明内容
本说明书实施例的目的在于提供一种自主机器人的地图创建方法、装置、设备及存储介质,以提高自主机器人的地图创建精度及地图创建效率。
为达到上述目的,一方面,本说明书实施例提供了一种自主机器人的地图创建方法,包括:
获取从目标数据库中选定的初始地图,所述初始地图包括初始边界;
使自主机器人执行边界探测,以获取实测边界;所述自主机器人的定位精度满足 预设条件;根据所述实测边界修正所述初始边界;
所述获取从目标数据库中选定的初始地图,包括:向服务器发送携带地块标识信息的地块选择请求;接收携带地块地图的地块选择响应,并将所述地块地图作为初始地图;所述地块地图由所述服务器根据所述地块标识信息从地块数据库中匹配得到;所述地块标识信息包括以下中的任意一种:地块标识符;与所述地块对应的位置地址;
所述使自主机器人执行边界探测,包括:根据所述自主机器人的边界探测装置采集的探测数据,判断是否探测到边界点;当探测到边界点时,记录此时所述自主机器人的航向角及所述自主机器人的定位装置输出的第一坐标;根据所述第一坐标、所述航向角、所述定位装置与所述边界探测装置的距离以及所述探测数据,确定所述边界点的第二坐标;根据各个被探测到的边界点的第二坐标形成实测边界;所述边界探测装置包括视觉传感器;
在所述使自主机器人执行边界探测的过程中,使所述自主机器人与所述初始边界保持预设的安全距离;所述使所述自主机器人与所述初始边界保持预设的安全距离,包括:使所述自主机器人与所述初始边界中的危险边界段保持第一安全距离、使所述自主机器人与所述初始边界中的非危险边界段保持第二安全距离,所述第二安全距离小于所述第一安全距离;。
在所述使自主机器人执行边界探测的过程中,确认待探测边界段是否为危险边界段;当所述待探测边界段为危险边界段时,确认探测现场是否有人工监视;当确认探测现场有人工监视时,使所述自主机器人开始或继续执行边界探测;当确认探测现场无人工监视,使所述自主机器人停机;当所述待探测边界段为非危险边界段时,使所述自主机器人开始或继续执行边界探测;
所述确认探测现场是否有人工监视,包括以下中的任意一种或多种:基于所述自主机器人的无线通信模块是否接收到指定设备发射的无线信号,确认探测现场是否有人工监视;所述指定设备由现场监视人员携带;基于所述自主机器人的成像探测器是否在指定探测半径内探测到人体信号,确认探测现场是否有人工监视;
所述根据所述实测边界修正所述初始边界,包括:用所述实测边界替换所述初始边界。
另一方面,本说明书实施例还提供了一种自主机器人的地图创建装置,包括:
初始边界获取模块,用于获取从目标数据库中选定的初始地图,所述初始地图包 括初始边界;
实测边界获取模块,用于使自主机器人执行边界探测,以获取实测边界;所述自主机器人的定位精度满足预设条件;
初始边界修正模块,用于根据所述实测边界修正所述初始边界。
另一方面,本说明书实施例还提供了一种自主机器人,所述自主机器人配置有上述的地图创建装置。
另一方面,本说明书实施例还提供了一种计算机存储介质,其上存储有计算机程序,所述计算机程序被处理器执行时实现上述的地图创建方法。
由以上本说明书实施例提供的技术方案可见,本说明书实施例在获取到从目标数据库中选定的初始地图后,可以使自主机器人执行边界探测,以获取实测边界;由于自主机器人的定位精度满足预设条件,从而使得的实测边界相对于初始边界具有更高的精度,当根据实测边界修正初始边界后,所获得的工作区域地图具有更高的精度。不仅如此,由于本说明书实施例可以直接获取从目标数据库中选定的初始地图,并可以根据初始地图自动优化初始边界,从而也提高了自主机器人的地图创建效率。
为了更清楚地说明本说明书实施例或现有技术中的技术方案,下面将对实施例或现有技术描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本说明书中记载的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动性的前提下,还可以根据这些附图获得其他的附图。在附图中:
图1为本说明书一些实施例中自主机器人的示意图;
图2为本说明书一些实施例中自主机器人的结构框图;
图3为本说明书一些实施例中选择工作区域的示意图;
图4为本说明书另一些实施例中选择工作区域的示意图;
图5为本说明书一实施例中工作区域的初始地图示意图;
图6为本说明书一些实施例中自主机器人的边界探测示意图;
图7为本说明书一些实施例中确定边界点坐标的示意图;
图8为本说明书一实施例中工作区域内设置安全距离的示意图;
图9为本说明书一些实施例中根据实测边界修正初始边界的示意图;
图10为本说明书一些实施例中修正后的工作区域地图示意图;
图11为本说明书一些实施例中自主机器人的边界探测方法的流程图。
为了使本技术领域的人员更好地理解本说明书中的技术方案,下面将结合本说明书实施例中的附图,对本说明书实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例仅仅是本说明书一部分实施例,而不是全部的实施例。基于本说明书中的实施例,本领域普通技术人员在没有作出创造性劳动前提下所获得的所有其他实施例,都应当属于本说明书保护的范围。
参考图1所示,本说明书一些实施例的自主机器人100(或称为自移动机器人)是其本体自带各种必要的传感器、控制装置,在运行过程中无外界人为信息输入和控制的条件下,可以独立完成一定的任务的机器人,即自主机器人100可以在工作区域200内自主移动并执行作业任务。例如,在本说明书一些示例性实施例中,所述自主机器人100可以包括智能割草机、自动清洁设备、自动浇灌设备或自动扫雪机等。
本说明书实施例的自主机器人可以配置有地图创建装置,所述地图创建装置旨在提高自主机器人的地图创建精度及地图创建效率。结合图2所示,在本说明书一些实施例中,所述地图创建装置可以包括初始边界获取模块21、实测边界获取模块22和初始边界修正模块。其中,初始边界获取模块21可以用于获取从目标数据库中选定的初始地图,所述初始地图包括初始边界。实测边界获取模块22可以用于使自主机器人执行边界探测,以获取实测边界;所述自主机器人的定位精度满足预设条件。初始边界修正模块23可以用于根据所述实测边界修正所述初始边界。
由此可见,在本说明书上述实施例中,基于直接获取从目标数据库中选定的初始地图,地图创建装置可以使自主机器人执行边界探测,以获取实测边界;由于自主机器人的定位精度满足预设条件,从而使得的实测边界相对于初始边界具有更高的精度,当根据实测边界修正初始边界后,所获得的工作区域地图具有更高的精度。不仅如此,由于地图创建装置可以直接获取从目标数据库中选定的初始地图,并可以根据初始地图自动优化初始边界,从而也提高了自主机器人的地图创建效率,提高了用户体验。
在本说明书一些实施例中,目标数据库可以是地块数据库(或称为地块集合)。初始边界获取模块21可以通过任何合适的方式获取从目标数据库中选定的初始地图, 以降低用户的劳动强度,提高地图创建效率。其中,地块数据库中预先保存有多个地块的初始地图,以供用户选择。地块数据库中各个地块的初始地图,可以由自主机器人的制造商(或服务提供方等其他方)基于行政机构的地理信息系统(Geographic Information System,简称GIS)(例如土地资源库等)、企业机构或其他组织机构的电子地图(例如谷歌地球、百度地图、谷歌地图等)生成。通常,基于需求、成本等方面的考虑,地理信息系统和电子地图的地图精度一般为民用级别,从而使得初始边界获取模块21获得的初始地图的精度普遍不高。
在本说明书一些实施例中,自主机器人可以与服务器通信。例如图3所示,自主机器人的初始边界获取模块21可以向服务器发送携带地块标识信息的地块选择请求,接收携带地块地图的地块选择响应,并将所述地块地图作为初始地图;所述地块地图可由所述服务器根据所述地块标识信息从地块数据库中匹配得到。其中,所述地块标识信息例如可以为地块标识符,或与所述地块对应的位置地址等。在发起地块选择请求前,初始边界获取模块21可以基于用户的输入操作获得所述地块标识信息。例如,在一示例性实施例中,地块标识符可以为一个字符串(例如苏地2019-WG-8),用于唯一标识一个地块地图。再如,在另一示例性实施例中,与所述地块对应的位置地址可以为地块的通信地址。
在本说明书另一些实施例中,自主机器人可以与第三方通信。例如,自主机器人可以接收第三方发送的初始地图;所述初始地图可由所述第三方从地块数据库选取得到。如图4所示,第三方可基于用户的输入操作向服务器发起携带地块标识信息的地块选择请求;服务器根据所述地块标识信息可以从地块数据库中匹配出对应的地块地图,并将其返回给第三方;而第三方可以将所述地块地图提供给自主机器人。相应的,自主机器人的初始边界获取模块21在得到所述地块地图后,可以将其作为初始地图。
在本说明书另一些实施例中,目标数据库也可以是电子地图数据库(以下简称电子地图)。相应地,第三方可基于用户在电子地图(例如谷歌地球、百度地图、谷歌地图等)上的区域选择操作,形成用户所选定的地块地图,并将其发送给自主机器人。相应的,自主机器人的初始边界获取模块21在接收到所述地块地图后,可以将其作为初始地图。其中,所述区域选择操作例如可以是用户通过手指、鼠标或键盘等,在第三方展示的电子地图上圈定出的区域作为初始地图。
在本说明书一些实施例中,上述第三方可以为台式电脑、平板电脑、笔记本电脑、 智能手机、数字助理、智能可穿戴设备等。其中,智能可穿戴设备可以包括智能手环、智能手表、智能眼镜、智能头盔等。当然,所述第三方可并不限于上述具有一定实体的电子设备,其还可以为运行于上述电子设备中的软体。
在本说明书一些实施例中,服务器可以为具有运算和网络交互功能的电子设备;也可以为运行于该电子设备中,为数据处理和网络交互提供业务逻辑的软体。
在本说明书一些实施例中,第三方与服务器之间,自主机器人与服务器之间,以及第三方与自主机器人之间可以通过有线或无线等方式进行通信,从而实现数据交互。例如,在一个典型应用场景中,当第三方为智能手机时,其可以通过移动通信网络与服务器进行通信,并可通过蓝牙等方式与自主机器人进行通信。
在本说明书一些实施例中,自主机器人的定位精度满足预设条件可以是指自主机器人的定位装置具有较高的定位精度,以满足高精度定位要求。在一些实施例中,自主机器人的定位装置的定位精度例如可以达到分米级、厘米级甚至更高。比如,在一示例性实施例中,自主机器人的定位装置可以为基于差分定位技术(例如RTK(Real-time kinematic,实时动态)载波相位差分技术)的差分定位装置等。
在本说明书一些实施例中,工作区域的初始地图一般为平面地图,该平面地图展示了工作区域的初始边界(即轮廓)、工作区域内及其周边的障碍物分布等,当然,初始地图还配置有各个位置点的坐标信息。例如,在图5所示的示例性实施例中,地块m是由多条道路分割而成的大致为矩形的地块,其内分布有房屋、泳池等障碍物,地块m的南侧与河流毗邻。
在上文已经阐明,地块数据库中各个地块的初始地图的精度有限,因此,为了提高自主机器人的地图创建精度,需要对初始边界进行边界探测,以根据实测边界优化初始边界。在本说明书一些实施例中,基于自主机器人配置的边界探测装置(例如图6中的白色圆形标记)和定位装置(例如图6中的黑色三角形标记)等提供的数据,实测边界获取模块22可以使自主机器人沿工作区域的初始边界执行边界探测(例如图6所示),以获取实测边界。在一些实施例中,所述边界探测装置可以包括但不限于视觉传感器、多光谱传感器、电容式接近传感器和雷达传感器中的一种或多种。为了能使自主机器人可以适用于各种工况的边界探测,视觉传感器(或视觉传感器+其他边界传感器的组合)为较佳的选择。
为了便于理解,下面以自主机器人沿初始边界执行边界探测为例,介绍实测边界 获取模块22的边界探测过程。但是,本领域技术人员可以理解,在不脱离本说明书的精神和原理的前提下,自主机器人也可以采用其他任何合适的方式执行边界探测。需要指出的是,本说明书中的沿初始边界可以是指自主机器人在工作区域内以靠近初始边界的方式移动,且其移动轨迹的总体趋势是沿初始边界的。此外,自主机器人在进行边界探测的过程可以关闭其作业执行机构,以提高边界探测安全。例如,以智能割草机为例,可以在边界探测的过程关闭刀盘。
在本说明书一些实施例中,所述实测边界获取模块22可以根据所述自主机器人的边界探测装置采集的探测数据,判断是否探测到边界点。例如以智能割草机为例,智能割草机的工作区域内通常为草地,而草地边缘一般为道路、河流、栅栏或围墙等。因此,通过对智能割草机的作为边界探测装置的视觉传感器采集的图像进行图像识别,可以判断是否探测到边界。
当探测到边界点时,实测边界获取模块22可以记录此时所述自主机器人的航向角及所述自主机器人的定位装置输出的第一坐标;然后可以根据所述第一坐标、所述航向角、所述定位装置与所述边界探测装置的距离以及所述探测数据,确定所述边界点的第二坐标。由此,根据各个被探测到的边界点的第二坐标就可以形成实测边界。
例如，在图7所示的示例性实施例中，A点为自主机器人的定位装置所在位置，B点为自主机器人的边界探测装置所在位置，自主机器人的航向如图7中的带箭头虚线方向所示，此时记录的自主机器人的航向角为θ，记录的A点坐标为(x0, y0)。由于定位装置和边界探测装置都是预先固定在自主机器人上的，距离AB（即图7中的c的长度）已知。因此，根据航向角θ和A点坐标(x0, y0)可以计算出B点坐标(x1, y1)，由于自主机器人的边界探测装置可以测得其与被探测到的边界点（即图7中的C点）的距离BC（即图7中的a的长度）。例如，当边界探测装置为雷达传感器时，通过雷达传感器可以测得距离BC；再如，当边界探测装置为单个视觉传感器时，基于单目视觉定位测距原理也可以测得距离BC。由于∠ABC为直角已知，基于∠ABC、边长a和边长c，可以计算得到距离AC（即图7中的b的长度）。则根据以下公式可以计算得到C点坐标(x2, y2)：
如此,依次递推,通过上述方式可以获得每个被探测到的边界点的第二坐标,然后根据这些边界点的第二坐标就可以形成实测边界。
如此,初始边界修正模块23就可以根据所述实测边界修正所述初始边界,即初始边界修正模块23可以用所述实测边界替换所述初始边界,从而实现对工作区域的初始地图优化,即获得了更为精确的工作地图。例如,在图9所示的示例性实施例中,自主机器人沿工作区域(即地块m)的初始边界(参见图9中的粗实线所示的矩形)进行边界探测,获得了如图9中虚线所围成的实测边界。结合图10所示,用实测边界替换初始边界,即可得到更为精确的工作区域边界(参见图10中的粗虚线所围成的闭合边界)。
为了保证自主机器人的作业安全,所述实测边界获取模块22在所述使自主机器人沿所述初始边界执行边界探测的过程中,还可以使所述自主机器人与所述初始边界保持预设的安全距离。
在一些情况下,工作区域边界处的部分或全部可能为道路、河道、泳池或悬崖等危险区域。为了防止自主机器人进入危险区域,在使自主机器人沿所述初始边界执行边界探测的过程中,所述实测边界获取模块22可以使所述自主机器人与所述初始边界中的危险边界段保持稍大一点的安全距离(比如比默认的安全距离大30%),例如图8中的d2所示。其中,危险边界段可以是指初始边界中与危险区域相邻的边界段。
在另一些情况下,工作区域边界处的部分或全部也可能为栅栏、围墙等非危险区域。因此,在保证自主机器人的作业安全的前提下,为了兼顾自主机器人的作业覆盖率,对于初始边界中的非危险边界段,在使自主机器人沿所述初始边界执行边界探测的过程中,所述实测边界获取模块22可以使所述自主机器人与非危险边界段保持稍小一点的安全距离(比如为比默认的安全距离),例如图8中的d1所示。其中,非危险边界段可以是指初始边界中与非危险区域相邻的边界段,或者初始边界中不与危险区域相邻的边界段。
在本说明书一些实施例中,工作区域边界处的危险区域识别可以由所述实测边界获取模块22通过图像识别的方式自动确定,也可以由用户指定,具体可以根据需要确定,本说明书对此不做限定。在确定危险区域和非危险区域的基础上,所述实测边界获取模块22可以自动为危险区域和非危险区域设定安全距离。
在本说明书一些实施例中,为了进一步提高边界探测的安全性。在所述使自主机 器人沿所述初始边界执行边界探测的过程中,所述实测边界获取模块22还可以确认待探测边界段是否为危险边界段,以便于对危险边界段实施现场人工监视。由于在执行边界探测之前,已划分好初始边界中的危险边界段和非危险边界段,因此,基于这种划分结果,所述实测边界获取模块22可以确认待探测边界段是否为危险边界段。
在本说明书一些实施例中,当待探测边界段为危险边界段时,所述实测边界获取模块22可以进一步确认探测现场是否有人工监视。当确认探测现场有人工监视时,所述实测边界获取模块22可以使所述自主机器人开始或继续执行边界探测。当确认探测现场无人工监视,所述实测边界获取模块22使所述自主机器人停机,并还可以进行报警,以进一步提高自主机器人的边界探测安全性。
在本说明书一些实施例中,所述实测边界获取模块22可以通过任何合适的方式探测现场是否有人工监视,本说明对此不作限定,具体可以根据需要选择。例如,在一些实施例中,可以基于所述自主机器人的无线通信模块是否接收到指定设备发射的无线信号,可以确认探测现场是否有人工监视。所述指定设备由现场监视人员携带,并可以持续对外发射无线信号,因此当收到指定设备发射的无线信号时,则可以推测探测现场有人工监视。所述无线信号例如可以为蓝牙、WiFi等。再如,在一些实施例中,还可以基于所述自主机器人的成像探测器是否在指定探测半径内探测到人体信号,确认探测现场是否有人工监视。其中,所述成像探测器例如可以为红外热成像探测器或成像雷达等。
上述的指定设备可是任何具有上述功能的便携式设备,例如可以包括但不限于智能手机、平板电脑、笔记本电脑、数字助理或智能可穿戴设备等。
为了描述的方便,描述以上装置或模块时以功能分为各种单元分别描述。当然,在实施本说明书时可以把各单元的功能在同一个或多个软件和/或硬件中实现。
与上述自主机器人的地图创建装置对应,本说明书还提供了自主机器人的地图创建方法。参考图11所示,本说明书一些实施例的自主机器人的地图创建方法可以包括如下步骤:
S111、获取从目标数据库中选定的初始地图,所述初始地图包括初始边界。
S112、使自主机器人执行边界探测,以获取实测边界;所述自主机器人的定位精度满足预设条件。
S113、根据所述实测边界修正所述初始边界。
本说明书一些实施例的自主机器人的地图创建方法中,所述获取从目标数据库中选定的初始地图可以包括:
向服务器发送携带地块标识信息的地块选择请求;
接收携带地块地图的地块选择响应,并将所述地块地图作为初始地图;所述地块地图由所述服务器根据所述地块标识信息从地块数据库中匹配得到。
本说明书一些实施例的自主机器人的地图创建方法中,所述地块标识信息可以包括以下中的任意一种:
地块标识符;
与所述地块对应的位置地址。
本说明书一些实施例的自主机器人的地图创建方法中,所述获取从目标数据库中选定的初始地图可以包括:
接收第三方发送的初始地图;所述初始地图由所述第三方从地块数据库选取得到。
本说明书一些实施例的自主机器人的地图创建方法中,所述获取从目标数据库中选定的初始地图还可以包括:
接收第三方发送的初始地图;所述初始地图由所述第三方从地块数据库选取得到。
本说明书一些实施例的自主机器人的地图创建方法中,所述使自主机器人执行边界探测,可以包括:
根据所述自主机器人的边界探测装置采集的探测数据,判断是否探测到边界点;
当探测到边界点时,记录此时所述自主机器人的航向角及所述自主机器人的定位装置输出的第一坐标;
根据所述第一坐标、所述航向角、所述定位装置与所述边界探测装置的距离以及所述探测数据,确定所述边界点的第二坐标;
根据各个被探测到的边界点的第二坐标形成实测边界。
本说明书一些实施例的自主机器人的地图创建方法中,所述边界探测装置包括视觉传感器。
本说明书一些实施例的自主机器人的地图创建方法还可以包括:
在所述使自主机器人执行边界探测的过程中,使所述自主机器人与所述初始边界保持预设的安全距离。
本说明书一些实施例的自主机器人的地图创建方法中,所述使所述自主机器人与 所述初始边界保持预设的安全距离,可以包括:
使所述自主机器人与所述初始边界中的危险边界段保持第一安全距离。
本说明书一些实施例的自主机器人的地图创建方法中,所述使所述自主机器人与所述初始边界保持预设的安全距离,还可以包括:
使所述自主机器人与所述初始边界中的非危险边界段保持第二安全距离,所述第二安全距离小于所述第一安全距离。
本说明书一些实施例的自主机器人的地图创建方法还可以包括:
在所述使自主机器人执行边界探测的过程中,确认待探测边界段是否为危险边界段;
当所述待探测边界段为危险边界段时,确认探测现场是否有人工监视;
当确认探测现场有人工监视时,使所述自主机器人开始或继续执行边界探测。
本说明书一些实施例的自主机器人的地图创建方法还可以包括:
当确认探测现场无人工监视,使所述自主机器人停机。
本说明书一些实施例的自主机器人的地图创建方法还可以包括:
当所述待探测边界段为非危险边界段时,使所述自主机器人开始或继续执行边界探测。
本说明书一些实施例的自主机器人的地图创建方法中,所述确认探测现场是否有人工监视,可以包括以下中的任意一种或多种:
基于所述自主机器人的无线通信模块是否接收到指定设备发射的无线信号,确认探测现场是否有人工监视;所述指定设备由现场监视人员携带;
基于所述自主机器人的成像探测器是否在指定探测半径内探测到人体信号,确认探测现场是否有人工监视。
本说明书一些实施例的自主机器人的地图创建方法中,所述根据所述实测边界修正所述初始边界,可以包括:
用所述实测边界替换所述初始边界。
虽然上文描述的过程流程包括以特定顺序出现的多个操作,但是,应当清楚了解,这些过程可以包括更多或更少的操作,这些操作可以顺序执行或并行执行(例如使用并行处理器或多线程环境)。
本发明是参照根据本发明实施例的方法、设备(系统)、和计算机程序产品的流 程图和/或方框图来描述的。应理解可由计算机程序指令实现流程图和/或方框图中的每一流程和/或方框、以及流程图和/或方框图中的流程和/或方框的结合。可提供这些计算机程序指令到通用计算机、专用计算机、嵌入式处理机或其他可编程数据处理设备的处理器以产生一个机器,使得通过计算机或其他可编程数据处理设备的处理器执行的指令产生用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的装置。
这些计算机程序指令也可存储在能引导计算机或其他可编程数据处理设备以特定方式工作的计算机可读存储器中,使得存储在该计算机可读存储器中的指令产生包括指令装置的制造品,该指令装置实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能。
这些计算机程序指令也可装载到计算机或其他可编程数据处理设备上,使得在计算机或其他可编程设备上执行一系列操作步骤以产生计算机实现的处理,从而在计算机或其他可编程设备上执行的指令提供用于实现在流程图一个流程或多个流程和/或方框图一个方框或多个方框中指定的功能的步骤。
在一个典型的配置中,计算设备包括一个或多个处理器(CPU)、输入/输出接口、网络接口和内存。
内存可能包括计算机可读介质中的非永久性存储器,随机存取存储器(RAM)和/或非易失性内存等形式,如只读存储器(ROM)或闪存(flash RAM)。内存是计算机可读介质的示例。
计算机可读介质包括永久性和非永久性、可移动和非可移动媒体可以由任何方法或技术来实现信息存储。信息可以是计算机可读指令、数据结构、程序的模块或其他数据。计算机的存储介质的例子包括,但不限于相变内存(PRAM)、静态随机存取存储器(SRAM)、动态随机存取存储器(DRAM)、其他类型的随机存取存储器(RAM)、只读存储器(ROM)、电可擦除可编程只读存储器(EEPROM)、快闪记忆体或其他内存技术、只读光盘只读存储器(CD-ROM)、数字多功能光盘(DVD)或其他光学存储、磁盒式磁带,磁盘式存储或其他磁性存储设备或任何其他非传输介质,可用于存储可以被计算设备访问的信息。按照本文中的界定,计算机可读介质不包括暂存电脑可读媒体(transitory media),如调制的数据信号和载波。
还需要说明的是,术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性 的包含,从而使得包括一系列要素的过程、方法、或者设备不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法或者设备所固有的要素。在没有更多限制的情况下,由语句“包括一个......”限定的要素,并不排除在包括所述要素的过程、方法或者设备中还存在另外的相同要素。
本领域技术人员应明白,本说明书的实施例可提供为方法、系统或计算机程序产品。因此,本说明书可采用完全硬件实施例、完全软件实施例或结合软件和硬件方面的实施例的形式。而且,本说明书可采用在一个或多个其中包含有计算机可用程序代码的计算机可用存储介质(包括但不限于磁盘存储器、CD-ROM、光学存储器等)上实施的计算机程序产品的形式。
本说明书可以在由计算机执行的计算机可执行指令的一般上下文中描述,例如程序模块。一般地,程序模块包括执行特定任务或实现特定抽象数据类型的例程、程序、对象、组件、数据结构等等。也可以在分布式计算环境中实践本说明书,在这些分布式计算环境中,由通过通信网络而被连接的远程处理设备来执行任务。在分布式计算环境中,程序模块可以位于包括存储设备在内的本地和远程计算机存储介质中。
本说明书中的各个实施例均采用递进的方式描述,各个实施例之间相同相似的部分互相参见即可,每个实施例重点说明的都是与其他实施例的不同之处。尤其,对于系统实施例而言,由于其基本相似于方法实施例,所以描述的比较简单,相关之处参见方法实施例的部分说明即可。
以上所述仅为本说明书的实施例而已,并不用于限制本说明书。对于本领域技术人员来说,本说明书可以有各种更改和变化。凡在本说明书的精神和原理之内所作的任何修改、等同替换、改进等,均应包含在本说明书的权利要求范围之内。
Claims (20)
- 一种自主机器人的地图创建方法,其特征在于,包括:获取从目标数据库中选定的初始地图,所述初始地图包括初始边界;使自主机器人执行边界探测,以获取实测边界;所述自主机器人的定位精度满足预设条件;根据所述实测边界修正所述初始边界;所述获取从目标数据库中选定的初始地图,包括:向服务器发送携带地块标识信息的地块选择请求;接收携带地块地图的地块选择响应,并将所述地块地图作为初始地图;所述地块地图由所述服务器根据所述地块标识信息从地块数据库中匹配得到;所述地块标识信息包括以下中的任意一种:地块标识符;与所述地块对应的位置地址;所述使自主机器人执行边界探测,包括:根据所述自主机器人的边界探测装置采集的探测数据,判断是否探测到边界点;当探测到边界点时,记录此时所述自主机器人的航向角及所述自主机器人的定位装置输出的第一坐标;根据所述第一坐标、所述航向角、所述定位装置与所述边界探测装置的距离以及所述探测数据,确定所述边界点的第二坐标;根据各个被探测到的边界点的第二坐标形成实测边界;所述边界探测装置包括视觉传感器;在所述使自主机器人执行边界探测的过程中,使所述自主机器人与所述初始边界保持预设的安全距离;所述使所述自主机器人与所述初始边界保持预设的安全距离,包括:使所述自主机器人与所述初始边界中的危险边界段保持第一安全距离、使所述自主机器人与所述初始边界中的非危险边界段保持第二安全距离,所述第二安全距离小于所述第一安全距离;。在所述使自主机器人执行边界探测的过程中,确认待探测边界段是否为危险边界段;当所述待探测边界段为危险边界段时,确认探测现场是否有人工监视;当确认探测现场有人工监视时,使所述自主机器人开始或继续执行边界探测;当确认探测现场无人工监视,使所述自主机器人停机;当所述待探测边界段为非危险边界段时,使所述自主机器人开始或继续执行边界探测;所述确认探测现场是否有人工监视,包括以下中的任意一种或多种:基于所述自主机器人的无线通信模块是否接收到指定设备发射的无线信号,确认探测现场是否有人工监视;所述指定设备由现场监视人员携带;基于所述自主机器人的成像探测器是否在指定探测半径内探测到人体信号,确认探测现场是否有人工监视;所述根据所述实测边界修正所述初始边界,包括:用所述实测边界替换所述初始边界。
- 如权利要求1所述的自主机器人的地图创建方法,其特征在于,所述获取从目标数据库中选定的初始地图,包括:接收第三方发送的初始地图;所述初始地图由所述第三方从地块数据库选取得到。
- 如权利要求1所述的自主机器人的地图创建方法,其特征在于,所述获取从目标数据库中选定的初始地图,包括:接收第三方发送的初始地图;所述初始地图由所述第三方从电子地图上选取得到。
- 一种自主机器人的地图创建装置,其特征在于,包括:初始边界获取模块,用于获取从目标数据库中选定的初始地图,所述初始地图包括初始边界;实测边界获取模块,用于使自主机器人执行边界探测,以获取实测边界;所述自主机器人的定位精度满足预设条件;初始边界修正模块,用于根据所述实测边界修正所述初始边界。
- 如权利要求4所述的自主机器人的地图创建装置,其特征在于,所述获取从目标数据库中选定的初始地图,包括:向服务器发送携带地块标识信息的地块选择请求;接收携带地块地图的地块选择响应,并将所述地块地图作为初始地图;所述地块地图由所述服务器根据所述地块标识信息从地块数据库中匹配得到。
- 如权利要求5所述的自主机器人的地图创建装置,其特征在于,所述地块标识信息包括以下中的任意一种:地块标识符;与所述地块对应的位置地址。
- 如权利要求4所述的自主机器人的地图创建装置,其特征在于,所述获取从目标数据库中选定的初始地图,包括:接收第三方发送的初始地图;所述初始地图由所述第三方从地块数据库选取得到。
- 如权利要求4所述的自主机器人的地图创建装置,其特征在于,所述获取从目标数据库中选定的初始地图,包括:接收第三方发送的初始地图;所述初始地图由所述第三方从电子地图上选取得到。
- 如权利要求4所述的自主机器人的地图创建装置,其特征在于,所述使自主机器人执行边界探测,包括:根据所述自主机器人的边界探测装置采集的探测数据,判断是否探测到边界点;当探测到边界点时,记录此时所述自主机器人的航向角及所述自主机器人的定位装置输出的第一坐标;根据所述第一坐标、所述航向角、所述定位装置与所述边界探测装置的距离以及所述探测数据,确定所述边界点的第二坐标;根据各个被探测到的边界点的第二坐标形成实测边界。
- 如权利要求9所述的自主机器人的地图创建装置,其特征在于,所述边界探测装置包括视觉传感器。
- 如权利要求4所述的自主机器人的地图创建装置,其特征在于,所述实测边界获取模块还用于:在所述使自主机器人执行边界探测的过程中,使所述自主机器人与所述初始边界保持预设的安全距离。
- 如权利要求11所述的自主机器人的地图创建装置,其特征在于,所述使所述自主机器人与所述初始边界保持预设的安全距离,包括:使所述自主机器人与所述初始边界中的危险边界段保持第一安全距离。
- 如权利要求12所述的自主机器人的地图创建装置,其特征在于,所述使所述自主机器人与所述初始边界保持预设的安全距离,还包括:使所述自主机器人与所述初始边界中的非危险边界段保持第二安全距离,所述第二安全距离小于所述第一安全距离。
- 如权利要求11所述的自主机器人的地图创建装置,其特征在于,所述实测边界获取模块还用于:在所述使自主机器人执行边界探测的过程中,确认待探测边界段是否为危险边界段;当所述待探测边界段为危险边界段时,确认探测现场是否有人工监视;当确认探测现场有人工监视时,使所述自主机器人开始或继续执行边界探测。
- 如权利要求14所述的自主机器人的地图创建装置,其特征在于,所述实测边界获取模块还用于:当确认探测现场无人工监视,使所述自主机器人停机。
- 如权利要求14所述的自主机器人的地图创建装置,其特征在于,所述实测边界获取模块还用于:当所述待探测边界段为非危险边界段时,使所述自主机器人开始或继续执行边界探测。
- 如权利要求14所述的自主机器人的地图创建装置,其特征在于,所述确认探测现场是否有人工监视,包括以下中的任意一种或多种:基于所述自主机器人的无线通信模块是否接收到指定设备发射的无线信号,确认探测现场是否有人工监视;所述指定设备由现场监视人员携带;基于所述自主机器人的成像探测器是否在指定探测半径内探测到人体信号,确认探测现场是否有人工监视。
- 如权利要求4所述的自主机器人的地图创建装置,其特征在于,所述根据所述实测边界修正所述初始边界,包括:用所述实测边界替换所述初始边界。
- 一种自主机器人,其特征在于,所述自主机器人配置有权利要求4-18任意一项所述的地图创建装置。
- 一种计算机存储介质,其上存储有计算机程序,其特征在于,所述计算机程序被处理器执行时实现权利要求1-15任意一项所述的地图创建方法。
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/790,695 US20240192704A1 (en) | 2020-01-02 | 2020-11-11 | Map creating method and apparatus for autonomous robot, device, and storage medium |
EP20908702.2A EP4086722A4 (en) | 2020-01-02 | 2020-11-11 | METHOD AND APPARATUS FOR CREATING CARD FOR AUTONOMOUS ROBOT, DEVICE AND INFORMATION MEDIUM |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010002699.0A CN113138593A (zh) | 2020-01-02 | 2020-01-02 | 自主机器人的地图创建方法、装置、设备及存储介质 |
CN202010002699.0 | 2020-01-02 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021135658A1 true WO2021135658A1 (zh) | 2021-07-08 |
Family
ID=76686418
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/128027 WO2021135658A1 (zh) | 2020-01-02 | 2020-11-11 | 自主机器人的地图创建方法、装置、设备及存储介质 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20240192704A1 (zh) |
EP (1) | EP4086722A4 (zh) |
CN (1) | CN113138593A (zh) |
WO (1) | WO2021135658A1 (zh) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104345734A (zh) * | 2013-08-07 | 2015-02-11 | 苏州宝时得电动工具有限公司 | 自动工作系统、自动行走设备及其控制方法 |
CN105843228A (zh) * | 2016-04-13 | 2016-08-10 | 上海物景智能科技有限公司 | 一种清洁机器人的地图共享方法及系统 |
CN106530946A (zh) * | 2016-11-30 | 2017-03-22 | 北京贝虎机器人技术有限公司 | 用于编辑室内地图的方法及装置 |
EP3167700A1 (de) | 2015-11-13 | 2017-05-17 | Robert Bosch Gmbh | Autonomes arbeitsgerät |
CN107239074A (zh) * | 2016-03-29 | 2017-10-10 | 苏州宝时得电动工具有限公司 | 自动工作系统及其工作区域的地图建立方法 |
CN108908331A (zh) * | 2018-07-13 | 2018-11-30 | 哈尔滨工业大学(深圳) | 超冗余柔性机器人的避障方法及系统、计算机存储介质 |
EP3474107A1 (en) | 2017-10-18 | 2019-04-24 | Kubota Corporation | Work area determination system for autonomous traveling work vehicle, the autonomous traveling work vehicle and work area determination program |
WO2019081135A1 (de) * | 2017-10-24 | 2019-05-02 | Robert Bosch Gmbh | Überwachungsvorrichtung, industrieanlage, verfahren zur überwachung sowie computerprogramm |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
IL113913A (en) * | 1995-05-30 | 2000-02-29 | Friendly Machines Ltd | Navigation method and system |
JP2007256149A (ja) * | 2006-03-24 | 2007-10-04 | Clarion Co Ltd | ナビゲーションシステム、及び、地図表示方法 |
JP6132659B2 (ja) * | 2013-02-27 | 2017-05-24 | シャープ株式会社 | 周囲環境認識装置、それを用いた自律移動システムおよび周囲環境認識方法 |
EP2848892B1 (en) * | 2013-09-13 | 2017-12-27 | Elektrobit Automotive GmbH | Technique for correcting digitized map data |
GB201419883D0 (en) * | 2014-11-07 | 2014-12-24 | F Robotics Acquisitions Ltd | Domestic robotic system and method |
CN105652864A (zh) * | 2014-11-14 | 2016-06-08 | 科沃斯机器人有限公司 | 自移动机器人构建地图的方法及利用该地图的作业方法 |
CN108106616B (zh) * | 2017-12-13 | 2020-07-28 | 深圳市艾特智能科技有限公司 | 一种自建导航地图的方法、系统及智能设备 |
CN108873912A (zh) * | 2018-08-21 | 2018-11-23 | 深圳乐动机器人有限公司 | 地图管理方法、装置、计算机设备和存储介质 |
CN109344214A (zh) * | 2018-09-07 | 2019-02-15 | 北京云迹科技有限公司 | 地图管理方法和机器人 |
CN109682368B (zh) * | 2018-11-30 | 2021-07-06 | 上海肇观电子科技有限公司 | 机器人及地图构建方法、定位方法、电子设备、存储介质 |
CN110567467A (zh) * | 2019-09-11 | 2019-12-13 | 北京云迹科技有限公司 | 基于多传感器的地图构建方法、装置及存储介质 |
- 2020-01-02 CN CN202010002699.0A patent/CN113138593A/zh active Pending
- 2020-11-11 WO PCT/CN2020/128027 patent/WO2021135658A1/zh unknown
- 2020-11-11 US US17/790,695 patent/US20240192704A1/en active Pending
- 2020-11-11 EP EP20908702.2A patent/EP4086722A4/en active Pending
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104345734A (zh) * | 2013-08-07 | 2015-02-11 | 苏州宝时得电动工具有限公司 | 自动工作系统、自动行走设备及其控制方法 |
EP3167700A1 (de) | 2015-11-13 | 2017-05-17 | Robert Bosch Gmbh | Autonomes arbeitsgerät |
CN107239074A (zh) * | 2016-03-29 | 2017-10-10 | 苏州宝时得电动工具有限公司 | 自动工作系统及其工作区域的地图建立方法 |
CN105843228A (zh) * | 2016-04-13 | 2016-08-10 | 上海物景智能科技有限公司 | 一种清洁机器人的地图共享方法及系统 |
CN106530946A (zh) * | 2016-11-30 | 2017-03-22 | 北京贝虎机器人技术有限公司 | 用于编辑室内地图的方法及装置 |
EP3474107A1 (en) | 2017-10-18 | 2019-04-24 | Kubota Corporation | Work area determination system for autonomous traveling work vehicle, the autonomous traveling work vehicle and work area determination program |
WO2019081135A1 (de) * | 2017-10-24 | 2019-05-02 | Robert Bosch Gmbh | Überwachungsvorrichtung, industrieanlage, verfahren zur überwachung sowie computerprogramm |
CN108908331A (zh) * | 2018-07-13 | 2018-11-30 | 哈尔滨工业大学(深圳) | 超冗余柔性机器人的避障方法及系统、计算机存储介质 |
Non-Patent Citations (1)
Title |
---|
See also references of EP4086722A4 |
Also Published As
Publication number | Publication date |
---|---|
US20240192704A1 (en) | 2024-06-13 |
EP4086722A1 (en) | 2022-11-09 |
CN113138593A (zh) | 2021-07-20 |
EP4086722A4 (en) | 2023-09-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AU2022203756B2 (en) | Unmanned aircraft structure evaluation system and method | |
US11943679B2 (en) | Mobile device navigation system | |
CN113296495B (zh) | 自移动设备的路径形成方法、装置和自动工作系统 | |
US10386260B2 (en) | Leak detection for fluid distribution networks using hyperspectral imaging | |
EP3186685B1 (en) | Three-dimensional elevation modeling for use in operating agricultural vehicles | |
US10262437B1 (en) | Decentralized position and navigation method, device, and system leveraging augmented reality, computer vision, machine learning, and distributed ledger technologies | |
US10210411B2 (en) | Method and apparatus for establishing feature prediction accuracy | |
US20170102467A1 (en) | Systems, methods, and apparatus for tracking an object | |
US20170330032A1 (en) | Building footprint extraction apparatus, method and computer program product | |
WO2018204552A1 (en) | Gps offset calibration for uavs | |
US20140324630A1 (en) | System and method for delivering relevant, location-specific gis information to a mobile device | |
CN110895408B (zh) | 一种自主定位方法、装置及移动机器人 | |
US12106391B2 (en) | Property measurement with automated document production | |
TWI742268B (zh) | 用於映射位置偵測至圖形表示之方法、裝置及系統 | |
WO2021135658A1 (zh) | 自主机器人的地图创建方法、装置、设备及存储介质 | |
US20230408479A1 (en) | Systems and methods for enhancing water safety using sensor and unmanned vehicle technologies | |
EP4242585A2 (en) | Surveying assistance system, information display terminal, surveying assistance method, and surveying assistance program | |
TW201814246A (zh) | 影像辨識確定座標與導航裝置 | |
WO2022259750A1 (ja) | 森林用の情報処理装置、情報処理システム、及び、情報処理方法 | |
Wang et al. | An eye gaze-aided virtual tape measure for smart construction | |
WO2021074632A2 (en) | Methods and systems for managing infrastructure networks | |
SE2251486A1 (en) | Method and system for defining a lawn care area | |
EP2869093A2 (en) | Detection of incursion of proposed excavation zones into buried assets |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20908702 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
ENP | Entry into the national phase |
Ref document number: 2020908702 Country of ref document: EP Effective date: 20220802 |