WO2021027967A1 - Route determination method, travelable device, and storage medium - Google Patents

Route determination method, travelable device, and storage medium

Info

Publication number
WO2021027967A1
Authority
WO
WIPO (PCT)
Prior art keywords
target point
traveling
area
information
environment
Application number
PCT/CN2020/109619
Other languages
English (en)
French (fr)
Inventor
王游
陈子冲
Original Assignee
纳恩博(常州)科技有限公司
Application filed by 纳恩博(常州)科技有限公司
Publication of WO2021027967A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/10 - Terrestrial scenes
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 - Route searching; Route guidance
    • G01C21/3407 - Route searching; Route guidance specially adapted for specific applications
    • G01C21/343 - Calculating itineraries, i.e. routes leading from a starting point to a series of categorical destinations using a global route restraint, round trips, touristic trips
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 - Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29 - Geographical information databases
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/70 - Determining position or orientation of objects or cameras

Definitions

  • This application relates to travel technology, and in particular to a route determination method, a travelable device, and a computer storage medium.
  • In the related art, a travelable device can travel from its current position to a destination along the route indicated by a map.
  • The map is usually a global map established in the world coordinate system, in which the position of each object is the absolute coordinate of that object on the earth.
  • Establishing a global map requires, at a minimum, sending personnel to locations all over the world to collect scenery, and then systematically entering, editing, and synthesizing the collected information, which consumes considerable manpower and material resources.
  • Such a global map is of greater significance for larger travel equipment such as buses, taxis, and private cars.
  • The embodiments of the present application provide a route determination method, a travelable device, and a computer storage medium, which at least avoid the waste of map resources, the high demands on the software and hardware resources of the travelable device, and the low navigation efficiency caused by using a global map, and by building maps in advance, for route determination.
  • An embodiment of the present application provides a route determination method, the method including:
  • obtaining first information of a target point, the first information at least characterizing the position of the target point in a traveling image, where the traveling image is obtained by capturing the traveling environment of the travelable device;
  • processing the first information of the target point to obtain second information of the target point, where the second information at least characterizes the position of the target point in a local map established based on the traveling environment;
  • obtaining a first actual position of the target point in the traveling environment based on the second information of the target point; and determining, based at least on the first actual position of the target point in the traveling environment, a route for the travelable device to reach the target point.
  • In the above solution, the method further includes: obtaining a second actual position of the travelable device in the traveling environment.
  • Correspondingly, determining the route for the travelable device to reach the target point based at least on the first actual position of the target point in the traveling environment includes:
  • determining the route based on the first actual position of the target point in the traveling environment and the second actual position of the travelable device in the traveling environment.
  • In the above solution, the method further includes: sending the traveling image, obtained by capturing the traveling environment of the travelable device, to a remote server; correspondingly, obtaining the first information of the target point includes: receiving the first information of the target point from the remote server.
  • In the above solution, the traveling image displays at least a first area, the first area being characterized as an area of the traveling environment in which the travelable device can capture the traveling image; the target point is located within the first area.
  • In the above solution, the traveling image displays at least a second area, the second area being characterized as an area within the first area whose road conditions meet a predetermined condition; the target point is located within the second area.
  • An embodiment of the present application provides a travelable device, including:
  • a first obtaining unit, configured to obtain first information of a target point, the first information at least characterizing the position of the target point in a traveling image of the travelable device, where the traveling image is obtained by capturing the traveling environment of the travelable device;
  • a processing unit, configured to process the first information of the target point to obtain second information of the target point, where the second information at least characterizes the position of the target point in a local map established based on the traveling environment;
  • a second obtaining unit, configured to obtain a first actual position of the target point in the traveling environment based on the second information of the target point; and
  • a determining unit, configured to determine, based at least on the first actual position of the target point in the traveling environment, a route for the travelable device to reach the target point.
  • In the above solution, the second obtaining unit is further configured to obtain a second actual position of the travelable device in the traveling environment;
  • correspondingly, the determining unit is configured to determine the route based on the first actual position of the target point in the traveling environment and the second actual position of the travelable device in the traveling environment.
  • In the above solution, the device further includes a sending unit, configured to send the traveling image, obtained by capturing the traveling environment of the travelable device, to a remote server;
  • correspondingly, the first obtaining unit is configured to receive the first information of the target point from the remote server.
  • In the above solution, the traveling image displays at least a first area, the first area being characterized as an area of the traveling environment in which the travelable device can capture the traveling image; the target point is located within the first area.
  • In the above solution, the traveling image displays at least a second area, the second area being characterized as an area within the first area whose road conditions meet a predetermined condition; the target point is located within the second area.
  • An embodiment of the present application provides a computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, the steps of the foregoing method are implemented.
  • An embodiment of the present application provides a travelable device including a memory, a processor, and a computer program stored in the memory and executable on the processor; the processor implements the steps of the foregoing method when executing the program.
  • The route determination method, travelable device, and computer storage medium of the embodiments of the present application include: obtaining first information of a target point, the first information at least characterizing the position of the target point in a traveling image, where the traveling image is obtained by capturing the traveling environment of the travelable device; processing the first information of the target point to obtain second information of the target point, the second information at least characterizing the position of the target point in a local map established based on the traveling environment; obtaining a first actual position of the target point in the traveling environment based on the second information of the target point; and determining, based at least on the first actual position of the target point in the traveling environment, a route for the travelable device to reach the target point.
  • The embodiments of the present application can at least avoid the waste of map resources and the high demands on the software and hardware resources of the travelable device caused by using a global map for route determination.
  • The problem of low navigation efficiency of non-global positioning schemes in the related art can also be avoided.
  • FIG. 1 is a schematic diagram of the implementation flow of the first embodiment of the route determination method provided by this application;
  • FIG. 2 is a schematic diagram of the implementation flow of the second embodiment of the route determination method provided by this application;
  • FIG. 3 is a schematic diagram of a target point and a robot in a traveling image according to an embodiment of this application;
  • FIG. 4 is a first schematic diagram of a visible area according to an embodiment of this application;
  • FIG. 5 is a second schematic diagram of a visible area according to an embodiment of this application;
  • FIG. 6 is a third schematic diagram of a visible area according to an embodiment of this application;
  • FIG. 7 is a schematic diagram of the position of a target point in the robot coordinate system according to an embodiment of this application;
  • FIG. 8 is a schematic diagram of the position of a target point in the image coordinate system according to an embodiment of this application;
  • FIG. 9 is a schematic diagram of the composition of an embodiment of the travelable device of this application;
  • FIG. 10 is a schematic diagram of the hardware structure of an embodiment of the travelable device of this application.
  • The embodiments of the present application can at least solve the problem in the related art that using a global map for navigation on travel equipment such as robots, scooters, and balance vehicles wastes map resources and places high demands on the software and hardware resources of the travel equipment, and can also solve the problem of low navigation efficiency of non-global positioning schemes in the related art.
  • The travelable device involved in the embodiments of the present application is any reasonable device capable of traveling, such as a robot, a balance vehicle, a scooter, or a balance wheel.
  • Preferably, it is a robot.
  • This application provides a first embodiment of a route determination method, applied to a travelable device. As shown in FIG. 1, the method includes:
  • Step 101: Obtain first information of a target point, the first information at least characterizing the position of the target point in a traveling image, where the traveling image is obtained by capturing the traveling environment of the travelable device;
  • Step 102: Process the first information of the target point to obtain second information of the target point, where the second information at least characterizes the position of the target point in a local map established based on the traveling environment;
  • Step 103: Obtain a first actual position of the target point in the traveling environment based on the second information of the target point;
  • Step 104: Based at least on the first actual position of the target point in the traveling environment, determine a route for the travelable device to reach the target point.
  • The entity that performs steps 101 to 104 is the travelable device.
  • As one implementation, the method further includes Step 105: controlling the travelable device to travel to the target point along the route.
  • In other words, based on the position of the target point in the traveling image, the position of the target point in the local map established from the traveling environment is determined; based on the position of the target point in the local map, the actual position of the target point in the real traveling environment is obtained; and the route between the travelable device and the target point is then determined.
  • This way of determining a route uses the position of the target point in the traveling image and in the local map to obtain the target point's actual position in the real traveling environment, and from that determines the route from the travelable device to the target point in the real environment. It can be understood that, compared with the environment represented by a global map, the traveling environment in which the travelable device is located is a local environment; the map established from this local environment is a local map, and the local map is built from local features, specifically visual features of the traveling environment such as image features, so the scheme locates the target point on the basis of a local map.
  • This scheme of locating the target point on the basis of a local map can at least avoid the waste of map resources and the high demands on the software and hardware resources of the travelable device caused by using a global map for route determination.
  • Locating the target point in the real environment from local features is also easy to implement in engineering: there is no need to build maps or mark points in advance, and navigation efficiency is high.
  • The local positioning scheme of the embodiments of the present application is simple and easy to implement, and is better suited to small travel equipment such as robots, balance vehicles, scooters, and balance wheels.
  • This application provides a second embodiment of a route determination method, applied to a travelable device. As shown in FIG. 2, the method includes:
  • Step 201: Obtain first information of a target point, the first information at least characterizing the position of the target point in a traveling image, where the traveling image is obtained by capturing the traveling environment of the travelable device;
  • Step 202: Process the first information of the target point to obtain second information of the target point, where the second information at least characterizes the position of the target point in a local map established based on the traveling environment;
  • Step 203: Obtain a first actual position of the target point in the traveling environment based on the second information of the target point;
  • Step 204: Obtain a second actual position of the travelable device in the traveling environment;
  • Step 205: Based on the first actual position of the target point in the traveling environment and the second actual position of the travelable device in the traveling environment, determine a route for the travelable device to reach the target point.
  • The entity that performs steps 201 to 205 is the travelable device.
  • As one implementation, the method further includes Step 206: controlling the travelable device to travel to the target point along the route.
  • There is no strict order between steps 202/203 and step 204; they can also be performed simultaneously (a minimal sketch of this follows below).
  • Compared with a scheme that uses a global map for route determination, this positioning scheme is simpler, easier to implement in engineering, and better suited to small travel equipment such as robots, balance vehicles, scooters, and balance wheels.
  • It avoids the waste of map resources and the high demands on the software and hardware resources of the travelable device caused by using a global map for route determination; no maps or points need to be built in advance, and navigation efficiency is high.
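  • Because steps 202/203 and step 204 are independent of each other, they may be computed concurrently. The following is a minimal sketch of that ordering point only; the helper functions locate_target_in_environment and locate_device_in_environment, and the placeholder arithmetic inside them, are assumptions for illustration and not an implementation from this application.

```python
# Hypothetical sketch: steps 202/203 (target position) and step 204 (device
# position) have no strict order, so they may be computed concurrently.
from concurrent.futures import ThreadPoolExecutor

def locate_target_in_environment(first_info):
    """Steps 202-203: first info -> local-map position -> actual target position."""
    # Placeholder arithmetic standing in for the map lookup described in the text.
    u, v = first_info
    return (u / 720.0, v / 1280.0)

def locate_device_in_environment():
    """Step 204: the device's own actual position in the traveling environment."""
    return (0.0, 0.0)  # e.g. the device is the origin of its local frame

with ThreadPoolExecutor(max_workers=2) as pool:
    target_future = pool.submit(locate_target_in_environment, (700, 900))
    device_future = pool.submit(locate_device_in_environment)
    first_actual = target_future.result()   # target point in the environment
    second_actual = device_future.result()  # travelable device in the environment

# Step 205 would then plan the route from second_actual to first_actual.
```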
  • In the foregoing schemes, the first information of the target point can be obtained in two ways: the travelable device can itself select the target point in the captured traveling image, or it can send the captured traveling image to a peer device such as a remote server (referred to simply as the server).
  • In the latter case, the server selects the target point in the traveling image and feeds the selection result back to the travelable device; that is, the travelable device obtains the position of the target point in the traveling image by receiving the first information of the target point.
  • In plain terms, in the embodiments of this application the travelable device can complete the route determination between itself and the target point on its own, or it can do so through interaction with another device such as a server.
  • In the interactive scheme, the server at least performs the selection of the target point from the traveling image.
  • This arrangement, in which the target point is selected by the server while the route between the target point and the travelable device is determined by the travelable device, is a new type of route determination through interaction, which makes the route determination scheme between the travelable device and the server more novel.
  • In the foregoing embodiments, the traveling image displays at least a first area, the first area being characterized as an area of the traveling environment in which the travelable device can capture the traveling image; the target point is located within the first area. Further, the traveling image displays at least a second area, the second area being characterized as an area within the first area whose road conditions meet a predetermined condition; the target point is located within the second area.
  • It can be understood that, in the embodiments of the present application, the area of the traveling environment in which the robot can capture the traveling image is called the visible area. To facilitate selection of the target point, the visible area is displayed in the traveling image so that the target point is selected from within it, which ensures the accuracy of target point selection; that is, the selected target point can be any point within the visible area.
  • In practice, the visible area includes areas with good road conditions and areas with poor road conditions.
  • The road conditions within the visible area can be analyzed from the traveling image to determine which parts have good road conditions and which have poor road conditions.
  • These two kinds of areas are represented separately in the traveling image, and the selected target point may lie in an area with good road conditions or in an area with poor road conditions, which can be set flexibly according to the specific situation.
  • The area whose road conditions meet the predetermined condition may be the part of the visible area with good road conditions, or the part with poor road conditions.
  • In that case, step 105 or 206 becomes: according to the road conditions of the part of the visible area in which the target point lies, control the travelable device to travel to the target point along the route, as sketched below. For example, if the target point is in a part of the visible area with good road conditions, the travelable device can be accelerated appropriately so as to reach the destination (the target point) as soon as possible; if the target point is in a part with poor road conditions, the device can slow down appropriately to ensure driving safety.
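  • A minimal sketch of that speed policy follows. The speed values and the RoadCondition categories are assumptions chosen for illustration, not values given in this application.

```python
# Hypothetical sketch of step 105/206: pick a travel speed from the road
# condition of the visible-area region that contains the target point.
from enum import Enum

class RoadCondition(Enum):
    GOOD = "good"   # second area: road conditions meet the predetermined condition
    POOR = "poor"   # remainder of the visible area

def travel_speed(condition: RoadCondition) -> float:
    """Return a speed in m/s: faster on good roads, slower on poor ones."""
    return 1.5 if condition is RoadCondition.GOOD else 0.5

def follow_route(route, condition: RoadCondition) -> None:
    """Drive along the planned route at a speed chosen from the road condition."""
    speed = travel_speed(condition)
    for waypoint in route:
        print(f"moving to {waypoint} at {speed} m/s")  # stand-in for motor commands

follow_route(route=[(0.0, 0.0), (3.0, 4.0)], condition=RoadCondition.GOOD)
```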
  • In FIGS. 3 to 8, taking the travelable device being a robot as an example, the route between the robot and the target point is determined through interaction between the robot and the server.
  • Those skilled in the art will understand that if, as in the related art, the robot used a global map for route determination, the global map would usually need to be established in advance, which requires a large amount of mapping work and occupies considerable computing resources; this is unsuitable for relatively small travel equipment such as robots, balance vehicles, and balance wheels.
  • The navigation schemes based on non-global positioning in the related art also need to build maps in advance, have low navigation efficiency, and are difficult to implement.
  • In the related art, the server can also operate the robot by remote control, but this control method is inefficient.
  • Alternatively, the robot can navigate autonomously without being controlled by the server, but such navigation may often run into situations it cannot handle.
  • The following schemes of the embodiments of the present application can at least solve the above problems in the related art.
  • A collection device for capturing the robot's traveling environment is set in advance at a fixed position on the robot, for example above its head.
  • The collection device can be any type of camera, such as a fisheye camera, a depth camera, or a vision camera.
  • While the robot travels, it can capture the traveling environment in real time through the collection device, for example a fisheye camera.
  • The traveling environment captured by the collection device is the environment within the device's collection angle, which corresponds to the traveling environment within the robot's visible area.
  • From the captured traveling environment within its visible area, the robot itself can obtain information such as the direction of and distance to each object in the visible area relative to the robot, and can thereby build a map as shown in FIG. 7.
  • Because this map is built by the robot from its traveling environment, and the robot's traveling environment is a local environment, the map established from the local environment is called the local map, and the coordinate system in which this local map lies is the robot coordinate system.
  • The local map may represent, after a certain scaling (for example, a reduction), the actual positional relationship between the robot and each object in the real traveling environment in which the robot is currently located, expressing the distance and direction between the robot and each object within the visible area as two-dimensional coordinates.
  • In addition, the robot needs to establish a physical coordinate system for its current traveling environment and express, in that coordinate system, the distance and direction between the travelable device and each object within the visible area of the real current traveling environment.
  • It can be understood that the physical coordinate system of the embodiments of the present application represents the absolute coordinates of each object within the visible area of the robot's current traveling environment. Unlike the global map of the related art, it is established from the current traveling environment; since the current traveling environment is a local environment, the representation based on this physical coordinate system can also be regarded as a local map.
  • The distance and direction between the robot and each object in the current traveling environment in this physical coordinate system have a definite mapping relationship with the distance and direction between the robot and the corresponding object in the map shown in FIG. 7. For example, if object A is 10 m due north of the robot in the physical coordinate system and the scaling ratio is 1000:1 (10 m in the real environment is represented by 1 cm on the map), then in the local map built by the robot the coordinate point representing object A lies in front of the coordinate point representing the robot, at a distance of 1 cm from it. What is represented in the robot coordinate system is therefore the position of each object in the visible area relative to the robot in the robot's current traveling environment.
  • Accordingly, on the robot side two local maps need to be established. One is the map in the robot coordinate system, which represents the relative positional relationship between the robot and each object within the visible area of the current traveling environment.
  • The other is the map in the physical coordinate system, which represents the position, in the current real environment, of each object (including the robot) within the visible area of the current traveling environment. A small sketch of the scaling relationship between the two maps follows.
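  • The 1000:1 example above can be written as a simple conversion between the physical coordinate system (metres, robot at the origin) and the robot-coordinate-system map (centimetres on the map). This is only an illustrative sketch under those assumptions, not the mapping actually used by the application.

```python
# Hypothetical sketch of the mapping between the two local maps:
# physical frame (metres, robot at origin, +y pointing north) <-> robot-coordinate map (cm).
import math

SCALE = 1000.0  # 1000:1 reduction - 10 m in the real environment becomes 1 cm on the map

def physical_to_map(x_m: float, y_m: float) -> tuple:
    """Scale a physical position (metres) down to robot-map coordinates (cm on the map)."""
    return (x_m * 100.0 / SCALE, y_m * 100.0 / SCALE)

def map_to_physical(x_cm: float, y_cm: float) -> tuple:
    """Scale a robot-map position (cm on the map) back up to metres in the physical frame."""
    return (x_cm * SCALE / 100.0, y_cm * SCALE / 100.0)

# Object A: 10 m due north of the robot.
object_a_physical = (0.0, 10.0)
object_a_on_map = physical_to_map(*object_a_physical)
print(object_a_on_map)            # (0.0, 1.0) -> 1 cm in front of the robot's point
print(map_to_physical(0.0, 1.0))  # (0.0, 10.0) -> back to 10 m north

# Distance check: 1 cm on the map corresponds to 10 m in the environment.
assert math.isclose(math.hypot(*object_a_on_map), 1.0)
```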
  • The robot captures the traveling environment within its visible area through the collection device, for example a fisheye camera, obtains a traveling image, and sends the traveling image to the server.
  • The server receives and displays the traveling image (as shown in FIG. 3).
  • To make it convenient for the operator to select a target point from the traveling image displayed by the server, the server establishes the coordinate system shown in FIG. 8 based on the traveling image, so that the server can know the position, in the traveling image, of the target point selected by the operator.
  • To distinguish it from the robot coordinate system described above, the coordinate system established by the server is called the image coordinate system. It can be understood that the image coordinate system represents, through the traveling image, the positional relationship between the robot and each object within the visible area of the current traveling environment (this relationship is likewise a reduction of the actual positional relationship).
  • The operator can observe the robot's traveling environment through the traveling image shown in FIG. 3, and select a target point in the traveling image as the destination to which the robot is to travel.
  • It can be understood that the traveling image received by the server is an image of the robot's visible area.
  • The extent of the visible area is usually affected by the mounting position, height, and viewing angle of the collection device on the robot.
  • The visible area is also affected by the motion characteristics of the robot itself. Specifically, within the traveling environment, the robot's motion characteristics place certain restrictions on some of its movements, or at least on the angle or direction of some of its movements.
  • For example, the angles through which the robot can turn left or right are limited: it may be unable to turn left or right by 90° or 85°, while a left turn of 30° or 45° is not restricted. Because of these motion characteristics, there are positions in the traveling environment that the robot cannot reach, such as a position requiring a 90° left turn, and positions that it can reach, such as positions reached by a 30° or 45° left turn. Therefore, to show the operator clearly which positions the robot can and cannot reach in the traveling image it is currently capturing, the robot indicates the positions it can reach in the traveling image while sending the image to the server, so that the operator can accurately select a reachable point as the target point.
  • The inventors have found that, because of the robot's motion characteristics, the positions it can reach usually appear as a fan-shaped area, or a deformed fan-shaped area, as shown in FIGS. 4 and 5.
  • This area is the robot's visible area in the embodiments of the present application, that is, the area in which the robot can capture the traveling image.
  • While displaying the traveling image, the server marks the robot's current visible area in the image to prompt the operator to select a target point within the visible area; a sketch of such a reachability check follows.
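  • One simple way to describe and check such a fan-shaped reachable area is by a maximum range and a pair of heading limits. The sketch below assumes an 8 m range and ±60° heading limits purely for illustration; the real extent would come from the camera mounting and the device's motion characteristics described above.

```python
# Hypothetical sketch: test whether a candidate target point lies inside a
# fan-shaped reachable/visible area in front of the robot (robot at the origin,
# x axis pointing straight ahead).
import math

MAX_RANGE_M = 8.0        # assumed depth of the visible area
MAX_BEARING_DEG = 60.0   # assumed turning limit on either side of straight ahead

def in_visible_fan(x_m: float, y_m: float) -> bool:
    """Return True if (x_m, y_m) is inside the fan-shaped area the robot can reach."""
    distance = math.hypot(x_m, y_m)
    if distance == 0.0 or distance > MAX_RANGE_M:
        return False
    bearing_deg = math.degrees(math.atan2(y_m, x_m))  # 0 degrees = straight ahead
    return abs(bearing_deg) <= MAX_BEARING_DEG

print(in_visible_fan(3.0, 1.0))    # True: slightly off-centre, within range
print(in_visible_fan(0.0, 5.0))    # False: 90 degrees to the left, outside the turn limit
print(in_visible_fan(12.0, 0.0))   # False: straight ahead but beyond the range
```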
  • In the traveling image shown in FIG. 3, the operator uses the mouse or a finger to produce a mouse or gesture operation at the position of object B, selecting the position of object B as the target point and thereby expressing the wish that the robot reach the position of object B.
  • It can be understood that the position of object B selected by the operator through the traveling image is the position of object B in the image coordinate system; to learn the position of object B in the real traveling environment, at least a coordinate transformation of object B is still needed.
  • In the embodiments of the present application, the traveling image and the map built by the robot both represent the positional relationship between the robot and each object in the traveling environment, but the two use different coordinate systems.
  • Therefore the position coordinates of object B in the image coordinate system must at least be converted into the coordinates of object B in the robot coordinate system, after which the coordinates in the robot coordinate system are mapped into the physical coordinate system that the embodiments of the present application establish from the real traveling environment in which the robot is currently located, giving object B's actual position in the real traveling environment.
  • Further, taking a server display with 720p resolution and a 16:9 aspect ratio as an example (720p corresponding to 720 × 1280 here, as shown in FIG. 8, with a maximum of 720 on the x axis and 1280 on the y axis of the image coordinate system), the server detects the coordinates of the target point the operator selects on the display screen.
  • For example, it detects that the position of object B in the traveling image has screen coordinates (700, 900).
  • The coordinates (700, 900) are expressed in the image coordinate system.
  • The server converts the selected target point's coordinates from the image coordinate system into the robot coordinate system;
  • in the robot coordinate system the target point's coordinates are (700/720, 900/1280). The server sends this result to the robot, and the robot represents it in the robot coordinate system as shown in FIG. 7.
  • Alternatively, the server sends the selected target point's coordinates in the image coordinate system to the robot, and the robot, knowing the display resolution of the server-side screen (720 × 1280), performs the same conversion to obtain the target point's coordinates in the robot coordinate system,
  • namely (700/720, 900/1280).
  • The robot then maps the target point's coordinates in the robot coordinate system into the physical coordinate system of the embodiments of the present application, obtaining the position of object B in the real traveling environment.
  • The robot knows its own position in the real traveling environment, plans a route from its current position to the position of object B, and travels to the destination by autonomous navigation. Assuming that, in the real traveling environment represented by the physical coordinate system, object B is located 500 m to the northeast of the robot, the robot plans the route between itself and the target point and proceeds along it, for example along the route with the shortest straight-line distance between the robot and the target point.
  • The process of mapping the target point from the robot coordinate system into the physical coordinate system follows the mapping relationship described above and is not detailed again; a worked sketch of the whole conversion is given below.
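  • The worked numbers above can be chained together as follows. The ground-plane mapping inside robot_to_physical and the resulting placement of object B are assumptions made only to complete the example; the text states that such a mapping exists but not its exact form, and a real implementation would also account for the camera model and its mounting.

```python
# Hypothetical end-to-end sketch of the worked example: screen pixel -> robot
# coordinate system -> physical coordinate system -> straight-line route.
import math

DISPLAY_W, DISPLAY_H = 720, 1280          # server display, 720 x 1280 pixels

def image_to_robot(u: int, v: int) -> tuple:
    """Normalize screen coordinates into the robot coordinate system."""
    return (u / DISPLAY_W, v / DISPLAY_H)  # (700, 900) -> (700/720, 900/1280)

def robot_to_physical(nx: float, ny: float) -> tuple:
    """Map normalized robot-map coordinates onto the ground plane (metres).

    Illustrative assumption: the mapped area spans 1000 m x 1000 m with the
    robot at the origin and north along +y.
    """
    return ((nx - 0.5) * 1000.0, (1.0 - ny) * 1000.0)

def straight_line_route(start: tuple, goal: tuple) -> dict:
    """Plan the shortest straight-line route and report its length and bearing."""
    dx, dy = goal[0] - start[0], goal[1] - start[1]
    return {
        "waypoints": [start, goal],
        "distance_m": math.hypot(dx, dy),
        "bearing_deg": math.degrees(math.atan2(dx, dy)),  # 0 = north, 90 = east
    }

robot_coords = image_to_robot(700, 900)            # (0.972..., 0.703...)
object_b = robot_to_physical(*robot_coords)        # a point to the north-east of the robot
route = straight_line_route((0.0, 0.0), object_b)
print(robot_coords, object_b, route)
```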
  • Based on the captured traveling images, the robot can also analyze the road conditions within the current visible area, distinguish the parts of the visible area with good road conditions from those with poor road conditions, and mark them in the traveling image. As shown in FIG. 6, areas with good road conditions are marked in grey, and the remainder of the visible area is the area with poor road conditions. This prompts the operator that the target point can be selected in an area with good road conditions or in an area with poor road conditions, preferably in an area with good road conditions.
  • The advantage of this is that, on the one hand, the robot's safety on the way to the destination is ensured and there will be no bumps or jolts.
  • On the other hand, while the robot subsequently drives to the destination, if the target point lies in a part of the passable area with good road conditions, the robot can be controlled to accelerate so as to reach the destination as soon as possible; if the target point lies in a part of the passable area with poor road conditions, it can drive at an appropriately reduced speed to ensure safe driving.
  • In the foregoing scheme, the map built by the robot is established from its current traveling environment.
  • Relative to the global environment of a global map, the map built by the robot is established from local features, more specifically from image features of the local environment, and is a local map.
  • Compared with building a global map, the embodiments of the present application can build the map for the traveling environment as the robot advances: nothing needs to be built in advance, the map can be built in real time while traveling, and the map covers only the local area.
  • A map built for the local environment is a local map; the scheme is simple, easier to implement in engineering, and better suited to travel equipment such as robots and balance vehicles.
  • Here, the operator selects the navigation destination from the real-time traveling image captured by the robot and received by the server, which is a scheme in which the server selects the destination based on the image.
  • From the position, in the image coordinate system, of the target point the operator selects in the traveling image, the position of the target point in the real traveling environment is finally obtained. This avoids the waste of map resources caused by using a global map for route determination, the high demands
  • on the software and hardware resources of the travelable device, and the large amount of work required to build maps and points in advance, and it also achieves efficient navigation.
  • In the foregoing scheme, the destination (target point) the robot needs to reach does not have to be selected by the robot; it is specified by the server,
  • specifically by selection from the traveling image captured by the robot.
  • Compared with the related-art scheme in which the server must continuously remote-control the robot to the destination, this scheme in which the server selects the destination from the image frees the operator to a certain extent: the operator only needs to select the target point through the server and does not need to control the robot the whole time, which can greatly improve the user experience.
  • This scheme of destination designation by the server also allows the same operator to select destinations for different robots at different times, making it convenient for one operator to control multiple robots.
  • Here, the operator selects the target point from within the passable area of the travelable device, which further ensures the accuracy of target point selection and thus provides a certain guarantee for the accuracy of route determination.
  • In the scheme in which the server specifies the destination, the travelable device can navigate autonomously and travel to the destination.
  • If, during travel, a situation arises that the robot cannot handle, it can also automatically send a notification message to the server to prompt the operator to resolve it.
  • In addition, the robot can detect obstacles on the planned travel route and avoid them automatically, which reflects the functional diversity of the robot; a minimal sketch of this behaviour follows. It can be understood that the scheme of the embodiments of the present application does not navigate on a global map but on a local map, and the navigation destination can be selected by clicking on the traveling image transmitted by the robot in real time.
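  • The following sketch only illustrates that obstacle-and-notification behaviour under stated assumptions: the stop distance, the obstacle_distance_m reading, and the notify_server helper are all hypothetical stand-ins, not parts of the application.

```python
# Hypothetical sketch: while following the route, avoid detected obstacles and
# notify the server when a situation cannot be handled autonomously.
STOP_DISTANCE_M = 0.5      # assumed: closer than this, the robot must react

def notify_server(message: str) -> None:
    """Stand-in for sending a notification message to the remote server."""
    print(f"[to server] {message}")

def step_along_route(obstacle_distance_m: float, can_replan: bool) -> str:
    """Decide the next action from the nearest obstacle distance on the route."""
    if obstacle_distance_m > STOP_DISTANCE_M:
        return "continue"                     # path ahead is clear enough
    if can_replan:
        return "detour"                       # avoid the obstacle automatically
    notify_server("obstacle cannot be handled autonomously, operator input needed")
    return "wait"

print(step_along_route(2.0, can_replan=True))    # continue
print(step_along_route(0.3, can_replan=True))    # detour
print(step_along_route(0.3, can_replan=False))   # wait (server notified)
```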
  • The embodiments of the present application do not need to establish a map in advance; it is sufficient to build the local map and capture traveling images from the real-time traveling environment.
  • This application also provides an embodiment of a travelable device. As shown in FIG. 9, the device includes a first obtaining unit 601, a processing unit 602, a second obtaining unit 603, and a determining unit 604, where:
  • the first obtaining unit 601 is configured to obtain first information of a target point, the first information at least characterizing the position of the target point in a traveling image, where the traveling image is obtained by capturing the traveling environment of the travelable device;
  • the processing unit 602 is configured to process the first information of the target point to obtain second information of the target point, where the second information at least characterizes the position of the target point in a local map established based on the traveling environment;
  • the second obtaining unit 603 is configured to obtain a first actual position of the target point in the traveling environment based on the second information of the target point;
  • the determining unit 604 is configured to determine, based at least on the first actual position of the target point in the traveling environment, a route for the travelable device to reach the target point.
  • As one implementation, the second obtaining unit 603 is further configured to obtain a second actual position of the travelable device in the traveling environment;
  • correspondingly, the determining unit 604 is configured to determine the route based on the first actual position of the target point in the traveling environment and the second actual position of the travelable device in the traveling environment.
  • As one implementation, the device further includes a sending unit configured to send the traveling image, obtained by capturing the traveling environment of the travelable device, to a remote server;
  • correspondingly, the first obtaining unit 601 is configured to receive the first information of the target point from the remote server.
  • As one implementation, the traveling image displays at least a first area, the first area being characterized as an area of the traveling environment in which the travelable device can capture the traveling image; the target point is located within the first area. Further, the traveling image displays at least a second area, the second area being characterized as an area within the first area whose road conditions meet a predetermined condition; the target point is located within the second area.
  • As one implementation, the device further includes a traveling unit configured to control the travelable device to travel to the target point along the route.
  • The travelable device provided in the foregoing embodiment and the foregoing embodiments of the route determination method belong to the same concept; see the method embodiments for the specific implementation process, which is not repeated here.
  • The first obtaining unit 601, processing unit 602, second obtaining unit 603, and determining unit 604 described above can all be implemented by a digital signal processor (DSP), a central processing unit (CPU), a field-programmable gate array (FPGA), a micro-controller unit (MCU), or the like.
  • An embodiment of the present application also provides a computer-readable storage medium on which a computer program is stored, which is characterized in that, when the program is executed by a processor, it is used to perform at least the steps of any one of the methods shown in FIGS. 1 to 8.
  • the computer-readable storage medium may specifically be a memory.
  • the memory may be the memory 72 shown in FIG. 10.
  • FIG. 10 is a schematic diagram of the hardware structure of a travelable device according to an embodiment of the application.
  • As shown in FIG. 10, the travelable device includes: a communication component 73 for data transmission, at least one processor 71, and a memory 72 for storing computer programs that can run on the processor 71.
  • the various components in the terminal are coupled together through the bus system 74.
  • the bus system 74 is used to implement connection and communication between these components.
  • the bus system 74 also includes a power bus, a control bus, and a status signal bus. However, for the sake of clarity, various buses are marked as the bus system 74 in FIG. 10.
  • the processor 71 executes at least the steps of any one of the methods shown in FIGS. 1 to 8 when executing the computer program.
  • the memory 72 may be a volatile memory or a non-volatile memory, and may also include both volatile and non-volatile memory.
  • The non-volatile memory can be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), a ferromagnetic random access memory (FRAM), a flash memory, a magnetic surface memory, an optical disc, or a compact disc read-only memory (CD-ROM); the magnetic surface memory can be a magnetic disk memory or a magnetic tape memory.
  • the volatile memory may be random access memory (RAM, Random Access Memory), which is used as an external cache.
  • By way of illustration and not limitation, many forms of RAM are available, such as static random access memory (SRAM), synchronous static random access memory (SSRAM), dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), enhanced synchronous dynamic random access memory (ESDRAM), SyncLink dynamic random access memory (SLDRAM), and direct Rambus random access memory (DRRAM).
  • the memory 72 described in the embodiment of the present application is intended to include, but is not limited to, these and any other suitable types of memory.
  • the method disclosed in the foregoing embodiment of the present application may be applied to the processor 71 or implemented by the processor 71.
  • the processor 71 may be an integrated circuit chip with signal processing capabilities. In the implementation process, the steps of the above method can be completed by hardware integrated logic circuits in the processor 71 or instructions in the form of software.
  • the aforementioned processor 71 may be a general-purpose processor, a DSP, or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, and the like.
  • the processor 71 may implement or execute various methods, steps, and logical block diagrams disclosed in the embodiments of the present application.
  • the general-purpose processor may be a microprocessor or any conventional processor.
  • the steps of the method disclosed in the embodiments of the present application may be directly embodied as being executed and completed by a hardware decoding processor, or executed by a combination of hardware and software modules in the decoding processor.
  • the software module may be located in a storage medium, and the storage medium is located in the memory 72.
  • the processor 71 reads the information in the memory 72 and completes the steps of the foregoing method in combination with its hardware.
  • The travelable device may be implemented by one or more application-specific integrated circuits (ASICs), DSPs, programmable logic devices (PLDs), complex programmable logic devices (CPLDs), FPGAs, controllers, micro-controller units (MCUs), or microprocessors.
  • It should be noted that the disclosed device and method may be implemented in other ways.
  • The device embodiments described above are merely illustrative.
  • For example, the division into units is only a division by logical function, and there may be other divisions in actual implementation; multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • In addition, the coupling, direct coupling, or communication connection between the components shown or discussed may be indirect coupling or communication connection through some interfaces, devices, or units, and may be electrical, mechanical, or of other forms.
  • The units described above as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solutions of the embodiments.
  • The functional units in the embodiments of the present application may all be integrated into one processing unit, each unit may serve as a unit on its own, or two or more units may be integrated into one unit;
  • the integrated unit may be implemented in the form of hardware, or in the form of hardware plus software functional units.
  • A person of ordinary skill in the art can understand that all or part of the steps of the above method embodiments can be implemented by a program instructing the relevant hardware.
  • The foregoing program can be stored in a computer-readable storage medium, and when executed, the program performs the steps of the foregoing method embodiments; the foregoing storage medium includes various media that can store program code, such as removable storage devices, read-only memory (ROM), random access memory (RAM), magnetic disks, or optical discs.
  • If the above integrated unit of this application is implemented in the form of a software functional module and is sold or used as an independent product, it can also be stored in a computer-readable storage medium.
  • In that case, the computer software product is stored in a storage medium and includes several instructions for enabling a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the methods described in the various embodiments of the present application.
  • The aforementioned storage media include various media that can store program code, such as removable storage devices, ROMs, RAMs, magnetic disks, or optical discs.
  • The route determination scheme of the embodiments of the present application uses the position of the target point in the traveling image and its position in the local map established from the traveling environment to determine the actual position of the target point in the real traveling environment, and thereby determines
  • the route from the travelable device to the target point in that environment; it is a scheme for determining the route from the travelable device to the target point through the position of the target point in the traveling image and in the local map.
  • This scheme of locating the target point on the basis of a local map can at least avoid the waste of map resources and the high demands on the software and hardware resources of the travelable device caused by using a global map for route determination.
  • The local positioning scheme of the embodiments of the present application is simple and easy to implement, and is better suited to small travel equipment such as robots, balance vehicles, scooters, and balance wheels.

Abstract

A route determination method, a travelable device, and a computer storage medium. The method includes: obtaining first information of a target point, the first information at least characterizing the position of the target point in a traveling image, where the traveling image is obtained by capturing the traveling environment of the travelable device (S101); processing the first information of the target point to obtain second information of the target point, the second information at least characterizing the position of the target point in a local map established based on the traveling environment (S102); obtaining, based on the second information of the target point, a first actual position of the target point in the traveling environment (S103); and determining, based at least on the first actual position of the target point in the traveling environment, a route for the travelable device to reach the target point (S104).

Description

一种路线确定方法、可行进设备、和存储介质
相关申请的交叉引用
本申请基于申请号为201910755755.5、申请日为2019年08月15日的中国专利申请提出,并要求该中国专利申请的优先权,该中国专利申请的内容在此以引入方式并入本申请。
技术领域
本申请涉及行进技术,具体涉及一种路线确定方法、可行进设备、和计算机存储介质。
背景技术
相关技术中,可行进设备可按照地图的指示的路线从其所处位置行进到目的地。其中,所述地图通常为全局地图,以世界坐标系为坐标系而建立,出现在世界坐标系中的各个物体的位置是相应物体在地球上的绝对坐标。通常,全局地图的建立至少事先需要人员到全球各个地方去采景、并将采集到的信息进行系统的录入、编辑、合成等一系列的处理,耗费一定的人力和物力。这种全局地图对于诸如公交车、出租车、私家车等较为大型的出行设备而言其存在的意义较大。对于机器人、滑板车、平衡车等这种相对小型的出行设备而言,使用全局地图进行导航,一方面存在地图资源的浪费,另一方面无疑向这种小型的出行设备的软硬件资源提出挑战。此外,相关技术中也存在有基于非全局定位的导航方案,这种方案也需要提前建立地图,并将实际环境中的特定位置与地图中的特定位置的特征信息做关联,作为导航的目标点发送给机器人,效率也比较低,在实现上也具有一定的难度。
发明内容
为解决现有存在的技术问题,本申请实施例提供一种路线确定方法、可行进设备、和计算机存储介质,至少可避免由于采用全局地图进行路线确定、提前建图而带来的地图资源浪费、对可行进设备的软硬件资源要求高、导航效率低的问题。
本申请实施例的技术方案是这样实现的:
本申请实施例提供一种路线确定方法,所述方法包括:
获得目标点的第一信息,所述第一信息至少表征为在行进图像中所述目标点的位置,其中所述行进图像通过采集可行进设备的行进环境而得;
将所述目标点的所述第一信息进行处理,得到所述目标点的第二信息,所述第二信息至少表征为在基于所述行进环境而建立的局部地图中所述目标点的位置;
基于所述目标点的第二信息,得到所述目标点在所述行进环境中的第一实际位置;
至少基于所述目标点在所述行进环境中的第一实际位置,确定所述可行进设备到达所述目标点的路线。
上述方案中,所述方法还包括:
获得所述可行进设备在所述行进环境中的第二实际位置;
相应的,所述至少基于所述目标点在所述行进环境中的第一实际位置,确定所述可行进设备到达所述目标点的路线,包括:
基于所述目标点在所述行进环境中的第一实际位置和所述可行进设备在所述行进环境中的第二实际位置,确定所述路线。
上述方案中,所述方法还包括:
将采集所述可行进设备的行进环境而得的所述行进图像发送至远程服务器;
相应的,所述获得目标点的第一信息,包括:
接收来自所述远程服务器的目标点的第一信息。
上述方案中,所述行进图像至少显示有第一区域,所述第一区域表征为在所述行进环境中所述可行进设备可采集到所述行进图像的区域;所述目标点位于所述第一区域内。
上述方案中,所述行进图像至少显示有第二区域,所述第二区域表征为第一区域内路况条件符合预定条件的区域;所述目标点位于所述第二区域内。
本申请实施例提供一种可行进设备,包括:
第一获得单元,配置为获得目标点的第一信息,所述第一信息至少表征为在可行进设备的行进图像中所述目标点的位置;其中,所述行进图像通过采集所述可行进设备的行进环境而得;
处理单元,配置为将所述目标点的所述第一信息进行处理,得到所述目标点的第二信息,所述第二信息至少表征为在基于所述行进环境而建立的局部地图中所述目标点的位置;
第二获得单元,配置为基于所述目标点的第二信息,得到所述目标点在所述行进环境中的第一实际位置;
确定单元,配置为至少基于所述目标点在所述行进环境中的第一实际位置,确定所述可行进设备到达所述目标点的路线。
上述方案中,所述第二获得单元,还配置为:
获得所述可行进设备在所述行进环境中的第二实际位置;
相应的,所述确定单元,配置为基于所述目标点在所述行进环境中的第一实际位置和所述可行进设备在所述行进环境中的第二实际位置,确定所述路线。
上述方案中,所述设备还包括发送单元,配置为将采集所述可行进设 备的行进环境而得的所述行进图像发送至远程服务器;
相应的,所述第一获得单元,配置为接收来自所述远程服务器的所述目标点的第一信息。
上述方案中,所述行进图像至少显示有第一区域,所述第一区域表征为在所述行进环境中所述可行进设备可采集到所述行进图像的区域;所述目标点位于所述第一区域内。
上述方案中,所述行进图像至少显示有第二区域,所述第二区域表征为第一区域内路况条件符合预定条件的区域;所述目标点位于所述第二区域内。
本申请实施例提供一种计算机可读存储介质,其上存储有计算机程序,该程序被处理器执行时实现前述方法的步骤。
本申请实施例提供一种可行进设备,包括存储器、处理器及存储在存储器上并可在处理器上运行的计算机程序,所述处理器执行所述程序时实现前述方法的步骤。
本申请实施例路线确定方法、可行进设备和计算机存储介质,所述方法包括:获得目标点的第一信息,所述第一信息至少表征为在行进图像中所述目标点的位置,其中,所述行进图像通过采集所述可行进设备的行进环境而得;将所述目标点的所述第一信息进行处理,得到所述目标点的第二信息,所述第二信息至少表征为在基于所述行进环境而建立的局部地图中所述目标点的位置;基于所述目标点的第二信息,得到所述目标点在所述行进环境中的第一实际位置;至少基于所述目标点在所述行进环境中的第一实际位置,确定所述可行进设备到达所述目标点的路线。
本申请实施例至少可避免采用全局地图进行路线确定而带来的地图资源浪费、对可行进设备的软硬件资源要求高的问题。也可避免相关技术中的非全局定位的导航效率低的问题。
附图说明
为了更清楚地说明本申请实施例或现有技术中的技术方案,下面将对实施例或现有技术描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本申请的实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据提供的附图获得其他的附图。
图1为本申请提供路线确定方法的第一实施例的实现流程示意图;
图2为本申请提供路线确定方法的第二实施例的实现流程示意图;
图3为本申请实施例提供的行进图像中目标点与机器人的示意图;
图4为本申请实施例的可视区域的示意图一;
图5为本申请实施例的可视区域的示意图二;
图6为本申请实施例的可视区域的示意图三;
图7为本申请实施例的目标点在机器人坐标系中的位置示意图;
图8为本申请实施例的目标点在图像坐标系中的位置示意图;
图9为本申请的可行进设备实施例的组成结构示意图;
图10为本申请的可行进设备实施例的硬件构成示意图。
具体实施方式
为使本申请的目的、技术方案和优点更加清楚明白,下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例仅仅是本申请一部分实施例,而不是全部的实施例。基于本申请中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都属于本申请保护的范围。在不冲突的情况下,本申请中的实施例及实施例中的特征可以相互任意组合。在附图的流程图示出的步骤可以在诸如一组计算机可执行指令的计算机系统中执行。并且,虽然在流程图中示出了逻辑顺序,但是在某些情况下,可以以不同于此处的顺序执行所示出或描述的步骤。
本申请实施例至少能够解决相关技术中,对于机器人、滑板车、平衡车等出行设备,使用全局地图进行导航,导致地图资源浪费和对出行设备的软硬件资源要求高的问题。也能够解决相关技术中的非全局定位的导航效率低的问题。
可以理解,本申请实施例中涉及到的可行进设备为任何合理的能够进行行进的设备,如机器人、平衡车、滑板车、平衡轮等设备。优选为机器人。
本申请提供路线确定方法的第一实施例,应用于可行进设备中,如图1所示,所述方法包括:
步骤101:获得目标点的第一信息,所述第一信息至少表征为在行进图像中所述目标点的位置;其中,所述行进图像通过采集可行进设备的行进环境而得;
步骤102:将所述目标点的所述第一信息进行处理,得到所述目标点的第二信息,所述第二信息至少表征为在基于所述行进环境而建立的局部地图中所述目标点的位置;
步骤103:基于所述目标点的第二信息,得到所述目标点在所述行进环境中的第一实际位置;
步骤104:至少基于所述目标点在所述行进环境中的第一实际位置,确定所述可行进设备到达所述目标点的路线。
执行步骤101~104的实体为可行进设备。作为一种可实现方式,所述方法还包括步骤105:控制所述可行进设备按照所述路线向所述目标点行进。
由前述方案可知,本申请实施例中,基于目标点在行进图像中的位置,确定目标点在基于行进环境而建立的局部地图中的位置,并基于目标点在 局部地图中的位置,得到目标点在真实行进环境中的实际位置,进而确定可行进设备与目标点之间的路线。这种确定路线的方式利用的是目标点在行进图像中的位置和基于行进环境而建立的局部地图中的位置,确定目标点在真实行进环境中的实际位置,并由此确定在真实环境中能够从可行进设备到目标点的路线,为一种通过目标点在行进图像、局部地图中的位置而确定从可行进设备到目标点的路线的方案。可以理解,与全局地图所表示的环境相比,可行进设备所处的行进环境为局部环境,基于局部环境而建立的地图为局部地图,且该局部地图基于局部特征、具体是行进环境的视觉如图像特征而实现的基于局部地图对目标点位置的定位的方案。这种基于局部地图对目标点位置的定位的方案至少可避免采用全局地图进行路线确定而带来的地图资源浪费、对可行进设备的软硬件资源要求高的问题。且基于局部特征进行目标点位置在真实环境下的定位,在工程上易于实现,不需要提前建图、建点,导航效率高。本申请实施例的局部定位方案、简单、易行,更适用于诸如机器人、平衡车、滑板车、平衡轮等这种小型出行设备中。
本申请提供路线确定方法的第二实施例,应用于可行进设备中,如图2所示,所述方法包括:
步骤201:获得目标点的第一信息,所述第一信息至少表征为在行进图像中所述目标点的位置;其中,所述行进图像通过采集可行进设备的行进环境而得;
步骤202:将所述目标点的所述第一信息进行处理,得到所述目标点的第二信息,所述第二信息至少表征为在基于所述行进环境而建立的局部地图中所述目标点的位置;
步骤203:基于所述目标点的第二信息,得到所述目标点在所述行进环境中的第一实际位置;
步骤204:获得所述可行进设备在所述行进环境中的第二实际位置;
步骤205:基于所述目标点在所述行进环境中的第一实际位置和所述可行进设备在所述行进环境中的第二实际位置,确定所述可行进设备到达所述目标点的路线。
执行步骤201~205的实体为可行进设备。作为一种实现方式,所述方法还包括步骤206:控制所述可行进设备按照所述路线向所述目标点行进。
其中,步骤202/203和步骤204无严格的先后顺序,还可以同时进行。
由步骤201~205可知,基于目标点在行进图像中的位置,得到目标点在基于行进环境而建立的局部地图中的位置,得到目标点在真实行进环境中的(第一)实际位置,依据目标点在真实行进环境中的实际位置和可行进设备在真实行进环境中的(第二)实际位置,由此确定从可行进设备至目标点的路线。这种基于目标点在行进图像、局部地图的位置得到目标点在真实行进环境中的位置的方案,基于行进环境(局部环境)而建立的局部地图而实现,为一种基于局部地图对目标点位置的定位的方案,与使用全局地图进行路线确定的方案相比,本申请实施例简单、易行,工程上更易于实现,更适用于诸如机器人、平衡车、滑板车、平衡轮等这种小型出行设备中。可避免采用全局地图进行路线确定而带来的地图资源浪费、对可行进设备的软硬件资源要求高的问题。不需要提前建图、建点,导航效率高。
本领域技术人员可以理解,前述的获得目标点的第一信息的方案,可以可行进设备在采集的行进图像中进行目标点的选取,也可以可行进设备将采集的行进图像发送、如发送至对端设备如远程服务器(简称为服务器);服务器在行进图像中进行目标点的选取,并向可行进设备进行选取结果的反馈,也即可行进设备通过接收目标点的第一信息而获得在行进图像目标点所处的位置。通俗地讲,本申请实施例中可行进设备依靠自身可完成可 行进设备与目标点之间的路线确定,还可以通过与其它设备如服务器的交互完成可行进设备与目标点之间的路线确定。其中,在通过与服务器的交互完成可行进设备与目标点之间的路线确定的方案中,至少需要服务器执行通过行进图像进行目标点的选取。这种目标点由服务器进行选取,目标点与可行进设备之间的路线由可行进设备来确定的方案,为一种通过交互进行路线确定的新型方案,使得可行进设备和服务器之间的路线确定方案更为新颖。在实际应用中,服务器侧通常存在有维护人员(操作人员),操作人员可基于对服务器显示的行进图像进行目标点的选取,可进一步保障目标点选取的准确性,进而可为路线确定的准确性提供一定的保障。
在前述实施例中,所述行进图像至少显示有第一区域,所述第一区域表征为在所述行进环境中所述可行进设备可采集到所述行进图像的区域;所述目标点位于所述第一区域内。进一步的,所述行进图像至少显示有第二区域,所述第二区域表征为第一区域内路况条件符合预定条件的区域;所述目标点位于所述第二区域内。可以理解,本申请实施例中,将行进环境中机器人能够采集到行进图像的区域称之为可视区,为方便对目标点的选取,在行进图像中对可视区进行显示以从可视区中进行目标点的选取,保证目标点选取的准确性,也即所选取的目标点可以为处于可视区内的任意一个点。进一步的,在实际应用中,可视区包括路况条件好的区域和路况条件差的区域,本申请实施例中,可基于行进图像对可视区内的路况条件进行分析,确认哪些为路况条件好的区域、哪些为路况条件差的区域。并在行进图像中将这两种路况条件的区域进行各自表示,所选取的目标点可以位于路况条件好的区域,也可以为路况条件差的区域,视具体的情况而灵活设定。其中,所述路况条件符合预定条件可以为可视区内路况条件好的区域,也可为可视区内路况条件差的区域。这种情况下,所述步骤105或206即可为:依据目标点所处的可视区的路况条件,控制可行进设备按 照所述路线向目标点行进。例如,如果目标点处于可视区内路况条件好的区域,则可以适当对可行进设备进行加速行驶,以尽快到达目的地(目标点)。如果目标点处于可视区内路况条件差的区域,则可以适当的减速行驶,以保证行驶安全。
下面结合附图3-附图8及具体实施例对本申请实施例作进一步详细的说明。
在图3-图8所示中,以可行进设备为机器人为例,通过机器人与服务器之间的交互来实现机器人与目标点之间的路线的确定。
本领域技术人员可以理解,相关技术中如果机器人采用全局地图进行路线确定,全局地图通常需要提前建立,建图工作量较大,占用的运算资源多,不适合诸如机器人、平衡车、平衡轮等这种相对小型的出行设备。此外,相关技术中的基于非全局定位的导航方案也需要提前建图,导航效率低,在实现上具有一定难度。可以理解,相关技术中,还存在以下几个问题:第一,服务器可通过遥控的方式对机器人进行操作,但这种控制方式效率不高。第二,机器人也可以不通过服务器进行控制,可进行自主导航,对于这种导航往往会可能会存在遇到无法处理的情况。本申请实施例的以下方案至少能够解决相关技术中存在的以上问题。
本申请实施例中,预先在机器人的一固定位置处如头部的上方设置用于采集机器人所处行进环境的采集装置,该采集装置可以是任何类型的相机,如鱼眼相机、深度相机、视觉相机等。在机器人行进过程中,机器人通过采集装置如鱼眼相机可实时对行进环境进行采集。可以理解,通过采集装置采集到的行进环境是位于采集装置采集角度内的环境,相当于机器人可视区域内的行进环境。根据采集到的机器人可视区域内的行进环境,机器人自身可得到可视区域内各个物体相对于机器人的方向和距离等信 息,从而可建立如图7所示的地图。该地图由于是机器人根据行进环境建立,机器人的行进环境为局部环境,基于局部环境建立的地图称之为局部地图,该局部地图所在的坐标系为机器人坐标系。可以理解,该局部地图可以是将当前机器人所处的真实行进环境中各个物体与机器人之间的实际位置关系经过一定的缩放如缩小进行表示、且将真实行进环境中位于可视区域内的各个物体与机器人之间的距离和方向关系通过二维坐标进行表示。此外,机器人还需要建立针对机器人当前行进环境的物理坐标系,将真实的当前行进环境中可视区域内的各个物体和可行进设备之间的距离和方向在物理坐标系中表示出来,可以理解,本申请实施例的物理坐标系表示的在机器人的当前行进环境下可视区域内的每个物体的绝对坐标,与相关技术中的全局地图相比,其基于当前行进环境而建立,考虑到当前行进环境为一种局部环境,所以基于局部环境而建立的物理坐标系的表示内容也可视为一种局部地图。本申请实施例的物理坐标系中当前行进环境下的各个物体与机器人之间的距离和方向关系与如图7所示的地图中的对应物体与机器人之间的距离和方向具有一定的映射关系,如物体A在本申请实施例的物理坐标系中与机器人之间相距10m、位于北向,缩放比例为1000:1(真实环境中的10m在地图中用1cm来表示),则在机器人建立的局部地图中表示物体A的坐标点位于表示机器人的坐标点的前方、与表示机器人的坐标点之间的距离为1cm。可以理解,在机器人坐标系中表示的内容是在当前机器人所处的行进环境下可视区域内各个物体相对于机器人的位置关系。可以理解,本申请实施例中在机器人侧,需要建立二个局部地图,其中一个是在机器人坐标系下的地图,其表示的是当前行进环境下可视区内的各个物体与机器人之间的相对位置关系。另一个是在物理坐标系下的地图,其表示的是当前行进环境下可视区内的各个物体(包括机器人)在当前真实环境下的位置。
机器人通过采集装置如鱼眼相机对可视区域内的行进环境进行采集,得到行进图像,发送行进图像至服务器。服务器接收并显示行进图像(如图3所示)。可以理解,为方便操作人员从服务器显示的行进图像中进行目标点的选取,基于行进图像服务器建立如图8所示的坐标系,可使得服务器获知操作人员选取的目标点在行进图像中的位置。为区别于前述的机器人坐标系,称服务器建立的坐标系为图像坐标系。可以理解,图像坐标系是通过行进图像将当前行进环境中可视区域内的各个物体与机器人的位置关系(该位置关系也是对真实的位置关系的缩小)进行表示。
操作人员可通过如图3所示的行进图像对机器人所处的行进环境进行观看,并在行进图像中选取一目标点作为需要机器人行进的目的地。可以理解,服务器接收到的行进图像是位于机器人可视区域内的图像,该可视区域的范围通常受采集装置在机器人上的设置位置、高度、广角等多个元素而影响,具体请参见相关说明。此外,可视区域还受机器人本身的运动特性而影响。具体的,在行进环境内,由于机器人本身的运动特性,对于机器人来说其部分动作、至少是部分动作的角度或方向会受到一定的限制。例如,对于机器人来说,其进行左转、右转的角度将会受到限制,至少无法做到左转90°、85°或右转90°、85°,而对于左转30°或45°其不会受到限制。由于其以上运动特性的存在导致在其在行进环境内存在有其无法到达的位置如无法到达左转90°的位置,也存在有可达到的位置如左转30°或45°到达的位置。基于此,为向操作人员清晰的表达在机器人当前所采集的行进图像内哪些位置机器人无法到达、哪些位置机器人可以到达,则机器人在发送行进图像至服务器的同时,将其能够到达的位置在行进图像中表示出来,以供操作人员通过该表示能够准确的选择机器人可到达的点作为目标点。经过发明人的一段研究发现,由于机器人本身的运动特性的存在使得其能够到达的位置通常表现为如图4和图5所示的扇形区 域或扇形变形的区域,该区域即为本申请实施例中的机器人的可视区域-机器人能够采集到行进图像的区域。服务器在显示行进图像的同时,在行进图像中将机器人当前的可视区表示出来,用于提示操作人员在可视区内进行目标点的选取。在图3所示的行进图像中通过鼠标或手指在物体B所处的位置产生鼠标操作或手势操作,选取物体B所在的位置为目标点,以完成操作人员想要机器人到达物体B所在的位置的期望。
可以理解,操作人员通过行进图像选取的物体B所在的位置为在图像坐标系中物体B所处的位置,要想获知在真实的行进环境中物体B所处的位置,至少还需要将物体B进行坐标变化。本申请实施例中,行进图像和机器人建立的地图均是表示行进环境中的各物体与机器人之间的位置关系,但是二者采用的坐标系不同,所以至少需要将物体B在图像坐标系中的位置坐标,转换到物体B在机器人坐标系下的坐标,再将机器人坐标系下的坐标映射到本申请实施例基于机器人当前所处的真实行进环境而建立的物理坐标系,得到物体B在真实行进环境中的实际位置。进一步的,以服务器的显示屏的像素为720p(像素单位)、长宽比为16:9为例,720p=720×1280,如图8所示,图像坐标系下x轴最大值为720,y轴最大值为1280,服务器检测操作人员在显示屏上所选取的目标点的坐标,如检测到物体B在行进图像中的所处位置在显示屏上的坐标为(700,900),坐标(700,900)为图像坐标系下的坐标表示。服务器将选取的目标点的在图像坐标系下的坐标进行转换,将该坐标转换到机器人坐标系下,机器人坐标系下该目标点的坐标为(700/720,900/1280),并将计算结果发送至机器人,机器人将该计算结果显示在机器人坐标系下的效果如图7所示。或者,服务器将选取的目标点的在图像坐标系下的坐标发送至机器人,机器人在获知服务器侧显示屏的显示像素(720×1280)的基础上进行如上换算,得到目标点在机器人坐标系下的坐标(700/720,900/1280)。机器人将得到的目标点 在机器人坐标系下的坐标映射到本申请实施例的物理坐标系中去,得到物体B在真实行进环境中所处的位置。机器人获知自身在真实的行进环境中所处的位置,规划从自身当前所处的位置到达物体B所处的位置的路线,通过自主导航的方式进行目的地的行进。假定依据物理坐标系表示的真实行进环境中物体B位于机器人东北方向距离500m的位置,则规划从目标点到机器人的路线,按照规划出的路线进行行进,如按照从目标点到机器人的直线距离最短的路线进行行进。其中,关于目标点从机器人坐标系下映射到物理坐标系下的过程可参见前述的映射关系而实现,不做具体赘述。
前述方案中,机器人还可以基于采集到的行进图像,对当可视区内的路况情况进行分析,将可视区内路况条件好的区域和差的区域进行区分,并标识在行进图像中,如图6所示,路况条件好的区域用灰色标识出,可视区内中除去灰色区域其它为路况条件差的区域。以提示操作人员其可以在路况条件好的区域进行目标点的选择,也可以在路况条件差的区域进行目标点的选择,优选为在路况条件好的区域进行目标点的选择。这样做的好处在于,一方面保证机器人行驶至目的地的安全性,不至于存在颠簸或震荡。另一方面,在后续机器人行驶至目的地的过程中,如果目标点位于可通过区域内路况条件好的区域,则可控制机器人加速行驶,以尽快到达目的地。如果目标点处于可通过区域内路况条件差的区域,则可以适当的减速行驶,以保证安全行驶。
Those skilled in the art will understand that the above solution can be regarded as one that obtains the actual position of the target point in the real traveling environment through coordinate transformation. In practical applications, factors such as the distortion, mounting height, mounting position and capture angle of the capture device, e.g. the fisheye camera, also need to be taken into account to achieve a more accurate coordinate transformation. The specific process of eliminating the influence of these factors is given in the relevant description and is not repeated here.
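Purely as an example of how mounting height and tilt enter such a transformation (lens undistortion itself is omitted, and all intrinsic and mounting values below are hypothetical calibration results rather than values from the patent), an already-undistorted pixel could be projected onto a flat ground plane as follows:

```python
import math
import numpy as np

def pixel_to_ground(u, v, fx, fy, cx, cy, cam_height_m, pitch_deg):
    """Project an undistorted pixel onto the ground plane in front of the robot,
    accounting for the camera's mounting height and downward pitch.
    Camera frame: x right, y down, z forward; the ground is assumed flat."""
    d_cam = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    p = math.radians(pitch_deg)
    # Express the camera-frame ray in a level, robot-aligned frame.
    rot = np.array([[1.0, 0.0, 0.0],
                    [0.0, math.cos(p), math.sin(p)],
                    [0.0, -math.sin(p), math.cos(p)]])
    d = rot @ d_cam
    if d[1] <= 0:
        return None                       # the pixel looks at or above the horizon
    t = cam_height_m / d[1]               # stretch the ray until it reaches the ground
    return t * d[0], t * d[2]             # (lateral offset, forward distance) in metres

# Hypothetical calibration: 1.2 m mounting height, 25 deg downward tilt.
print(pixel_to_ground(360, 900, fx=500, fy=500, cx=360, cy=640,
                      cam_height_m=1.2, pitch_deg=25.0))
```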
In the foregoing solution, the map built by the robot is built from the traveling environment in which it is currently located. Compared with the global environment of a global map, the map built by the robot is built from local features, more specifically from image features of the local environment, and is a local map. Unlike building a global map, the embodiments of the present application allow the map to be built for the traveling environment while the robot is traveling: no map needs to be built in advance, it is built in real time during travel, and the resulting map, built for a local environment, is a local map. This is simple, easier to implement in engineering, and better suited to traveling devices such as robots and balance vehicles. The operator selects the navigation destination through the real-time traveling image captured by the robot and received by the server, which is a solution in which the server selects the destination based on the image. From the position in the image coordinate system of the target point selected by the operator in the traveling image, the position of the target point in the real traveling environment is finally obtained. This avoids the problems brought about by using a global map for route determination, namely wasted map resources, high requirements on the hardware and software resources of the travelable device, and the large engineering workload of building maps and sites in advance, and it also enables efficient navigation.
In the foregoing solution, the destination the robot needs to reach, i.e. the target point, does not have to be selected by the robot; it is designated by the server, specifically selected by the server from the traveling image captured by the robot. Compared with the related-art solution in which the server must remotely control the robot all the way for it to reach the destination, this solution in which the server selects the destination from the image can, to a certain extent, relieve the operator: the operator only needs to select the target point through the server and no longer needs to remotely control the robot continuously, which greatly improves the user experience. A server-designated destination also allows the same operator to select destinations for different robots at different moments, making it convenient for one operator to control multiple robots. Since the operator selects the target point from within the passable area of the travelable device, the accuracy of the target point selection is further ensured, which in turn provides a certain guarantee for the accuracy of the route determination.
In the solution in which the server designates the destination, the travelable device can navigate autonomously and travel to the destination. During travel, if a situation occurs that it cannot handle, for example a traffic light ahead cannot be detected, the robot of the embodiments of the present application can automatically send a notification message to the server to prompt the operator to resolve it. In addition, the robot can detect obstacles on the planned route and avoid them automatically, which reflects the functional diversity of the robot. It can be understood that the solution of the embodiments of the present application does not navigate based on a global map but on a local map, and the navigation destination can be selected by clicking on the traveling image transmitted by the robot in real time. No map needs to be built in advance; it suffices to build the local map and capture traveling images according to the real-time traveling environment.
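A deliberately simplified autonomous-travel loop reflecting this behaviour might look like the sketch below; all callbacks are hypothetical stand-ins for the robot's actual perception, planning and communication modules:

```python
def follow_route(route, detect_obstacle, replan_around, step, notify_server):
    """Minimal autonomous-travel loop: follow the planned waypoints, detour
    around detected obstacles, and fall back to notifying the server/operator
    when an unhandled situation (e.g. an undetectable traffic light) occurs."""
    for waypoint in route:
        try:
            if detect_obstacle(waypoint):
                waypoint = replan_around(waypoint)    # automatic obstacle avoidance
            step(waypoint)                            # drive toward the (possibly adjusted) waypoint
        except RuntimeError as unhandled:
            notify_server(f"operator assistance needed: {unhandled}")
            return False
    return True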
The present application further provides an embodiment of a travelable device. As shown in Fig. 9, the device includes a first obtaining unit 601, a processing unit 602, a second obtaining unit 603 and a determining unit 604, wherein:
the first obtaining unit 601 is configured to obtain first information of a target point, the first information at least representing the position of the target point in a traveling image, wherein the traveling image is obtained by capturing the traveling environment of the travelable device;
the processing unit 602 is configured to process the first information of the target point to obtain second information of the target point, the second information at least representing the position of the target point in a local map established based on the traveling environment;
the second obtaining unit 603 is configured to obtain, based on the second information of the target point, a first actual position of the target point in the traveling environment;
the determining unit 604 is configured to determine, at least based on the first actual position of the target point in the traveling environment, a route for the travelable device to reach the target point.
As an implementation, the second obtaining unit 603 is further configured to:
obtain a second actual position of the travelable device in the traveling environment;
correspondingly, the determining unit 604 is configured to determine the route based on the first actual position of the target point in the traveling environment and the second actual position of the travelable device in the traveling environment.
As an implementation, the device further includes a sending unit configured to send the traveling image, obtained by capturing the traveling environment of the travelable device, to a remote server;
correspondingly, the first obtaining unit 601 is configured to receive the first information of the target point from the remote server.
As an implementation, the traveling image displays at least a first area, the first area representing an area in the traveling environment in which the travelable device can capture the traveling image; the target point is located within the first area. Further, the traveling image displays at least a second area, the second area representing an area within the first area in which the road conditions meet a predetermined condition; the target point is located within the second area.
As an implementation, the device further includes a traveling unit configured to control the travelable device to travel toward the target point along the route.
The travelable device provided by the above embodiment and the foregoing route determination method embodiments belong to the same concept; for the specific implementation process, refer to the method embodiments, which is not repeated here. The aforementioned first obtaining unit 601, processing unit 602, second obtaining unit 603 and determining unit 604 can each be implemented by a digital signal processor (DSP), a central processing unit (CPU), a field-programmable gate array (FPGA), a microcontroller unit (MCU), or the like.
An embodiment of the present application further provides a computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, is used at least to perform the steps of the method shown in any one of Figs. 1 to 8. The computer-readable storage medium may specifically be a memory, which may be the memory 72 shown in Fig. 10.
An embodiment of the present application further provides a travelable device. Fig. 10 is a schematic diagram of the hardware structure of the travelable device according to an embodiment of the present application. As shown in Fig. 10, the travelable device includes: a communication component 73 for data transmission, at least one processor 71, and a memory 72 for storing a computer program executable on the processor 71. The components of the terminal are coupled together through a bus system 74. It can be understood that the bus system 74 is used to implement connection and communication between these components. In addition to a data bus, the bus system 74 also includes a power bus, a control bus and a status signal bus. For clarity, however, the various buses are all labeled as the bus system 74 in Fig. 10.
The processor 71, when executing the computer program, performs at least the steps of the method shown in any one of Figs. 1 to 8.
It can be understood that the memory 72 may be a volatile memory or a non-volatile memory, or may include both. The non-volatile memory may be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), a ferromagnetic random access memory (FRAM), a flash memory, a magnetic surface memory, an optical disc, or a compact disc read-only memory (CD-ROM); the magnetic surface memory may be a magnetic disk memory or a magnetic tape memory. The volatile memory may be a random access memory (RAM), which is used as an external cache. By way of example and not limitation, many forms of RAM are available, such as static random access memory (SRAM), synchronous static random access memory (SSRAM), dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), enhanced synchronous dynamic random access memory (ESDRAM), SyncLink dynamic random access memory (SLDRAM) and direct Rambus random access memory (DRRAM). The memory 72 described in the embodiments of the present application is intended to include, but is not limited to, these and any other suitable types of memory.
The methods disclosed in the above embodiments of the present application may be applied to, or implemented by, the processor 71. The processor 71 may be an integrated circuit chip with signal processing capability. In implementation, the steps of the above methods can be completed by integrated logic circuits of hardware in the processor 71 or by instructions in the form of software. The processor 71 may be a general-purpose processor, a DSP, or another programmable logic device, discrete gate or transistor logic device, discrete hardware component, or the like. The processor 71 can implement or execute the methods, steps and logic block diagrams disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor or any conventional processor. The steps of the methods disclosed in the embodiments of the present application may be directly embodied as being executed by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software module may reside in a storage medium located in the memory 72; the processor 71 reads the information in the memory 72 and completes the steps of the foregoing methods in combination with its hardware.
In an exemplary embodiment, the travelable device may be implemented by one or more application-specific integrated circuits (ASICs), DSPs, programmable logic devices (PLDs), complex programmable logic devices (CPLDs), FPGAs, general-purpose processors, controllers, MCUs, microprocessors, or other electronic elements, for executing the foregoing route determination method.
In the several embodiments provided in the present application, it should be understood that the disclosed device and method may be implemented in other ways. The device embodiments described above are merely illustrative; for example, the division of the units is merely a logical functional division, and other divisions are possible in actual implementation, e.g. multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be an indirect coupling or communication connection through interfaces, devices or units, and may be electrical, mechanical or of other forms.
The units described above as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present application may all be integrated in one processing unit, each unit may serve as a separate unit, or two or more units may be integrated in one unit; the integrated unit may be implemented in the form of hardware, or in the form of hardware plus software functional units.
A person of ordinary skill in the art will understand that all or some of the steps of the above method embodiments can be completed by hardware related to program instructions. The foregoing program can be stored in a computer-readable storage medium; when executed, it performs the steps of the above method embodiments. The foregoing storage medium includes various media that can store program code, such as a removable storage device, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
Alternatively, if the above integrated unit of the present application is implemented as a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of the embodiments of the present application, in essence or in terms of the part contributing to the prior art, can be embodied as a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the method described in each embodiment of the present application. The foregoing storage medium includes various media that can store program code, such as a removable storage device, a ROM, a RAM, a magnetic disk or an optical disc.
The methods disclosed in the several method embodiments provided in the present application can be combined arbitrarily without conflict to obtain new method embodiments.
The features disclosed in the several product embodiments provided in the present application can be combined arbitrarily without conflict to obtain new product embodiments.
The features disclosed in the several method or device embodiments provided in the present application can be combined arbitrarily without conflict to obtain new method or device embodiments.
The above are only specific implementations of the present application, but the protection scope of the present application is not limited thereto. Any changes or substitutions readily conceivable by a person skilled in the art within the technical scope disclosed in the present application shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Industrial Applicability
In the route determination solution of the embodiments of the present application, the actual position of the target point in the real traveling environment is determined from the position of the target point in the traveling image and its position in the local map established based on the traveling environment, and from this a route from the travelable device to the target point in the real environment is determined; it is thus a solution that determines the route from the travelable device to the target point through the position of the target point in the traveling image and in the local map. Locating the target point based on a local map in this way at least avoids the waste of map resources and the high hardware and software requirements on the travelable device that come with using a global map for route determination. The local positioning solution of the embodiments of the present application is simple and easy to implement, and is better suited to small traveling devices such as robots, balance vehicles, scooters and balance wheels.

Claims (12)

  1. A route determination method, the method comprising:
    obtaining first information of a target point, the first information at least representing the position of the target point in a traveling image, wherein the traveling image is obtained by capturing the traveling environment of a travelable device;
    processing the first information of the target point to obtain second information of the target point, the second information at least representing the position of the target point in a local map established based on the traveling environment;
    obtaining, based on the second information of the target point, a first actual position of the target point in the traveling environment;
    determining, at least based on the first actual position of the target point in the traveling environment, a route for the travelable device to reach the target point.
  2. The method according to claim 1, wherein the method further comprises:
    obtaining a second actual position of the travelable device in the traveling environment;
    correspondingly, the determining, at least based on the first actual position of the target point in the traveling environment, a route for the travelable device to reach the target point comprises:
    determining the route based on the first actual position of the target point in the traveling environment and the second actual position of the travelable device in the traveling environment.
  3. The method according to claim 1 or 2, wherein the method further comprises:
    sending the traveling image, obtained by capturing the traveling environment of the travelable device, to a remote server;
    correspondingly, the obtaining first information of a target point comprises:
    receiving the first information of the target point from the remote server.
  4. The method according to claim 3, wherein the traveling image displays at least a first area, the first area representing an area in the traveling environment in which the travelable device can capture the traveling image; the target point is located within the first area.
  5. The method according to claim 4, wherein the traveling image displays at least a second area, the second area representing an area within the first area in which the road conditions meet a predetermined condition; the target point is located within the second area.
  6. A travelable device, comprising:
    a first obtaining unit configured to obtain first information of a target point, the first information at least representing the position of the target point in a traveling image of the travelable device, wherein the traveling image is obtained by capturing the traveling environment of the travelable device;
    a processing unit configured to process the first information of the target point to obtain second information of the target point, the second information at least representing the position of the target point in a local map established based on the traveling environment;
    a second obtaining unit configured to obtain, based on the second information of the target point, a first actual position of the target point in the traveling environment;
    a determining unit configured to determine, at least based on the first actual position of the target point in the traveling environment, a route for the travelable device to reach the target point.
  7. The device according to claim 6, wherein the second obtaining unit is further configured to:
    obtain a second actual position of the travelable device in the traveling environment;
    correspondingly, the determining unit is configured to determine the route based on the first actual position of the target point in the traveling environment and the second actual position of the travelable device in the traveling environment.
  8. The device according to claim 6 or 7, wherein the device further comprises a sending unit configured to send the traveling image, obtained by capturing the traveling environment of the travelable device, to a remote server;
    correspondingly, the first obtaining unit is configured to receive the first information of the target point from the remote server.
  9. The device according to claim 8, wherein the traveling image displays at least a first area, the first area representing an area in the traveling environment in which the travelable device can capture the traveling image; the target point is located within the first area.
  10. The device according to claim 9, wherein the traveling image displays at least a second area, the second area representing an area within the first area in which the road conditions meet a predetermined condition; the target point is located within the second area.
  11. A computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 5.
  12. A travelable device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the steps of the method according to any one of claims 1 to 5.
PCT/CN2020/109619 2019-08-15 2020-08-17 一种路线确定方法、可行进设备、和存储介质 WO2021027967A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910755755.5 2019-08-15
CN201910755755.5A CN110909585B (zh) 2019-08-15 2019-08-15 一种路线确定方法、可行进设备、和存储介质

Publications (1)

Publication Number Publication Date
WO2021027967A1 true WO2021027967A1 (zh) 2021-02-18

Family

ID=69814519

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/109619 WO2021027967A1 (zh) 2019-08-15 2020-08-17 一种路线确定方法、可行进设备、和存储介质

Country Status (2)

Country Link
CN (1) CN110909585B (zh)
WO (1) WO2021027967A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113110481A (zh) * 2021-04-26 2021-07-13 上海智蕙林医疗科技有限公司 一种应急避让实现方法、系统、机器人和存储介质

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110909585B (zh) * 2019-08-15 2022-09-06 纳恩博(常州)科技有限公司 一种路线确定方法、可行进设备、和存储介质
CN113867329A (zh) * 2020-06-12 2021-12-31 纳恩博(北京)科技有限公司 一种确定行进路线的方法、设备及存储介质
CN112340050A (zh) * 2020-10-30 2021-02-09 深圳中集天达空港设备有限公司 登机桥的远程控制方法、装置、介质及电子设备
CN114459494B (zh) * 2021-12-31 2024-03-26 北京百度网讯科技有限公司 可达区域的获取方法、装置、电子设备以及存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107144285A (zh) * 2017-05-08 2017-09-08 深圳地平线机器人科技有限公司 位姿信息确定方法、装置和可移动设备
CN108594825A (zh) * 2018-05-31 2018-09-28 四川斐讯信息技术有限公司 基于深度相机的扫地机器人控制方法及系统
US20190061157A1 (en) * 2017-08-31 2019-02-28 Neato Robotics, Inc. Robotic virtual boundaries
CN109901590A (zh) * 2019-03-30 2019-06-18 珠海市一微半导体有限公司 桌面机器人的回充控制方法
CN110909585A (zh) * 2019-08-15 2020-03-24 北京致行慕远科技有限公司 一种路线确定方法、可行进设备、和存储介质

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008084135A (ja) * 2006-09-28 2008-04-10 Toshiba Corp 移動制御方法、移動ロボットおよび移動制御プログラム
KR100955483B1 (ko) * 2008-08-12 2010-04-30 삼성전자주식회사 3차원 격자 지도 작성 방법 및 이를 이용한 자동 주행 장치의 제어 방법
EP3144765B1 (en) * 2015-09-18 2020-01-08 Samsung Electronics Co., Ltd. Apparatus for localizing cleaning robot, cleaning robot, and controlling method of cleaning robot
CN106647742B (zh) * 2016-10-31 2019-09-20 纳恩博(北京)科技有限公司 移动路径规划方法及装置
CN106950985B (zh) * 2017-03-20 2020-07-03 成都通甲优博科技有限责任公司 一种自动送货方法及装置
CN106931961B (zh) * 2017-03-20 2020-06-23 成都通甲优博科技有限责任公司 一种自动导航方法及装置
CN107515606A (zh) * 2017-07-20 2017-12-26 北京格灵深瞳信息技术有限公司 机器人实现方法、控制方法及机器人、电子设备
CN208255717U (zh) * 2017-12-08 2018-12-18 灵动科技(北京)有限公司 物流机器人
CN108171796A (zh) * 2017-12-25 2018-06-15 燕山大学 一种基于三维点云的巡检机器人视觉系统及控制方法
WO2019126950A1 (zh) * 2017-12-25 2019-07-04 深圳前海达闼云端智能科技有限公司 一种定位方法、云端服务器、终端、系统、电子设备及计算机程序产品
CN108733081A (zh) * 2017-12-28 2018-11-02 北京猎户星空科技有限公司 一种储物设备
CN108469822B (zh) * 2018-04-04 2020-12-15 天津理工大学 一种室内导盲机器人在动态环境下的路径规划方法
CN109767452A (zh) * 2018-12-24 2019-05-17 深圳市道通智能航空技术有限公司 一种目标定位方法和装置、无人机
CN109571499A (zh) * 2018-12-25 2019-04-05 广州天高软件科技有限公司 一种智能导航引领机器人及其实现方法
CN109682381B (zh) * 2019-02-22 2020-09-25 山东大学 基于全向视觉的大视场场景感知方法、系统、介质及设备

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107144285A (zh) * 2017-05-08 2017-09-08 深圳地平线机器人科技有限公司 位姿信息确定方法、装置和可移动设备
US20190061157A1 (en) * 2017-08-31 2019-02-28 Neato Robotics, Inc. Robotic virtual boundaries
CN108594825A (zh) * 2018-05-31 2018-09-28 四川斐讯信息技术有限公司 基于深度相机的扫地机器人控制方法及系统
CN109901590A (zh) * 2019-03-30 2019-06-18 珠海市一微半导体有限公司 桌面机器人的回充控制方法
CN110909585A (zh) * 2019-08-15 2020-03-24 北京致行慕远科技有限公司 一种路线确定方法、可行进设备、和存储介质

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113110481A (zh) * 2021-04-26 2021-07-13 上海智蕙林医疗科技有限公司 一种应急避让实现方法、系统、机器人和存储介质
CN113110481B (zh) * 2021-04-26 2024-02-06 上海智蕙林医疗科技有限公司 一种应急避让实现方法、系统、机器人和存储介质

Also Published As

Publication number Publication date
CN110909585B (zh) 2022-09-06
CN110909585A (zh) 2020-03-24

Similar Documents

Publication Publication Date Title
WO2021027967A1 (zh) 一种路线确定方法、可行进设备、和存储介质
CN111231950B (zh) 规划车辆变道路径的方法、装置、设备及可读存储介质
CN111311925B (zh) 车位的检测方法和装置、电子设备、车辆、存储介质
Ricks et al. Ecological displays for robot interaction: A new perspective
US11636764B2 (en) Vehicle-to-infrastructure cooperation information processing method, apparatus, device and autonomous vehicle
WO2019169348A1 (en) Visualization of high definition map data
EP3950235A1 (en) Self-propelled robot path planning method, self-propelled robot and storage medium
CN109459029B (zh) 一种用于确定目标对象的导航路线信息的方法与设备
CN111578839B (zh) 障碍物坐标处理方法、装置、电子设备及可读存储介质
CN107515002A (zh) 一种基于LiDAR和云计算实现机器人实时室内地图构建和定位导航的系统方法和装置
JP2023027233A (ja) 道路データ融合の地図生成方法、装置及び電子機器
Li et al. Depth camera based remote three-dimensional reconstruction using incremental point cloud compression
WO2021027966A1 (zh) 行进方法、可行进设备和存储介质
CN114608600A (zh) 一种自动驾驶系统的搭建方法及终端
CN111510857B (zh) 一种用于实现用户间协同移动的方法与设备
Zhang et al. Ntu4dradlm: 4d radar-centric multi-modal dataset for localization and mapping
CN113126120A (zh) 数据标注方法、装置、设备、存储介质以及计算机程序产品
JP7375149B2 (ja) 測位方法、測位装置、ビジュアルマップの生成方法およびその装置
CN110631586A (zh) 基于视觉slam的地图构建的方法、导航系统及装置
CN113378605A (zh) 多源信息融合方法及装置、电子设备和存储介质
CN113483771B (zh) 实景地图的生成方法、装置及系统
CN111897348A (zh) 云端机器人的控制方法及系统、云端机器人、云端服务器
CN111427331A (zh) 无人驾驶车辆的感知信息展示方法、装置和电子设备
US20240053746A1 (en) Display system, communications system, display control method, and program
CN111443700A (zh) 一种机器人及其导航控制方法和装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20851428

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20851428

Country of ref document: EP

Kind code of ref document: A1