WO2021026965A1 - Autonomous mobile device - Google Patents

Autonomous mobile device

Info

Publication number
WO2021026965A1
Authority
WO
WIPO (PCT)
Prior art keywords
area array laser sensor, view, autonomous mobile
Application number
PCT/CN2019/102789
Other languages: English (en), French (fr)
Inventor
Tian Meiqin (田美芹)
Xie Kaixuan (谢凯旋)
Original Assignee
Ecovacs Robotics Co., Ltd. (科沃斯机器人股份有限公司)
Application filed by Ecovacs Robotics Co., Ltd. (科沃斯机器人股份有限公司)
Priority to EP19941507.6A priority Critical patent/EP4011566A4/en
Publication of WO2021026965A1 publication Critical patent/WO2021026965A1/zh

Classifications

    • G01S15/86 Combinations of sonar systems with lidar systems; combinations of sonar systems with systems not using wave reflection
    • B25J9/161 Programme controls characterised by the control system: hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • G05D1/0231 Control of position or course in two dimensions, specially adapted to land vehicles, using optical position detecting means
    • A47L1/02 Cleaning windows: power-driven machines or devices
    • A47L11/24 Floor-sweeping machines, motor-driven
    • A47L11/40 Parts or details of floor-cleaning machines not provided for in groups A47L11/02 - A47L11/38, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011 Regulation of the cleaning machine by electric means; control systems and remote control systems therefor
    • A47L11/4061 Steering means; means for avoiding obstacles; details related to the place where the driver is accommodated
    • B25J11/00 Manipulators not otherwise provided for
    • B25J19/022 Sensing devices: optical sensing devices using lasers
    • B25J9/1666 Programme controls: motion, path, trajectory planning; avoiding collision or forbidden zones
    • G01S17/42 Lidar systems: simultaneous measurement of distance and other co-ordinates
    • G01S17/87 Combinations of systems using electromagnetic waves other than radio waves
    • G01S17/931 Lidar systems specially adapted for anti-collision purposes of land vehicles
    • G05D1/0088 Control of position, course or altitude characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
    • A47L2201/00 Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04 Automatic control of the travelling movement; automatic obstacle detection
    • G01S17/89 Lidar systems specially adapted for mapping or imaging

Definitions

  • This application relates to the field of artificial intelligence technology, in particular to an autonomous mobile device.
  • Autonomous mobile devices, such as robots, can perceive and interact with the external environment; this is the basis for autonomous mobile devices to move autonomously and perform tasks.
  • Existing autonomous mobile devices collect external environment information through sensors and perceive the external environment based on that information.
  • However, existing autonomous mobile devices perceive the external environment with low accuracy, which needs to be further improved.
  • Various aspects of the present application provide an autonomous mobile device, which is intended to improve the device's ability to perceive the external environment and the accuracy of that perception.
  • An embodiment of the present application provides an autonomous mobile device, including: a device body and a first area array laser sensor installed on the device body; the first area array laser sensor is installed obliquely on the device body in the direction of its vertical field of view.
  • An embodiment of the present application also provides an autonomous mobile device, including: a device body and a first area array laser sensor and a second area array laser sensor installed on the device body; the first area array laser sensor is installed on the front side of the device body, and the second area array laser sensor is installed on a side surface of the device body.
  • In the embodiments of the present application, an oblique installation method for the area array laser sensor is proposed: the area array laser sensor is installed obliquely, in the direction of its vertical field of view, on the device body of the autonomous mobile device. This reduces the observation range devoted to information-poor areas and increases the observation range devoted to information-rich areas, which helps collect richer external environment information within the vertical field of view, improves the quality of the collected information, and thereby improves both the ability and the accuracy of the autonomous mobile device's perception of the external environment.
  • FIG. 1 is a schematic diagram of the hardware structure of an autonomous mobile device provided by an exemplary embodiment of this application;
  • FIG. 2a is a comparison diagram of the relationship between the angular bisector of the vertical field of view and the horizontal line for an area array laser sensor in the inclined installation mode versus the horizontal installation mode, provided by an exemplary embodiment of this application;
  • FIG. 2b is a comparison diagram of the relationship between the angular bisector of the vertical field of view and the horizontal line for an area array laser sensor in the inclined installation mode versus the horizontal installation mode, according to an exemplary embodiment of this application;
  • FIG. 3 is a schematic diagram of the hardware structure of another autonomous mobile device provided by an exemplary embodiment of this application.
  • FIG. 4a is a schematic diagram of the hardware structure of yet another autonomous mobile device provided by an exemplary embodiment of this application;
  • FIG. 4b is a schematic diagram of the installation positions of two area array laser sensors on an autonomous mobile device according to an exemplary embodiment of this application;
  • FIG. 4c is a schematic diagram of the vertical fields of view of two area array laser sensors intersecting at the farthest visual-distance end according to an exemplary embodiment of this application;
  • FIG. 4d is a schematic diagram of two area array laser sensors with parallel vertical field-of-view boundaries provided by an exemplary embodiment of this application;
  • FIG. 4e is a schematic diagram of the vertical field-of-view boundaries of two area array laser sensors intersecting at a specified position according to an exemplary embodiment of this application.
  • In the embodiments of the present application, an oblique installation method for the area array laser sensor is proposed: the sensor is installed obliquely on the device body of the autonomous mobile device in the direction of its vertical field of view, which reduces the observation range of information-poor areas and increases that of information-rich areas. This helps collect richer external environment information within the vertical field of view and improves the quality of the collected information, thereby improving the ability and accuracy of the autonomous mobile device's perception of the external environment.
  • FIG. 1 is a schematic diagram of the hardware structure of an autonomous mobile device provided by an exemplary embodiment of this application.
  • the autonomous mobile device 100 includes a device body 101 on which one or more processors 102 and one or more memories 103 are provided.
  • the one or more memories 103 are mainly used to store computer programs, which can be executed by one or more processors 102 to cause the one or more processors 102 to control the autonomous mobile device 100 to perform corresponding tasks.
  • the one or more memories 103 may also be configured to store various other data to support operations on the autonomous mobile device 100. Examples of these data include instructions for any application or method to operate on the autonomous mobile device 100, map data of the environment/scenario where the autonomous mobile device 100 is located, working mode, working parameters, and so on.
  • The one or more memories 103 can be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
  • the one or more processors 102 can be regarded as a control system of the autonomous mobile device 100, and can be used to execute one or more computer programs stored in the memory 103 to control the autonomous mobile device 100 to perform corresponding tasks.
  • The device body 101 is also provided with some basic components of the autonomous mobile device 100, such as a power supply component 104, a drive component 105, etc.
  • the driving assembly 105 may include a driving wheel, a driving motor, a universal wheel, and the like.
  • In this embodiment, the implementation form of the autonomous mobile device 100 is not limited.
  • The autonomous mobile device 100 can be any mechanical device that can move with a high degree of autonomy in its environment; for example, it can be an unmanned vehicle, a drone, a robot, an air purifier, etc.
  • the autonomous mobile device 100 may be various types of robots such as a cleaning robot and other service robots.
  • Cleaning robots refer to robots that can autonomously perform cleaning tasks in their working environment, including floor-sweeping robots, window-cleaning robots, etc.
  • Other service robots refer to robots that can move autonomously in their operating environment and provide non-cleaning services, including home escort robots, welcome robots, or warehouse handling robots.
  • the shape of the autonomous mobile device 100 will be different depending on the implementation form of the autonomous mobile device 100. This embodiment does not limit the implementation form of the autonomous mobile device 100.
  • the outer contour shape of the autonomous mobile device 100 may be an irregular shape or some regular shapes.
  • the outer contour of the cleaning robot may be a regular shape such as a circle, an ellipse, a square, or a triangle.
  • Shapes other than the regular shapes are called irregular shapes.
  • the outer contours of humanoid robots, unmanned vehicles, and drones are irregular shapes.
  • the autonomous mobile device 100 of this embodiment further includes: a first area array laser sensor 106.
  • the first area array laser sensor 106 is installed on the device body 101 and mainly collects environmental information in the external environment where the autonomous mobile device 100 is located, and transmits the collected external environmental information to one or more processors 102.
  • One or more processors 102 can perceive the external environment according to the external environment information collected by the first area array laser sensor 106, and then perform various controls on the autonomous mobile device 100 so that the autonomous mobile device 100 can perform corresponding tasks or realize corresponding functions.
  • The first area array laser sensor 106 is an area array laser sensor; for ease of description and distinction, the term "area array laser sensor" is prefixed with "first". The "first" here indicates neither quantity nor priority. The following briefly introduces the structure, working principle, and advantages of the area array laser sensor:
  • the area array laser sensor mainly includes a laser emitting array and an information acquisition module.
  • the information acquisition module can collect environmental images and can also receive the reflected light from the laser hitting the object.
  • the information collection module may include components such as a camera.
  • The working principle of the area array laser sensor is as follows: the laser emitting array emits light through the optical imaging system in front of it; after the emitted light reaches the surface of an object, part of it is reflected back and is formed into image pixels by the optical imaging system in front of the information acquisition module.
  • By calculating the time of flight (TOF) of the reflected light, each pixel can obtain independent distance information, and the detection range can reach more than 100 meters.
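The per-pixel TOF ranging described above can be sketched numerically. The following is an illustrative example, not part of the application; it assumes only the standard relation distance = speed of light × round-trip time / 2.

```python
# Illustrative sketch (not from the patent): per-pixel distance from
# time of flight, d = c * t / 2, where t is the round-trip time.
C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to the reflecting surface for one pixel."""
    return C * round_trip_seconds / 2.0

# A round trip of about 667 ns corresponds to roughly 100 m,
# the order of the detection range mentioned above.
print(round(tof_distance(667e-9), 2))
```

Each pixel of the sensor applies this same relation independently, which is why every pixel yields its own distance value.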
  • the information collection module of the area array laser sensor can also collect images of the surrounding environment to achieve fast 3D imaging with a resolution of megapixels, and the imaging frequency is above 30 frames per second.
  • the environmental information collected by the area array laser sensor not only includes direction and distance information, but also adds reflectivity information on the surface of the object, supplemented by deep learning technology in a three-dimensional scene, to realize the cognitive ability of environmental elements.
  • The data composed of reflectivity information can be regarded as a kind of texture information, from which environmental features with matching and recognition value can be obtained, giving the sensor a strong environment recognition capability to a certain extent.
  • The area array laser sensor combines the advantages of the line laser sensor and the visual sensor, which not only helps improve the autonomous mobile device's spatial understanding of the environment, but also helps it achieve better obstacle recognition performance.
  • The area array laser sensor can provide more accurate distance and direction information, which can reduce the complexity of perception calculations and improve real-time performance.
  • In addition to the above advantages, the area array laser sensor also has obvious advantages in the following aspects: 1) it has the advantages of being solid-state, low-cost, and miniaturized; 2) it does not require rotating parts when installed and used, which can greatly reduce the structure and size of the sensor, increase its service life, and reduce cost; 3) its viewing angle can be adjusted and adapted to different autonomous mobile devices, which helps increase scanning speed and accuracy; 4) it can collect environmental information in both the horizontal and vertical directions at the same time and can build a 3D map, which helps improve the accuracy of map-based positioning and navigation planning.
  • Based on the environmental perception provided by the area array laser sensor, autonomous mobile devices can be controlled to implement various functions. For example, visual algorithms can realize object recognition, tracking, and classification; in addition, based on the high precision of laser ranging, real-time, robust, high-precision positioning and map construction can be realized, and the constructed high-precision environmental map can then comprehensively support motion planning, route navigation, positioning, and so on.
  • the area array laser sensor has a certain field of view in both the horizontal and vertical directions, referred to as horizontal field of view and vertical field of view for short.
  • The horizontal field of view refers to the effective angular range within which the area array laser sensor can collect information in the horizontal direction, and the vertical field of view refers to the effective angular range within which it can collect information in the vertical direction.
  • The horizontal field of view determines the information collection range of the area array laser sensor in the horizontal direction, and the vertical field of view determines its information collection range in the vertical direction. Together, the horizontal and vertical fields of view form the effective spatial range within which the area array laser sensor can collect information.
  • For example, some area array laser sensors have a horizontal field of view of 120 degrees and a vertical field of view of 10 degrees; some have a horizontal field of view of 90 degrees and a vertical field of view of 9 degrees; some have a horizontal field of view of 90 degrees and a vertical field of view of 10 degrees; some have a horizontal field of view of 85 degrees and a vertical field of view of 65 degrees; some have a horizontal field of view of 80 degrees and a vertical field of view of 70 degrees; and so on. Of course, the horizontal and vertical fields of view of an area array laser sensor can also be customized according to application requirements. Whether or not they are customized, the sensor is applicable to the embodiments of the present application.
  • the horizontal field of view angle of the first area array laser sensor 106 needs to meet the viewing angle requirements of the autonomous mobile device 100 in the horizontal direction during normal operation.
  • Similarly, the vertical field of view of the first area array laser sensor 106 also needs to meet the viewing angle requirement of the autonomous mobile device 100 in the vertical direction during normal operation. If there is one first area array laser sensor 106, its horizontal field of view should be greater than or equal to the minimum viewing angle required by the autonomous mobile device 100 in the horizontal direction during normal operation, and its vertical field of view should be greater than or equal to the minimum viewing angle required in the vertical direction.
  • If there are multiple first area array laser sensors 106, their horizontal fields of view should together cover the viewing angle required by the autonomous mobile device 100 in the horizontal direction during normal operation, and their vertical fields of view should cover the viewing angle required by the autonomous mobile device 100 in the vertical direction during normal operation.
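The combined-coverage requirement above amounts to an interval-union check over the sensors' angular ranges. The sketch below is illustrative only; the function name, mounting yaw angles, and sensor values are hypothetical and not taken from the application.

```python
# Illustrative sketch (not from the patent): check whether the horizontal
# fields of view of several sensors, each mounted at a given yaw angle in
# the device frame, together cover the viewing angle the device requires.
# All angles are in degrees; the sensor configurations are hypothetical.

def covers(sensors, required_half_angle):
    """sensors: list of (mount_yaw_deg, horizontal_fov_deg).
    Returns True if [-required_half_angle, +required_half_angle]
    is fully covered by the union of the sensor intervals."""
    intervals = sorted(
        (yaw - fov / 2.0, yaw + fov / 2.0) for yaw, fov in sensors
    )
    reach = -required_half_angle
    for lo, hi in intervals:
        if lo > reach:  # a gap opens before this interval starts
            return False
        reach = max(reach, hi)
    return reach >= required_half_angle

# One 120-degree sensor facing forward covers a 120-degree requirement:
print(covers([(0.0, 120.0)], 60.0))                  # True
# Two 90-degree sensors at +/-40 degrees cover 160 degrees combined:
print(covers([(-40.0, 90.0), (40.0, 90.0)], 80.0))   # True
# ...but leave gaps at the edges of a 180-degree requirement:
print(covers([(-40.0, 90.0), (40.0, 90.0)], 90.0))   # False
```

The same check applies unchanged to vertical fields of view by substituting pitch angles for yaw angles.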
  • The installation method of the first area array laser sensor 106 on the device body 101 will affect the quality of the collected external environment information.
  • an inclined installation method of the area array laser sensor is proposed.
  • the oblique installation method can also be called the pitch installation method or the non-horizontal installation method. It is an installation method in which the area array laser sensor is installed on the device body of the autonomous mobile device obliquely in the vertical field of view.
  • the horizontal installation method corresponding to the inclined installation method refers to the installation method in which the area array laser sensor is horizontally installed on the device body of the autonomous mobile device in the vertical field of view.
  • In the horizontal installation mode, the angle between the angular bisector of the vertical field of view of the area array laser sensor and the horizontal line is 0; that is to say, the installation elevation angle of the sensor in the vertical direction is 0.
  • In the oblique installation mode, the angle between the angular bisector of the vertical field of view of the area array laser sensor and the horizontal line is not 0; that is to say, the installation elevation angle of the sensor in the vertical direction is not 0, i.e., the sensor looks up or looks down.
  • FIG. 2a takes a sweeping robot as an example and shows a comparison of the relationship between the angular bisector of the vertical field of view and the horizontal line for the area array laser sensor in the inclined installation mode versus the horizontal installation mode; FIG. 2b shows the same comparison taking a welcome robot as an example.
  • In FIG. 2a and FIG. 2b, the three dashed lines respectively represent the two boundaries and the angular bisector of the vertical field of view of the area array laser sensor in the horizontal installation mode; the middle dashed line represents the angular bisector in the horizontal installation mode and also represents the horizontal line. The three solid lines without arrows represent the two boundaries and the angular bisector of the vertical field of view of the sensor in the inclined installation mode; the middle solid line without an arrow represents the angular bisector in the inclined installation mode.
  • In the horizontal installation mode, the angle between the angular bisector of the vertical field of view of the area array laser sensor (the middle dashed line in FIG. 2a) and the horizontal line is 0; in other words, the angular bisector of the vertical field of view is parallel to the horizontal line. In FIG. 2a, the case where the angular bisector coincides with the horizontal line is shown as an example.
  • In the inclined installation mode, the angle a between the angular bisector of the vertical field of view of the area array laser sensor (the middle solid line without an arrow in FIG. 2a) and the horizontal line (the middle dashed line in FIG. 2a) is not 0; in other words, the angular bisector of the vertical field of view is not parallel to the horizontal line.
  • In FIG. 2a, in the horizontal installation mode, the information collection range of the area array laser sensor above the horizontal line is the same as its information collection range below the horizontal line; in the inclined installation mode, the information collection range above the horizontal line differs from the range below it.
  • In FIG. 2a, the case where the vertical field of view of the area array laser sensor is raised is taken as an example; that is, the information collection range above the horizontal line is larger than the range below it, but this is not limiting.
  • The information collection range of the area array laser sensor above the horizontal line in the inclined installation mode may also be smaller than its range below the horizontal line; that is, the vertical field of view of the sensor in the inclined installation mode can also be lowered.
  • In FIG. 2a, angle b1 represents the information collection range of the area array laser sensor above the horizontal line in the inclined installation mode, and angle b2 represents its information collection range below the horizontal line.
  • Angle b1 is the angle between the upper boundary of the vertical field of view of the sensor in the oblique installation mode and the horizontal line; angle b2 is the angle between the lower boundary of the vertical field of view and the horizontal line.
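Because the two boundaries of the vertical field of view lie half the field angle above and below the angular bisector, angles b1 and b2 follow directly from the field angle and the tilt angle a. The sketch below is an illustrative example under that assumption, not part of the application; the function name and sample values are hypothetical.

```python
# Illustrative sketch (not from the patent): with the angular bisector of
# the vertical field of view V raised by tilt angle a above the horizontal
# line, the boundaries sit V/2 above and below the bisector, so
# b1 = V/2 + a and b2 = V/2 - a. Angles are in degrees; the relation as
# written assumes |tilt| <= V/2, so both boundaries straddle the horizontal.

def boundary_angles(vertical_fov_deg: float, tilt_deg: float):
    """Return (b1, b2): angles of the upper/lower field-of-view boundaries
    relative to the horizontal line. Positive tilt raises the bisector
    (looking up); negative tilt lowers it (looking down)."""
    half = vertical_fov_deg / 2.0
    return half + tilt_deg, half - tilt_deg

# A sensor with a 10-degree vertical field of view tilted up by 3 degrees:
print(boundary_angles(10.0, 3.0))   # (8.0, 2.0)
# Horizontal installation (tilt 0): the range splits evenly:
print(boundary_angles(10.0, 0.0))   # (5.0, 5.0)
```

This makes concrete why raising the bisector (b1 > b2) shifts observation toward higher, information-rich regions at the expense of the region near the ground.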
  • As shown in FIG. 2a, compared with the horizontal installation mode, the area array laser sensor in the inclined installation mode collects relatively less environmental information at lower positions; for example, information on the part of the stool legs near the ground is missing. However, it can collect more environmental information at higher positions, such as the stool-seat information in the figure. Compared with the information on the part of the stool legs near the ground, the stool-seat information is richer, which is more helpful for the autonomous mobile device 100 to perceive obstacles in the external environment.
  • Likewise, as shown in FIG. 2b, compared with the horizontal installation mode, the area array laser sensor in the inclined installation mode can collect more information on the travel path of the autonomous mobile device, which is more valuable than information above the height of the device. It can be seen that, based on the oblique installation method, the vertical field of view of the area array laser sensor can cover a more reasonable observation range, which helps collect richer external environment information within the vertical field of view.
  • the first area array laser sensor 106 is installed obliquely on the device body 101 of the autonomous mobile device 100 in the direction of the vertical field of view, and the angle between the angular bisector of the vertical field of view and the horizontal line is not 0 (that is, they are not parallel). This reduces the observation range of the first area array laser sensor 106 over information-poor areas and increases its observation range over information-rich areas, which is conducive to collecting richer external environment information in the direction of the vertical field of view, improves the quality of the collected external environment information, and thereby improves the ability and accuracy of the autonomous mobile device 100 in perceiving the external environment.
  • the inclined installation direction and the inclined installation angle of the first area array laser sensor 106 are not limited, and can be flexibly set according to factors such as application scenarios and the implementation form of the autonomous mobile device 100.
  • An example is given below in conjunction with a scenario-based embodiment:
  • the autonomous mobile device is a sweeping robot
  • the sweeping robot is equipped with an area array laser sensor
  • the area array laser sensor can collect environmental information in its horizontal field of view and vertical field of view.
  • the sweeping robot can identify the area to be cleaned based on the environmental information collected by the area array laser sensor.
  • the sweeping robot can construct an environmental map based on the environmental information collected by the area array laser sensor.
  • the sweeping robot can be repositioned according to the environmental information collected by the area array laser sensor.
  • the area array laser sensor is installed horizontally on the sweeping robot in the direction of the vertical field of view. Due to the low height of the sweeping robot, a large part of the area array laser sensor’s vertical field of view falls on the ground.
  • the observation range of the vertical field of view of the area array laser sensor in the horizontal installation mode is shown by the upper and lower dashed lines in Figure 2a. This leads to two problems: on the one hand, the ground information is relatively uniform and contains little effective environmental information, which wastes part of the observable range; on the other hand, the incidence angle on the ground is large, so the ground information is relatively inaccurate and more affected by ground reflections or obstacles on the ground.
  • in this embodiment, the area array laser sensor is installed obliquely on the sweeping robot in the vertical field of view direction; specifically, it is installed on the sweeping robot tilted upward in the vertical field of view direction.
  • in this case, the observation range of the area array laser sensor in the vertical field of view is shown by the three solid lines without arrows in Figure 2a. This reduces the observable range of the information-poor area (the ground) and increases the observable range of the information-rich area (furniture or obstacles on the ground), which is conducive to collecting richer external environment information within the vertical field of view. It also reduces the proportion of large-incidence-angle data in the environmental information and improves the quality of the collected external environment information, which further improves the robot's ability to perceive the external environment and the accuracy of that perception.
  • the cleaning robot can accurately and quickly move to the area to be cleaned based on its accurate perception of the external environment, which is beneficial to improve the cleaning efficiency.
  • the sweeping robot can more accurately identify obstacles and furniture in the environment, and then construct a high-precision three-dimensional (3D) environment map.
  • the angle between the angular bisector of the vertical field of view of the area array laser sensor and the horizontal line not only needs to be greater than 0 but also needs to be less than 1/2 of the vertical field of view; that is, the lower boundary of the vertical field of view cannot be higher than the horizontal line.
  • preferably, the angle between the angular bisector of the vertical field of view of the area array laser sensor and the horizontal line is greater than or equal to 1/5 of the vertical field of view and less than or equal to 1/3 of the vertical field of view. This allows environmental information on the ground that is useful for the sweeping robot to perceive the external environment to be collected more accurately, largely avoids collecting large amounts of large-incidence-angle data, reduces the proportion of such data, and improves the quality of the collected external environment information.
  • for example, the angle between the angular bisector of the vertical field of view of the area array laser sensor and the horizontal line is 4/15 of the vertical field of view, but it is not limited to this.
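  • The constraints just described can be sketched as simple checks (hypothetical helper names; the 30° field of view in the usage note is only an illustrative value):

```python
def tilt_within_basic_limit(fov_v_deg: float, tilt_deg: float) -> bool:
    """Basic constraint: the angle between the FOV bisector and the horizontal
    must be greater than 0 and less than half the vertical FOV, so that the
    lower FOV boundary stays below the horizontal line."""
    return 0 < tilt_deg < fov_v_deg / 2.0

def tilt_within_preferred_band(fov_v_deg: float, tilt_deg: float) -> bool:
    """Preferred band: between 1/5 and 1/3 of the vertical FOV, inclusive."""
    return fov_v_deg / 5.0 <= tilt_deg <= fov_v_deg / 3.0
```

  • For a hypothetical 30° vertical field of view, the example value of 4/15 of the FOV is 8°, which satisfies both checks (6° ≤ 8° ≤ 10°).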
  • the sweeping robot in the scenario-based embodiment 1 represents a type of autonomous mobile device with a relatively low height.
  • autonomous mobile devices with lower heights can also be window cleaning robots.
  • the device body of this autonomous mobile device is equipped with a first area array laser sensor; the first area array laser sensor is installed on the device body with an upward tilt in the vertical field of view. Further optionally, the angle between the angular bisector of the vertical field of view of the first area array laser sensor and the horizontal line is greater than 0 and less than 1/2 of the vertical field of view.
  • the angle between the bisector of the vertical field of view of the first area array laser sensor and the horizontal line is greater than or equal to 1/5 of the vertical field of view and less than or equal to 1/3 of the vertical field of view.
  • for example, the angle between the angular bisector of the vertical field of view of the first area array laser sensor and the horizontal line is 4/15 of the vertical field of view.
  • the autonomous mobile device is a welcome robot
  • an area array laser sensor is installed on the welcome robot
  • the area array laser sensor can collect environmental information in its horizontal field of view and vertical field of view.
  • the welcome robot can identify users or customers who need to be received based on the environmental information collected by the area array laser sensor.
  • the area array laser sensor is horizontally installed on the chest of the welcome robot in the direction of the vertical field of view. Due to the tall height of the welcome robot, a large part of the vertical field of view of the area array laser sensor falls in the high-altitude area above the head of the welcome robot. The observation range of the vertical field of view of the area array laser sensor in the horizontal installation mode is shown by the upper and lower dashed lines in Figure 2b.
  • in this embodiment, the area array laser sensor is installed obliquely on the welcome robot in the vertical field of view direction; specifically, it is installed on the welcome robot tilted downward in the vertical field of view direction.
  • in this case, the observation range of the area array laser sensor in the vertical field of view is shown by the three solid lines without arrows in Figure 2b. This reduces the observable range of the information-poor area (the high-altitude area) and increases the observable range of the information-rich area (the area diagonally below the welcome robot), allowing the sensor to observe obstacles in the area diagonally below the welcome robot (such as the stool in Figure 2b). This is conducive to collecting richer external environment information within the vertical field of view and improves the quality of the collected information, thereby improving the welcome robot's ability to perceive the external environment and the accuracy of that perception.
  • the angle between the angular bisector of the vertical field of view of the area array laser sensor and the horizontal line not only needs to be greater than 0 but also needs to be less than 1/2 of the vertical field of view; that is, the upper boundary of the vertical field of view cannot be lower than the horizontal line.
  • the welcome robot in the scenario-based embodiment 2 represents a type of autonomous mobile device with a certain height (for example, relatively high).
  • autonomous mobile devices with higher heights can also be air purifiers, warehouse handling robots, or home escort robots.
  • a first area array laser sensor is installed on the device body of this autonomous mobile device; the first area array laser sensor is installed on the device body obliquely downward in the direction of the vertical field of view. Further optionally, the angle a between the angular bisector of the vertical field of view of the first area array laser sensor and the horizontal line is greater than 0 and less than 1/2 of the vertical field of view.
  • the angle a between the angular bisector of the vertical field of view of the first area array laser sensor and the horizontal line is greater than or equal to 1/5 of the vertical field of view and less than or equal to 1/3 of the vertical field of view.
  • the angle a between the angular bisector of the vertical field of view of the first area array laser sensor and the horizontal line is 4/15 of the vertical field of view.
  • in this embodiment, the number of first area array laser sensors 106 is not limited; there may be one or multiple, where "multiple" means two or more. In practical applications, some simple application requirements can be met with a single area array laser sensor; for an autonomous mobile device 100 working in such an environment, one first area array laser sensor 106 can be installed. There are also complex application requirements that need multiple area array laser sensors to solve the environmental perception problem; for an autonomous mobile device 100 working in such an environment, multiple first area array laser sensors 106 may be provided.
  • the first area array laser sensor 106 can be installed on the front side of the device body 101, which is the direction the device body 101 faces when the autonomous mobile device 100 moves forward.
  • the autonomous mobile device may move forward or backward (referred to as backing for short) during the movement.
  • "Forward” here can be understood as: the direction of movement of the autonomous mobile device frequently or in most cases during operation.
  • the first area array laser sensor 106 is arranged on the front side of the device body 101 of the autonomous mobile device 100, which makes it more convenient and accurate to collect environmental information ahead of the autonomous mobile device 100 during its movement, enabling more accurate obstacle avoidance and smoother travel.
  • the first area array laser sensor 106 can also be installed on the back or side of the device body 101, which can be flexibly determined according to application requirements.
  • the rear side is relative to the front side
  • the side surface is a location area on the device body 101 between the front side and the rear side.
  • the autonomous mobile device 100 generally has a certain height. This raises the question of where, along the height direction of the device body 101, the first area array laser sensor 106 should be installed (that is, the installation height).
  • in this embodiment, the installation height of the first area array laser sensor 106 on the device body 101 of the autonomous mobile device 100 is not limited; it can be flexibly selected according to application requirements and the height of the autonomous mobile device 100. For example, the first area array laser sensor 106 can be installed at the top, bottom, or middle of the front side of the device body 101.
  • the multiple first area array laser sensors 106 may be installed at different positions of the device body 101.
  • multiple first area array laser sensors 106 can be installed around the device body 101.
  • the use of multiple first area array laser sensors 106 is beneficial to increase the observation range of the autonomous mobile device 100 in the horizontal direction.
  • the installation positions of the multiple first area array laser sensors 106 on the device body 101 are different.
  • for multiple first area array laser sensors 106, where they are installed in the height direction of the device body 101 (that is, their installation height) is also a question. This embodiment does not limit this; the installation heights of the multiple first area array laser sensors 106 on the device body 101 can be flexibly selected according to application requirements and the height of the autonomous mobile device 100.
  • if some of the first area array laser sensors 106 have the same installation height on the device body 101, it can be ensured that richer environmental information is collected at that height.
  • the installation heights of the first area array laser sensors 106 on the device body 101 are all different, so that environmental information at different heights can be collected, and the richness of environmental information can be improved.
  • the installation heights of the first area array laser sensors 106 on the device body 101 are all the same, which can ensure that more abundant environmental information is collected at this height position.
  • the multiple first area array laser sensors 106 can meet the continuity requirement of the horizontal field of view.
  • the continuity requirement of the horizontal field of view angle may be that the horizontal field of view angles of the plurality of first area array laser sensors 106 are continuous, substantially continuous, or reach a set continuity.
  • for the definition of continuity, please refer to the subsequent embodiments; it will not be repeated here. Based on this, the installation positions of the multiple first area array laser sensors 106 on the device body 101 need to meet the continuity requirement of the horizontal field of view.
  • one of the first area array laser sensors 106 is installed on the front side of the device body 101, and the other first area array laser sensors 106 are sequentially installed at other positions of the device body 101 according to the continuity requirement of the horizontal field of view.
  • the horizontal field of view of two adjacent first area array laser sensors 106 may meet but not limited to any of the following requirements:
  • the horizontal field of view boundaries of two adjacent first area array laser sensors 106 are parallel; or
  • the boundaries of the horizontal fields of view of two adjacent first area array laser sensors 106 intersect at a specified position, where the specified position is determined by the interference between the horizontal fields of view of the two adjacent first area array laser sensors 106.
  • in this way, the environmental information collected by two adjacent first area array laser sensors 106 within their respective horizontal fields of view does not overlap and will not interfere with each other; at the same time, the environmental information collected by the two adjacent sensors has continuity, which offers certain advantages for splicing and extracting environmental features.
  • for example, suppose there are two first area array laser sensors 106. According to the range of the horizontal fields of view of the two sensors, one can be installed on the front side of the device body 101 and the other on the side of the device body 101, which not only meets the requirement of a continuous horizontal field of view but also avoids mutual interference between the two sensors.
  • the lower boundary of the vertical field of view of the first area array laser sensor 106 generally intersects the bearing surface where the autonomous mobile device is located.
  • the position where the lower boundary of the vertical field of view of the first area array laser sensor 106 intersects with the bearing surface where the autonomous mobile device is located is recorded as the first intersection position.
  • the first distance threshold can be preset, and the distance from the first intersection position to the autonomous mobile device is required to be greater than the set first distance threshold, as shown by the distance L1 in Figure 2a and Figure 2b.
  • the specific value of the first distance threshold is not limited, and can be flexibly set according to factors such as application requirements, the height of the autonomous mobile device, and the vertical field of view of the first area array laser sensor.
  • the range of the first distance threshold may be 60-100 cm, such as 60 cm, 70 cm, 80 cm, 90 cm, etc., but it is not limited thereto.
  • among the tilt angle of the first area array laser sensor in the vertical field of view direction, the installation height of the first area array laser sensor on the autonomous mobile device, and the first distance threshold, once any two of these three pieces of information are determined, the third can be calculated. For example, the installation height of the first area array laser sensor on the autonomous mobile device can be calculated from its tilt angle in the vertical field of view direction and the first distance threshold; or the tilt angle of the first area array laser sensor in the vertical field of view direction can be calculated from its installation height on the autonomous mobile device and the first distance threshold.
  • the bearing surface where the autonomous mobile device is located may have different implementations.
  • the embodiment of the present application does not limit the specific implementation of the bearing surface, and any plane or non-planar surface that can bear autonomous mobile devices can be used as the bearing surface in the embodiment of the present application.
  • the ground is the bearing surface where the autonomous mobile device is located.
  • the desktop is the bearing surface where the autonomous mobile device is located.
  • the floor of the carriage is the bearing surface where the autonomous mobile device is located.
  • the roof is the bearing surface where the autonomous mobile device is located.
  • the description about the bearing surface here is also applicable to other embodiments of the present application.
  • the autonomous mobile device 100 further includes a second area array laser sensor 107 installed on the device body 101.
  • the second area array laser sensor 107 belongs to an area array laser sensor.
  • the structure, working principle, and advantages of the area array laser sensor can be referred to the foregoing embodiments, which will not be repeated here.
  • the difference between the second area array laser sensor 107 and the first area array laser sensor 106 is that the second area array laser sensor 107 is horizontally mounted on the device body 101 in the vertical field of view direction, that is to say The angle between the angular bisector of the vertical field of view of the second area array laser sensor 107 and the horizontal line is 0, or in other words, the angular bisector of the vertical field of view of the second area array laser sensor 107 is parallel to the horizontal line.
  • the number of the second area array laser sensor 107 is not limited, and it may be one or multiple.
  • the installation position and installation height of the first area array laser sensor 106 and the second area array laser sensor 107 on the device body 101 are not limited.
  • the installation heights of the first area array laser sensor 106 and the second area array laser sensor 107 on the device body 101 are the same.
  • alternatively, all the first area array laser sensors 106 have the same installation height on the device body 101, and all the second area array laser sensors 107 have the same installation height on the device body 101, but the installation height of the first area array laser sensors 106 on the device body 101 is different from that of the second area array laser sensors 107.
  • the first area array laser sensor 106 and the second area array laser sensor 107 may be evenly distributed on the device body 101.
  • the first area array laser sensor 106 and the second area array laser sensor 107 may be installed on the device body 101 at intervals.
  • the first area array laser sensor 106 is installed on the front side of the device body 101
  • the second area array laser sensor 107 is installed on the rear side of the device body 101.
  • the first area array laser sensor 106 is installed on the front side of the device body 101
  • the second area array laser sensor 107 is installed on the side surface of the device body 101.
  • the second area array laser sensor 107 is installed on the front side of the device body 101
  • the first area array laser sensor 106 is installed on the side surface of the device body 101.
  • in this case, the autonomous mobile device 100 includes a plurality of area array laser sensors; some of them are installed on the device body 101 in an inclined manner, and some are installed on the device body 101 in a horizontal manner.
  • the area array laser sensors installed on the device body 101 in the oblique installation mode are called first area array laser sensors, and the area array laser sensors installed on the device body 101 in the horizontal installation mode are called second area array laser sensors. The two are combined to collect environmental information in the vertical direction more comprehensively and further improve the environmental perception ability of the autonomous mobile device 100.
  • the lower boundary of the vertical field of view of the second area array laser sensor 107 generally intersects the bearing surface where the autonomous mobile device is located.
  • the position where the lower boundary of the vertical field of view of the second area array laser sensor 107 intersects with the bearing surface where the autonomous mobile device is located is recorded as the second intersection position.
  • the second distance threshold may be preset, and the distance from the second intersection location to the autonomous mobile device is required to be greater than the set second distance threshold.
  • the specific value of the second distance threshold is not limited, and can be flexibly set according to factors such as application requirements, the height of the autonomous mobile device, and the size of the vertical field of view of the second area array laser sensor.
  • the range of the second distance threshold may be 50-90 cm, such as 50 cm, 60 cm, 70 cm, 80 cm, etc., but is not limited thereto. It should be noted that the values of the second distance threshold and the first distance threshold may be the same or different.
  • if the first area array laser sensor is installed on the device body tilted upward in the vertical field of view direction, the first distance threshold is greater than the second distance threshold; if the first area array laser sensor is installed on the device body tilted downward in the vertical field of view direction, the first distance threshold is smaller than the second distance threshold.
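  • The ordering of the two thresholds follows from the same geometry: at the same installation height, an upward-tilted sensor's lower FOV boundary meets the bearing surface farther away than a horizontal sensor's, and a downward-tilted sensor's meets it closer. A minimal sketch under that assumption (helper name and numeric values are hypothetical):

```python
import math

def ground_intersection(height_m: float, fov_v_deg: float, tilt_deg: float) -> float:
    """Distance to where the lower FOV boundary meets the bearing surface.
    tilt_deg > 0 tilts the bisector upward, tilt_deg < 0 downward, 0 is horizontal."""
    dep = math.radians(fov_v_deg / 2.0 - tilt_deg)
    return height_m / math.tan(dep)

# Same installation height and vertical FOV; only the tilt differs.
h, fov = 0.3, 30.0
tilted_up   = ground_intersection(h, fov, +8.0)   # first sensor, upward tilt
horizontal  = ground_intersection(h, fov,  0.0)   # second (horizontal) sensor
tilted_down = ground_intersection(h, fov, -8.0)   # first sensor, downward tilt
```

  • Here tilted_up > horizontal > tilted_down, matching the stated relations between the first and second distance thresholds in the upward-tilt and downward-tilt cases.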
  • when the second distance threshold is determined, the requirement that the distance from the second intersection position to the autonomous mobile device be greater than the second distance threshold can, to a certain extent, determine the installation height of the second area array laser sensor on the device body.
  • Fig. 4a is a schematic diagram of the hardware structure of yet another autonomous mobile device provided by an exemplary embodiment of this application.
  • the autonomous mobile device 400 includes a device body 401 on which one or more processors 402 and one or more memories 403 are provided.
  • the one or more memories 403 are mainly used to store computer programs, which can be executed by one or more processors 402 to cause the one or more processors 402 to control the autonomous mobile device 400 to perform corresponding tasks.
  • the one or more memories 403 may also be configured to store various other data to support operations on the autonomous mobile device 400. Examples of such data include instructions for any application or method operating on the autonomous mobile device 400, map data of the environment/scenario where the autonomous mobile device 400 is located, working modes, working parameters, and so on.
  • the one or more processors 402 can be regarded as a control system of the autonomous mobile device 400, and can be used to execute computer instructions stored in one or more memories 403 to control the autonomous mobile device 400 to perform corresponding tasks.
  • the device body 401 is also provided with some basic components of the autonomous mobile device 400, such as a power supply component 404, a driving component 405, and the like.
  • the driving assembly 405 may include a driving wheel, a driving motor, a universal wheel, and the like.
  • the implementation form of the autonomous mobile device 400 is not limited; it can be any mechanical device that can move with a high degree of autonomy in its environment, such as an unmanned vehicle, a drone, a robot, or an air purifier.
  • the autonomous mobile device 400 may be various types of robots such as a cleaning robot and other service robots.
  • Sweeping robots refer to robots that can perform cleaning tasks autonomously in their working environment, including sweeping robots, window cleaning robots, etc.
  • Other service robots refer to robots that can move autonomously in their operating environment and provide non-cleaning services, including home escort robots, welcome robots, or warehouse handling robots.
  • the shape of the autonomous mobile device 400 will be different depending on the implementation form of the autonomous mobile device 400. This embodiment does not limit the implementation form of the autonomous mobile device 400.
  • the outer contour shape of the autonomous mobile device 400 may be an irregular shape or some regular shapes.
  • the outer contour shape of the autonomous mobile device 400 may be a regular shape such as a circle, an oval, a square, a triangle, a drop shape, or a D shape; shapes other than these regular shapes are called irregular shapes.
  • the outer contours of humanoid robots, unmanned vehicles, and drones are irregular shapes.
  • the autonomous mobile device 400 further includes a first area array laser sensor 406 and a second area array laser sensor 407.
  • the first area array laser sensor 406 and the second area array laser sensor 407 are installed on the device body 401 and mainly collect environmental information in the external environment where the autonomous mobile device 400 is located, transmitting the collected external environment information to the one or more processors 402; the one or more processors 402 can perceive the external environment according to the external environment information collected by the first area array laser sensor 406 and then perform various controls on the autonomous mobile device 400, so that the autonomous mobile device 400 executes corresponding tasks or realizes corresponding functions.
  • both the first area array laser sensor 406 and the second area array laser sensor 407 are area array laser sensors.
  • to distinguish them, "area array laser sensor" is prefixed with "first" and "second"; the "first" and "second" here do not denote quantity or order.
  • in this embodiment, two area array laser sensors are used, which makes up for the relatively limited observation data of a single area array laser sensor.
  • considering that the horizontal fields of view of the first area array laser sensor 406 and the second area array laser sensor 407 each cover a certain range, the two sensors are installed so that their horizontal fields of view are as close as possible in the horizontal direction, allowing the horizontal fields of view of the two area array laser sensors to meet the continuity requirement.
  • the continuity requirement of the horizontal field of view angle means that the horizontal observation range covered by the horizontal field of view angle of the two area array laser sensors is continuous or as continuous as possible, or reaches a certain degree of continuity.
  • the definition of the degree of continuity can be: in the case that the horizontal observation ranges covered by the horizontal fields of view of the two area array laser sensors are not continuous, the ratio of the area of the uncovered region between the two horizontal observation ranges to the sum of the areas of the two horizontal observation ranges. Reaching a certain degree of continuity requires that the degree of continuity be greater than a set continuity threshold.
  • the definition of continuity is not limited to this method.
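  • One possible reading of this measure treats the horizontal observation ranges as angular intervals and takes the degree of continuity as one minus the gap ratio (the text leaves the exact sign convention open, and the helper name is hypothetical):

```python
def continuity_degree(range_a, range_b):
    """Degree of continuity of two horizontal observation ranges.

    Each range is an angular interval (start_deg, end_deg). The uncovered gap
    between the ranges, divided by the sum of their widths, measures the
    discontinuity; continuity is taken here as 1 minus that ratio.
    """
    (a0, a1), (b0, b1) = sorted([tuple(sorted(range_a)), tuple(sorted(range_b))])
    gap = max(0.0, b0 - a1)           # uncovered span between the two ranges
    total = (a1 - a0) + (b1 - b0)     # sum of the two covered spans
    return 1.0 - gap / total
```

  • Adjoining or overlapping ranges give a degree of 1.0, while a 10° gap between two 90° ranges gives 1 − 10/180 ≈ 0.944, which would then be compared against the set continuity threshold.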
  • the first area array laser sensor 406 is installed on the front side of the device body 401
  • the second area array laser sensor 407 is installed on the side surface of the device body 401.
  • alternatively, the first area array laser sensor 406 and the second area array laser sensor 407 are installed symmetrically on the front and rear sides or the left and right sides of the device body 401.
  • in the front-plus-side installation, the positions of the two area array laser sensors are relatively close, which to a certain extent ensures that the horizontal observation ranges covered by their horizontal fields of view meet the continuity requirement.
  • the front side and the side surface please refer to the foregoing embodiment, which will not be repeated here.
  • the side surface of the device body 401 is actually an area range between the front side and the back side of the device body 401.
  • the specific position of the second area array laser sensor 407 within the side area is not limited, as long as the two horizontal fields of view meet the continuity requirement.
  • the installation positions of the first area array laser sensor 406 and the second area array laser sensor 407 may be in a right-angle relationship, but it is not limited to this.
  • this front-and-side installation mode can not only collect critical environmental information in front of the autonomous mobile device 400, providing an information advantage for its environment perception, path planning, and obstacle avoidance, but also reduce the repetition of the environmental information collected by the two area array laser sensors, improving the richness of the environmental information and solving the problem of insufficient information richness of a single area array laser sensor. Furthermore, the first area array laser sensor 406 installed on the front side of the device body 401 can also serve as a front buffer for the autonomous mobile device 400, reducing dependence on front buffer devices such as infrared buffers.
  • If the autonomous mobile device 400 does not support an edge mode, the second area array laser sensor 407 can be installed on either the left side or the right side of the device body 401. If the autonomous mobile device 400 supports an edge mode, the supported edge mode can be taken into account to decide whether the second area array laser sensor 407 is installed on the left or the right side of the device body 401.
  • If the autonomous mobile device 400 supports the right edge mode, the second area array laser sensor 407 is installed on the left side of the device body 401; if the autonomous mobile device 400 supports the left edge mode, the second area array laser sensor 407 is installed on the right side of the device body 401.
  • In choosing between the left and right sides, the side opposite to the edge mode supported by the autonomous mobile device 400 is selected in view of that mode. In this way, when the autonomous mobile device 400 works in its supported edge mode, the second area array laser sensor 407 is blocked over a relatively small angle, which helps provide the autonomous mobile device 400 with as much environmental information as possible.
  • the edge mode refers to a mode in which the autonomous mobile device 400 continues to perform tasks along the edge of the fixed object when it encounters a fixed object, such as a wall, a cabinet, or a wardrobe.
  • The left edge mode refers to a mode in which the autonomous mobile device 400 keeps its left side along the edge of a fixed object while continuing to perform tasks.
  • The right edge mode refers to a mode in which the autonomous mobile device 400 keeps its right side along the edge of a fixed object while continuing to perform tasks.
  • In terms of installation position, the first area array laser sensor 406 and the second area array laser sensor 407 can be placed as close together as possible according to the ranges of their horizontal fields of view, so that the continuity requirement between the horizontal field angles is satisfied.
  • However, if the installation positions of the first area array laser sensor 406 and the second area array laser sensor 407 are too close, their horizontal field angles overlap each other; if the overlap of the two horizontal field angles exceeds a certain degree, the two area array laser sensors will interfere with each other. Therefore, while keeping the installation positions of the first area array laser sensor 406 and the second area array laser sensor 407 as close together as possible, it is also necessary to avoid mutual interference between them.
  • Based on these considerations, any of the following manners can be used to determine the installation positions of the first area array laser sensor 406 and the second area array laser sensor 407 on the device body 401. In other words, after installation, the horizontal fields of view of the two sensors satisfy the requirement of one of the following manners, which both guarantees continuity between the horizontal field angles and avoids the mutual interference caused by excessive overlap.
  • Manner 1: The horizontal field angles of the first area array laser sensor 406 and the second area array laser sensor 407 intersect at the farthest visual-range end, as shown in FIG. 4c. In this way, the environmental information collected by the two sensors within their respective horizontal field angles does not overlap, so they do not interfere with each other; and because the farthest visual-range ends of the two horizontal field angles intersect, the environmental information collected by the two sensors is continuous, which offers certain advantages for splicing and extracting environmental features.
  • Manner 2: The boundaries of the horizontal fields of view of the first area array laser sensor 406 and the second area array laser sensor 407 are parallel, as shown in FIG. 4d. In this way, the environmental information collected by the two sensors within their respective horizontal field angles does not overlap, so they do not interfere with each other; and because the boundaries of the two horizontal field angles are parallel, the environmental information collected by the two sensors is continuous in the horizontal direction, which offers certain advantages for splicing and extracting environmental features.
  • Manner 3: The boundaries of the horizontal fields of view of the first area array laser sensor 406 and the second area array laser sensor 407 intersect at a designated position, as shown in FIG. 4e. The designated position is determined by the shortest observation distance that the two sensors need to meet. To distinguish it from the farthest visual-range end in Manner 1, the designated position in Manner 3 may be any position on the boundary line of the horizontal field of view other than the shortest-visual-range end. That is, as long as the horizontal field angles of the two area array laser sensors do not overlap within the shortest observation distance they need to meet, the requirement that the two sensors must not interfere with each other is satisfied; at the same time, because the boundaries of the two horizontal field angles intersect at the designated position, the environmental information collected by the two sensors is continuous in the horizontal direction, which offers certain advantages for splicing and extracting environmental features.
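The geometric condition behind the three manners can be sketched numerically: treat the two adjacent field-of-view boundaries as 2D rays and check where (or whether) they meet. The sensor positions and boundary angles below are illustrative assumptions, not values from the patent.

```python
import math

def ray_intersection(p1, d1, p2, d2):
    """Intersect two 2D rays p + t*d (t >= 0); return the point or None."""
    det = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(det) < 1e-12:
        return None  # boundaries parallel (Manner 2) or collinear
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (dx * d2[1] - dy * d2[0]) / det
    t2 = (dx * d1[1] - dy * d1[0]) / det
    if t1 < 0 or t2 < 0:
        return None  # rays diverge: no crossing in front of the sensors
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])

# Hypothetical layout: front sensor at (0.1, 0) facing +x, side sensor at
# (0, -0.1) facing -y; the adjacent boundaries are the front sensor's right
# FOV edge (-60 degrees) and the side sensor's front-facing edge (-30 degrees).
front_pos, front_edge = (0.1, 0.0), (math.cos(math.radians(-60)), math.sin(math.radians(-60)))
side_pos, side_edge = (0.0, -0.1), (math.cos(math.radians(-30)), math.sin(math.radians(-30)))
hit = ray_intersection(front_pos, front_edge, side_pos, side_edge)
if hit is None:
    print("boundaries parallel or diverging: no overlap (Manner 2)")
else:
    d = math.hypot(hit[0] - front_pos[0], hit[1] - front_pos[1])
    # Manner 1/3: compare d against the maximum visual range or the
    # required shortest observation distance to judge interference.
    print(f"boundaries meet {d:.2f} m from the front sensor")
```

If the crossing distance is at or beyond the sensors' maximum visual range, the FOVs touch only at the farthest end (Manner 1); if it is beyond the required shortest observation distance, Manner 3's no-interference condition holds.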
  • the installation manner of the first area array laser sensor 406 and the second area array laser sensor 407 in the direction of the vertical field of view is not limited.
  • the first area array laser sensor 406 and/or the second area array laser sensor 407 may be installed on the device body obliquely in the direction of the vertical field of view.
  • If the first area array laser sensor 406 is installed obliquely in the vertical field-of-view direction, it is installed on the device body tilted either upward or downward in that direction, depending on application requirements.
  • If the second area array laser sensor 407 is installed obliquely in the vertical field-of-view direction, it is likewise installed on the device body tilted either upward or downward in that direction, depending on application requirements.
  • If both area array laser sensors are installed obliquely in the vertical field-of-view direction, then both may be tilted upward, both may be tilted downward, or one may be tilted upward while the other is tilted downward on the main body of the device.
  • the lower boundary of the vertical field of view of the first area array laser sensor 406 and the second area array laser sensor 407 generally intersects the bearing surface where the autonomous mobile device is located.
  • the position where the lower boundary of the vertical field of view of the first area array laser sensor 406 intersects with the bearing surface of the autonomous mobile device is recorded as the first intersection position; the vertical view of the second area array laser sensor 407 The position where the lower boundary of the field angle intersects with the bearing surface where the autonomous mobile device is located is recorded as the second intersection position.
  • the first distance threshold can be preset, and the distance from the first intersection location to the autonomous mobile device is required to be greater than the set first distance threshold; in the same way, the second distance threshold is preset, And it is required that the distance from the second intersection location to the autonomous mobile device is greater than the set second distance threshold.
  • the specific values of the first distance threshold and the second distance threshold are not limited.
  • The first distance threshold can be set flexibly according to factors such as application requirements, the height of the autonomous mobile device, and the size of the vertical field of view of the first area array laser sensor; similarly, the second distance threshold can be set flexibly according to factors such as application requirements, the height of the autonomous mobile device, and the size of the vertical field of view of the second area array laser sensor.
  • The second distance threshold and the first distance threshold may be the same or different. If the first area array laser sensor and the second area array laser sensor are area array laser sensors of the same type, and both are installed horizontally in the vertical field-of-view direction, then the first distance threshold and the second distance threshold may take the same value. If the two sensors are of different types, the two thresholds may take different values. Of course, even when the two sensors are of the same type, if they are installed in different ways in the vertical field-of-view direction, the first distance threshold and the second distance threshold may also take different values.
  • In addition to the area array laser sensors, one or more non-area-array sensors may also be installed on the device body to assist the area array laser sensors in collecting richer environmental information.
  • The non-area-array sensors may include one or any combination of ultrasonic sensors, infrared sensors, vision sensors, single-line laser sensors, and multi-line laser sensors. Combining the environmental information collected by the various sensors can further improve the accuracy and precision of environment perception, which helps further improve the accuracy of function control.
  • The embodiments of the present invention may be provided as methods, systems, or computer program products. Therefore, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) containing computer-usable program code.
  • These computer program instructions can also be stored in a computer-readable memory that can direct a computer or other programmable data processing equipment to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device, the instruction device implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
  • These computer program instructions can also be loaded onto a computer or other programmable data processing equipment, so that a series of operational steps are executed on the computer or other programmable equipment to produce computer-implemented processing; the instructions executed on the computer or other programmable equipment thus provide steps for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
  • In a typical configuration, the computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
  • The memory may include non-persistent memory in computer-readable media, random access memory (RAM) and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
  • Computer-readable media include permanent and non-permanent, removable and non-removable media; information storage can be realized by any method or technology.
  • The information can be computer-readable instructions, data structures, program modules, or other data.
  • Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission media that can be used to store information accessible by computing devices. According to the definition herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.


Abstract

An autonomous mobile device (100), in which an area array laser sensor (106) is mounted on the device body (101) of the autonomous mobile device (100) tilted in the vertical field-of-view direction. This oblique mounting scheme for the area array laser sensor (106) reduces the observation range spent on information-poor regions and enlarges the observation range covering information-rich regions, which helps collect richer external environment information within the vertical field of view and improves the autonomous mobile device's (100) ability to perceive the external environment and the accuracy of that perception.

Description

Autonomous mobile device
Cross-reference
This application claims the benefit of Chinese patent application No. 2019107358570, entitled "Autonomous mobile device" and filed on August 9, 2019, which is incorporated herein by reference in its entirety.
Technical field
This application relates to the technical field of artificial intelligence, and in particular to an autonomous mobile device.
Background
With the development of artificial intelligence technology, research on robots and other autonomous mobile devices has deepened step by step. An autonomous mobile device's ability to perceive the external environment and interact with it is the basis for the device to move autonomously and perform tasks.
Most existing autonomous mobile devices collect external environment information through sensors and perceive the external environment from that information. However, the accuracy with which existing autonomous mobile devices perceive the external environment is relatively low and needs further improvement.
Summary
Various aspects of this application provide an autonomous mobile device, so as to improve the device's ability to perceive the external environment and the accuracy of that perception.
An embodiment of this application provides an autonomous mobile device, comprising: a device body and a first area array laser sensor mounted on the device body; the first area array laser sensor is mounted on the device body obliquely in the direction of the vertical field of view.
An embodiment of this application further provides an autonomous mobile device, comprising: a device body, and a first area array laser sensor and a second area array laser sensor mounted on the device body; the first area array laser sensor is mounted on the front side of the device body, and the second area array laser sensor is mounted on a side surface of the device body.
In the embodiments of this application, an oblique mounting scheme for the area array laser sensor is proposed: the area array laser sensor is mounted on the device body of the autonomous mobile device obliquely in the direction of the vertical field of view. This reduces the observation range spent on information-poor regions and enlarges the observation range covering information-rich regions, which helps collect richer external environment information within the vertical field of view, improves the quality of the collected information, and thereby improves the autonomous mobile device's ability to perceive the external environment and the accuracy of that perception.
Brief description of the drawings
The drawings described here are provided for further understanding of this application and form a part of this application. The exemplary embodiments of this application and their description are used to explain this application and do not constitute an undue limitation on it. In the drawings:
Fig. 1 is a schematic diagram of the hardware structure of an autonomous mobile device according to an exemplary embodiment of this application;
Fig. 2a is a schematic comparison of the relationship between the bisector of an area array laser sensor's vertical field of view and the horizontal line under the oblique mounting scheme and under the horizontal mounting scheme, according to an exemplary embodiment of this application;
Fig. 2b is another schematic comparison of the relationship between the bisector of an area array laser sensor's vertical field of view and the horizontal line under the oblique mounting scheme and under the horizontal mounting scheme, according to an exemplary embodiment of this application;
Fig. 3 is a schematic diagram of the hardware structure of another autonomous mobile device according to an exemplary embodiment of this application;
Fig. 4a is a schematic diagram of the hardware structure of yet another autonomous mobile device according to an exemplary embodiment of this application;
Fig. 4b is a schematic diagram of the installation positions of two area array laser sensors on an autonomous mobile device according to an exemplary embodiment of this application;
Fig. 4c is a schematic diagram of the horizontal fields of view of two area array laser sensors intersecting at the farthest visual-range end, according to an exemplary embodiment of this application;
Fig. 4d is a schematic diagram of the boundaries of the horizontal fields of view of two area array laser sensors being parallel, according to an exemplary embodiment of this application;
Fig. 4e is a schematic diagram of the boundaries of the horizontal fields of view of two area array laser sensors intersecting at a designated position, according to an exemplary embodiment of this application.
Detailed description
To make the objectives, technical solutions and advantages of this application clearer, the technical solutions of this application are described clearly and completely below with reference to specific embodiments and the corresponding drawings. Obviously, the described embodiments are only some of the embodiments of this application, not all of them. All other embodiments obtained by those of ordinary skill in the art based on the embodiments in this application without creative effort fall within the protection scope of this application.
To address technical problems such as the poor ability and low accuracy of existing autonomous mobile devices in perceiving the external environment, some embodiments of this application propose an oblique mounting scheme for the area array laser sensor: the area array laser sensor is mounted on the device body of the autonomous mobile device obliquely in the direction of the vertical field of view. This reduces the observation range spent on information-poor regions and enlarges the observation range covering information-rich regions, which helps collect richer external environment information within the vertical field of view, improves the quality of the collected information, and thereby improves the autonomous mobile device's ability to perceive the external environment and the accuracy of that perception.
The technical solutions provided by the embodiments of this application are described in detail below with reference to the drawings.
Fig. 1 is a schematic diagram of the hardware structure of an autonomous mobile device according to an exemplary embodiment of this application. As shown in Fig. 1, the autonomous mobile device 100 includes a device body 101 on which one or more processors 102 and one or more memories 103 are arranged.
The one or more memories 103 are mainly used to store a computer program that can be executed by the one or more processors 102, causing the one or more processors 102 to control the autonomous mobile device 100 to perform corresponding tasks. In addition to the computer program, the one or more memories 103 may also be configured to store various other data to support operations on the autonomous mobile device 100. Examples of such data include instructions of any application or method operating on the autonomous mobile device 100, map data of the environment/scene in which the device is located, working modes, working parameters, and so on.
The one or more memories 103 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk or optical disk.
The one or more processors 102, which can be regarded as the control system of the autonomous mobile device 100, can execute the computer program stored in the one or more memories 103 to control the autonomous mobile device 100 to perform corresponding tasks.
Further, in addition to the one or more processors 102 and the one or more memories 103, some basic components of the autonomous mobile device 100 are arranged or mounted on the device body 101, for example a power supply component 104, a drive component 105, and so on. Optionally, the drive component 105 may include drive wheels, drive motors, universal wheels, etc.
It should be noted that the basic components included in different autonomous mobile devices 100, and the composition of those components, will differ; the embodiments of this application are only some examples. In addition, the environments in which different autonomous mobile devices 100 operate, the tasks they need to perform, and the functions they can implement will also differ.
In this embodiment, the autonomous mobile device 100 is not limited; it may be any mechanical device capable of moving through space with a high degree of autonomy in its environment, for example an unmanned vehicle, an unmanned aerial vehicle, a robot or an air purifier. The autonomous mobile device 100 may be any of various robots, such as a cleaning robot or another service robot. A cleaning robot is a robot that can autonomously perform cleaning tasks in its working environment, including sweeping robots, window-cleaning robots, and the like. Other service robots are robots that can move autonomously in their working environment and provide non-cleaning services, including home companion robots, greeting robots, warehouse carrier robots, and the like.
Of course, the shape of the autonomous mobile device 100 will differ depending on its implementation form, which this embodiment does not limit. Taking the outer contour as an example, it may be irregular or regular. For example, for a sweeping robot, the outer contour may be a regular shape such as a circle, an ellipse, a square or a triangle. Shapes other than regular shapes are called irregular shapes; for example, the outer contours of a humanoid robot, an unmanned vehicle and an unmanned aerial vehicle are irregular.
Whatever its form, in order to move autonomously in its environment, the autonomous mobile device 100 needs to perceive the external environment. To this end, the autonomous mobile device 100 of this embodiment further includes a first area array laser sensor 106. The first area array laser sensor 106 is mounted on the device body 101 and mainly collects environmental information from the external environment of the autonomous mobile device 100 and transmits the collected information to the one or more processors 102. The one or more processors 102 can perceive the external environment from the information collected by the first area array laser sensor 106 and then exercise various controls over the autonomous mobile device 100 so that it performs corresponding tasks or implements corresponding functions.
The first area array laser sensor 106 is an area array laser sensor; for ease of distinction and description, "first" is prefixed to "area array laser sensor", where "first" denotes neither quantity nor order. The structure, working principle and advantages of area array laser sensors are briefly introduced below:
An area array laser sensor mainly includes a laser emission array and an information collection module. The information collection module can capture environment images and can also receive the reflected light returned when the laser strikes an object. The information collection module may include components such as a camera.
The working principle of an area array laser sensor is as follows: the laser emission array emits light outward through the optical imaging system in front of it; after the emitted light reaches an object surface, part of it is reflected back and forms pixels of an image through the optical imaging system in front of the information collection module. Because different points on the object surface lie at different distances, the time of flight (TOF) of the reflected light differs; by measuring the time of flight, each pixel obtains independent distance information, and the detection range can exceed one hundred meters. In addition, the information collection module of an area array laser sensor can also capture images of the surrounding environment, achieving fast 3D imaging at megapixel-level resolution and imaging frame rates above 30 frames per second.
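The per-pixel ranging step of the time-of-flight principle described above can be sketched in a few lines: the distance is half the measured round-trip flight time multiplied by the speed of light (the timing value below is illustrative, not from the patent).

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_s: float) -> float:
    """Per-pixel distance from the measured round-trip time of flight."""
    return C * round_trip_s / 2.0

# A return pulse arriving about 66.7 ns after emission corresponds to ~10 m.
print(tof_distance(66.7e-9))
```

In a real sensor this computation is applied independently to every pixel of the emission array, which is how each pixel obtains its own distance value.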
The environmental information collected by an area array laser sensor contains not only direction and distance information but also the reflectivity of object surfaces; supplemented by deep learning techniques for three-dimensional scenes, this enables recognition of environmental elements. When the laser lines are numerous and dense, the data formed by the reflectivity information can be treated as a kind of texture from which environmental features of matching and recognition value can be obtained, giving strong environment-recognition capability and, to a certain extent, the benefits of vision algorithms and texture information. It can thus be seen that the area array laser sensor combines the advantages of line laser sensors and vision sensors well: it not only improves an autonomous mobile device's spatial understanding of the environment but also brings a qualitative leap in obstacle-recognition performance, potentially bringing its spatial understanding of the environment to the level of the human eye. In addition, compared with perception schemes based on image sensors, an area array laser sensor provides more accurate distance and direction information, reducing the complexity of perception computation and improving real-time performance.
Of course, besides the advantages above, area array laser sensors also have clear advantages in the following respects: 1) they are solid-state, low-cost and compact; 2) they require no rotating parts when installed and used, which greatly reduces the structure and size of the sensor, extends its service life and lowers cost; 3) their viewing angle is adjustable and can be adapted to different autonomous mobile devices, which helps increase scanning speed and accuracy; 4) they can collect environmental information in the horizontal and vertical directions simultaneously, enabling 3D maps to be built, which helps improve the accuracy of map-based localization, navigation planning and other functions.
It is worth noting that, based on the environmental information collected by the area array laser sensor in the three dimensions of direction, distance and reflectivity, the autonomous mobile device can be controlled to implement various functions based on environment perception. For example, object recognition, tracking and classification from vision algorithms can be implemented; in addition, thanks to the high precision of laser ranging, localization and map building with strong real-time performance, strong robustness and high precision can be implemented, and the resulting high-precision environment map can in turn provide all-round support for motion planning, path navigation, localization and so on.
An area array laser sensor has a certain field of view in both the horizontal and the vertical direction, referred to as the horizontal field angle and the vertical field angle for short. The horizontal field angle is the effective range over which the sensor can collect information in the horizontal direction, and the vertical field angle is the effective range in the vertical direction. For a given area array laser sensor, its horizontal field angle determines its information-collection range in the horizontal direction and its vertical field angle determines its range in the vertical direction; together they form the effective spatial range over which the sensor can collect information. The horizontal and vertical field angles differ between sensors. For example, some area array laser sensors have a horizontal field angle of 120 degrees and a vertical field angle of 10 degrees, while others have 90 and 9 degrees; as further examples, some have 90 and 10 degrees, some 85 and 65 degrees, some 80 and 70 degrees, and so on. It should be noted that the sizes of the horizontal and vertical field angles can also be custom-made according to application requirements; the embodiments of this application apply regardless of whether the field angles are customized.
To meet the environment-perception needs of the autonomous mobile device 100, the horizontal field angle of the first area array laser sensor 106 needs to satisfy the horizontal viewing-angle requirement of the device's normal operation, and of course its vertical field angle needs to satisfy the vertical viewing-angle requirement as well. If there is one first area array laser sensor 106, its horizontal field angle should be greater than or equal to the minimum horizontal viewing angle required for normal operation, and its vertical field angle should be greater than or equal to the minimum vertical viewing angle required. If there are multiple first area array laser sensors 106, their combined horizontal field angles should cover the horizontal viewing angle required for normal operation, and their combined vertical field angles should cover the vertical viewing angle required.
Given that the horizontal and vertical field angles of the first area array laser sensor 106 satisfy the respective viewing-angle requirements, the manner in which the sensor is mounted on the device body 101 affects the quality of the external environment information it collects.
In this embodiment, to improve the quality of the external environment information collected by the first area array laser sensor 106, an oblique mounting scheme for the area array laser sensor is proposed. Oblique mounting, which may also be called pitched or non-horizontal mounting, means the area array laser sensor is mounted on the device body of the autonomous mobile device tilted in the direction of the vertical field of view. The corresponding horizontal mounting scheme means the sensor is mounted on the device body horizontally in the direction of the vertical field of view. Under horizontal mounting, the angle between the bisector of the sensor's vertical field angle and the horizontal line is 0; that is, the sensor's mounting pitch angle in the vertical direction is 0. Under oblique mounting, the angle between the bisector of the vertical field angle and the horizontal line is not 0; that is, the mounting pitch angle in the vertical direction is not 0, pitched either upward or downward.
To ease understanding of the oblique mounting scheme proposed in the embodiments of this application, the oblique and horizontal mounting schemes are compared with reference to Figs. 2a and 2b. Fig. 2a takes a sweeping robot as an example and Fig. 2b takes a greeting robot as an example; each compares the relationship between the bisector of the area array laser sensor's vertical field angle and the horizontal line under the oblique and horizontal mounting schemes. In Fig. 2a, the three dashed lines represent the two boundaries and the bisector of the vertical field angle under horizontal mounting; the middle dashed line represents both the bisector under horizontal mounting and the horizontal line. The three solid lines without arrows represent the two boundaries and the bisector of the vertical field angle under oblique mounting, the middle arrowless solid line being the bisector. As shown in Fig. 2a, under horizontal mounting the angle between the bisector of the vertical field angle (the middle dashed line in Fig. 2a) and the horizontal line (the same middle dashed line) is 0; in other words, the bisector is parallel to the horizontal line, and Fig. 2a illustrates the case where they coincide. Also as shown in Fig. 2a, under oblique mounting the angle a between the bisector of the vertical field angle (the middle arrowless solid line in Fig. 2a) and the horizontal line (the middle dashed line in Fig. 2a) is not 0; in other words, the bisector is not parallel to the horizontal line.
Further, in Fig. 2a, under horizontal mounting the sensor's information-collection range above the horizontal line equals its range below the horizontal line, whereas under oblique mounting the two ranges differ. Fig. 2a illustrates the case where, under oblique mounting, the vertical field angle is pitched upward, i.e. the collection range above the horizontal line is larger than the range below it, but this is not limiting. For example, in Fig. 2b, under oblique mounting the collection range above the horizontal line is smaller than the range below it; that is, under oblique mounting the vertical field angle can also be pitched downward. In Figs. 2a and 2b, angle b1 denotes the collection range above the horizontal line under oblique mounting and angle b2 the range below it; b1 is the angle between the upper boundary of the vertical field angle and the horizontal line under oblique mounting, and b2 is the angle between the lower boundary and the horizontal line under oblique mounting.
With reference to Fig. 2a, compare the external environment information that falls within the sensor's vertical field angle under horizontal mounting with that under oblique mounting. It can be concluded that, compared with horizontal mounting, the sensor under oblique mounting collects relatively less environmental information at lower positions, for example missing part of the stool-leg information near the floor in the figure, but can collect more information at higher positions, for example adding the seat information of the stool in the figure. Compared with the partial leg information near the floor, the seat information is richer and more helpful for the autonomous mobile device 100 to perceive obstacles in the external environment. Similarly, in Fig. 2b, compared with horizontal mounting, the sensor under oblique mounting can collect more information along the device's travel path, which is more valuable than information from high positions above the device. It follows that oblique mounting lets the sensor's vertical field angle cover a more reasonable observation range, which helps collect richer external environment information within the vertical field angle.
In this embodiment, the first area array laser sensor 106 is mounted on the device body 101 of the autonomous mobile device 100 obliquely in the direction of the vertical field of view; the angle between the bisector of its vertical field angle and the horizontal line is not 0 (i.e. they are not parallel). This reduces the sensor's observation range over information-poor regions and enlarges it over information-rich regions, which helps collect richer external environment information in the vertical field-of-view direction, improves the quality of the collected information, and thereby improves the autonomous mobile device 100's ability to perceive the external environment and the accuracy of that perception.
The embodiments of this application do not limit the oblique mounting direction or angle of the first area array laser sensor 106; they can be set flexibly according to factors such as the application scenario and the implementation form of the autonomous mobile device 100. Examples with concrete scenarios follow:
Scenario embodiment 1:
In scenario embodiment 1, the autonomous mobile device is a sweeping robot on which an area array laser sensor is mounted; the sensor can collect environmental information within its horizontal and vertical field angles. While the sweeping robot performs a cleaning task, the area to be cleaned can be identified from the information collected by the area array laser sensor. Optionally, path planning can also be performed from the collected information, the robot moving along the planned path to the area to be cleaned while recognizing and avoiding obstacles along the way based on the collected information. Alternatively, the sweeping robot can build an environment map from the information collected by the area array laser sensor, or relocalize itself from it.
Suppose the area array laser sensor were mounted on the sweeping robot horizontally in the vertical field-of-view direction. Because the sweeping robot is low, a large part of the vertical field angle would fall on the floor; the observation range of the vertical field angle under horizontal mounting is shown by the upper and lower dashed lines in Fig. 2a. This causes two problems: on one hand, floor information is rather monotonous and contains little useful environmental information, wasting part of the observable range; on the other hand, the angle of incidence on the floor is large, so information from the floor is relatively inaccurate and is strongly affected by floor reflections and by obstacles on the floor.
To solve these problems, in scenario embodiment 1 the area array laser sensor is mounted on the sweeping robot obliquely in the vertical field-of-view direction, specifically tilted upward. The observation range within the vertical field angle then looks like the three arrowless solid lines in Fig. 2a. This reduces the observable range over the information-poor region (the floor) so as to enlarge the observable range over the information-rich region (furniture and obstacles on the floor), helping collect richer external environment information within the vertical field angle; it also reduces the proportion of large-incidence-angle data in the environmental information, improving the quality of the collected information and thereby the sweeping robot's ability and accuracy in perceiving the external environment.
Further, while performing a cleaning task, the sweeping robot can move to the area to be cleaned accurately and quickly based on its accurate perception of the external environment, which helps improve cleaning efficiency. When building an environment map, accurate perception of the external environment lets the sweeping robot identify obstacles, furniture and the like more accurately, and thus build a high-precision three-dimensional (3D) environment map.
Further, not all floor information is useless; part of it is useful for the sweeping robot to perceive the external environment and can better assist the robot in navigation, path planning and the like. This requires collecting environmental information in the horizontal-forward and slightly-below-horizontal directions, which in turn requires those directions to fall within the vertical field angle of the area array laser sensor. Accordingly, the angle between the bisector of the vertical field angle and the horizontal line must not only be greater than 0 but also be less than 1/2 of the vertical field angle; that is, the lower boundary of the vertical field angle must not rise above the horizontal line.
Further, the angle between the bisector of the vertical field angle and the horizontal line may be greater than or equal to 1/5 of the vertical field angle and less than or equal to 1/3 of it. This both collects, more accurately, the floor information that is useful for the sweeping robot to perceive the external environment and largely avoids collecting large amounts of large-incidence-angle data, reducing its proportion and improving the quality of the collected external environment information.
Still further, the angle between the bisector of the vertical field angle and the horizontal line may be 4/15 of the vertical field angle, but is not limited thereto.
It should be noted that the sweeping robot in scenario embodiment 1 represents a class of low-height autonomous mobile devices; besides sweeping robots, low-height devices also include window-cleaning robots and the like. A first area array laser sensor is mounted on the device body of such a device, tilted upward in the vertical field-of-view direction. Further optionally, the angle between the bisector of its vertical field angle and the horizontal line is greater than 0 and less than 1/2 of the vertical field angle; still further, greater than or equal to 1/5 and less than or equal to 1/3 of the vertical field angle; for example, 4/15 of the vertical field angle.
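The fractional bounds stated above can be turned into concrete angles for any given vertical field angle; the 30-degree field angle below is an illustrative value, not one the patent fixes.

```python
def tilt_angle_bounds(vfov_deg: float):
    """Tilt of the vertical-FOV bisector relative to the horizontal line, per
    the text: it must stay below VFOV/2 (so the lower boundary does not rise
    above the horizontal), with a preferred band of [VFOV/5, VFOV/3] and an
    example value of 4*VFOV/15."""
    return vfov_deg / 5, vfov_deg / 3, 4 * vfov_deg / 15

lo, hi, example = tilt_angle_bounds(30.0)
print(lo, hi, example)  # 6.0 10.0 8.0 degrees
```

For a sensor with a 30-degree vertical field angle, the preferred bisector tilt would thus lie between 6 and 10 degrees, with 8 degrees as the example value.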
Scenario embodiment 2:
In scenario embodiment 2, the autonomous mobile device is a greeting robot on which an area array laser sensor is mounted; the sensor can collect environmental information within its horizontal and vertical field angles. While greeting, the robot can identify the users or customers to be received from the information collected by the area array laser sensor. Optionally, it can also plan paths from the collected information and guide users or customers along the planned path to the corresponding area or location, recognizing and avoiding obstacles along the way based on the collected information.
Suppose the area array laser sensor were mounted horizontally, in the vertical field-of-view direction, at the chest of the greeting robot. Because the greeting robot is tall, a large part of the vertical field angle would fall on high positions above the robot's head; the observation range of the vertical field angle under horizontal mounting is shown by the upper and lower dashed lines in Fig. 2b. This causes two problems: on one hand, high-altitude information is rather monotonous and contains very little useful environmental information, wasting part of the observable range; on the other hand, useful environmental information in the region diagonally below the robot (such as the stool on the robot's travel path shown in Fig. 2b) is missed because it lies outside the vertical field angle, causing information loss.
To solve these problems, in scenario embodiment 2 the area array laser sensor is mounted on the greeting robot obliquely in the vertical field-of-view direction, specifically tilted downward. Under oblique mounting, the observation range within the vertical field angle then looks like the three arrowless solid lines in Fig. 2b. This reduces the observable range over the information-poor region (the high-altitude region) so as to enlarge the observable range over the information-rich region (the region diagonally below the greeting robot), making obstacles in that region observable (such as the stool in Fig. 2b). This helps collect richer external environment information within the vertical field angle, improves the quality of the collected information, and thereby improves the greeting robot's ability and accuracy in perceiving the external environment.
Further, some environmental information in the high region not exceeding the greeting robot's height (for example chandeliers and upper door frames) is useful for the robot to perceive the external environment; for example, such high-region information can better assist the robot in navigation, obstacle avoidance and the like. This requires the greeting robot to have observation data in the slightly-above-horizontal direction, which in turn requires that direction to fall within the vertical field angle of the area array laser sensor. Accordingly, the angle between the bisector of the vertical field angle and the horizontal line must not only be greater than 0 but also be less than 1/2 of the vertical field angle; that is, the upper boundary of the vertical field angle must not fall below the horizontal line.
It should be noted that the greeting robot in scenario embodiment 2 represents a class of autonomous mobile devices of a certain (relatively large) height; besides greeting robots, such devices also include air purifiers, warehouse carrier robots, home companion robots, and the like. A first area array laser sensor is mounted on the device body of such a device, tilted downward in the vertical field-of-view direction. Further optionally, the angle a between the bisector of its vertical field angle and the horizontal line is greater than 0 and less than 1/2 of the vertical field angle; still further, greater than or equal to 1/5 and less than or equal to 1/3 of the vertical field angle; for example, 4/15 of the vertical field angle. It should be noted that, in the embodiments of this application, the angles a, b1 and b2 denote magnitudes only and carry no direction.
The embodiments of this application do not limit the number of first area array laser sensors 106; there may be one or multiple, where multiple means two or more. In practice, some simple application requirements can be met with a single area array laser sensor, so for autonomous mobile devices 100 working in such environments one first area array laser sensor 106 may be provided. There are also complex application requirements that need multiple area array laser sensors to solve the environment-perception problem, so for devices working in those environments multiple first area array laser sensors 106 may be provided.
In the case of one first area array laser sensor 106, it may be mounted on the front side of the device body 101, the front side being the side the device body 101 faces while the autonomous mobile device 100 moves forward. During movement the device may move forward or backward (retreat for short); "forward" here can be understood as the direction in which the device moves habitually, or in the great majority of cases, during operation. Placing the first area array laser sensor 106 on the front side of the device body 101 makes it more convenient and accurate to collect information about what lies ahead while the autonomous mobile device 100 moves, so that the device can avoid obstacles more accurately and travel smoothly.
Of course, the first area array laser sensor 106 may also be mounted on the rear side or a side surface of the device body 101, depending on application requirements. The rear side is defined relative to the front side, and a side surface is the region of the device body 101 between the front side and the rear side.
An autonomous mobile device 100 generally has a certain height, which raises the question of where along the height of the device body 101 the first area array laser sensor 106 is mounted (i.e. the mounting height). This embodiment does not limit the mounting height of the first area array laser sensor 106 on the device body 101; it can be chosen flexibly according to application requirements and the height of the autonomous mobile device 100. For example, the first area array laser sensor 106 may be mounted at the top, bottom or middle of the front side of the device body 101.
In the case of multiple first area array laser sensors 106, they may be mounted at different positions on the device body 101; for example, they may be mounted around the device body 101. Using multiple first area array laser sensors 106 helps enlarge the autonomous mobile device 100's observation range in the horizontal direction.
It is worth noting that, besides the multiple first area array laser sensors 106 being at different positions on the device body 101, the question of where along the height of the device body 101 they are mounted (i.e. their mounting heights) also needs to be considered. This embodiment does not limit this; the mounting heights of the multiple first area array laser sensors 106 can be chosen flexibly according to application requirements and the height of the autonomous mobile device 100.
For example, if some of the first area array laser sensors 106 are mounted at the same height on the device body 101, richer environmental information can be collected at that height.
As another example, if the first area array laser sensors 106 are all mounted at different heights on the device body 101, environmental information at different heights can be collected, improving the richness of the environmental information.
As yet another example, if the first area array laser sensors 106 are all mounted at the same height on the device body 101, richer environmental information can be collected at that height.
In the case of multiple first area array laser sensors 106, to improve the richness of the collected environmental information, the multiple sensors may satisfy a continuity requirement on their horizontal field angles. The continuity requirement may be that the horizontal field angles of the multiple first area array laser sensors 106 are continuous, essentially continuous, or reach a set continuity degree; the definition of continuity degree is given in a later embodiment and is not repeated here. Accordingly, the installation positions of the multiple first area array laser sensors 106 on the device body 101 must satisfy the continuity requirement on the horizontal field angles. For example, in an optional embodiment, one first area array laser sensor 106 is mounted on the front side of the device body 101 and the other first area array laser sensors 106 are mounted in sequence at other positions on the device body 101 in accordance with the continuity requirement.
Optionally, to satisfy the continuity requirement on the horizontal field angles, the horizontal field angles of two adjacent first area array laser sensors 106 may satisfy, but are not limited to, any of the following requirements:
the farthest visual-range ends of the horizontal field angles of the two adjacent first area array laser sensors 106 intersect; or
the boundaries of the horizontal field angles of the two adjacent first area array laser sensors 106 are parallel; or
the boundaries of the horizontal field angles of the two adjacent first area array laser sensors 106 intersect at a designated position, where the designated position is determined by the interference requirement between the horizontal field angles of the two adjacent sensors.
Under each of these requirements, the environmental information collected by two adjacent first area array laser sensors 106 within their respective horizontal field angles does not overlap, so they do not interfere with each other; moreover, the environmental information collected by the two adjacent sensors is continuous, which offers certain advantages for splicing and extracting environmental features.
In one application scenario there are two first area array laser sensors 106. Based on the ranges of their horizontal field angles, one sensor can be mounted on the front side of the device body 101 and the other on a side surface of the device body 101, which both satisfies the continuity requirement on the horizontal field angles and avoids mutual interference between the two sensors.
In some embodiments of this application, the lower boundary of the vertical field angle of the first area array laser sensor 106 generally intersects the bearing surface on which the autonomous mobile device is located. For ease of description and distinction, the position where this lower boundary intersects the bearing surface is recorded as the first intersection position. To facilitate collecting richer environmental information, a first distance threshold can be set in advance, and the distance from the first intersection position to the autonomous mobile device is required to be greater than the set first distance threshold, as shown by the distance L1 in Figs. 2a and 2b.
The embodiments of this application do not limit the specific value of the first distance threshold; it can be set flexibly according to factors such as application requirements, the height of the autonomous mobile device, and the size of the vertical field angle of the first area array laser sensor. For example, the first distance threshold may range over 60-100 cm, for example 60 cm, 70 cm, 80 cm or 90 cm, but is not limited thereto.
It should be noted that, among the three quantities of the first area array laser sensor's tilt angle in the vertical field-of-view direction, its mounting height on the autonomous mobile device, and the first distance threshold, determining any two allows the third to be calculated. For example, the mounting height of the first area array laser sensor on the autonomous mobile device can be calculated from its tilt angle in the vertical field-of-view direction and the first distance threshold; likewise, its tilt angle can be calculated from its mounting height and the first distance threshold.
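The two-of-three relation above is plain trigonometry: under an upward tilt, the lower FOV boundary dips (VFOV/2 − tilt) below the horizontal and meets the floor at distance L1. The numeric values below are illustrative assumptions, not figures from the patent.

```python
import math

def mount_height(l1_m: float, vfov_deg: float, tilt_deg: float) -> float:
    """Mounting height from the floor-intersection distance L1, assuming an
    upward tilt: the lower FOV boundary dips (VFOV/2 - tilt) below horizontal,
    so h = L1 * tan(VFOV/2 - tilt)."""
    return l1_m * math.tan(math.radians(vfov_deg / 2 - tilt_deg))

def tilt_from_height(l1_m: float, vfov_deg: float, height_m: float) -> float:
    """Inverse relation: tilt = VFOV/2 - atan(h / L1), in degrees."""
    return vfov_deg / 2 - math.degrees(math.atan2(height_m, l1_m))

# Hypothetical numbers: 0.8 m distance threshold, 30-degree VFOV, 8-degree tilt.
h = mount_height(0.8, 30.0, 8.0)
print(round(h, 3))                                # sensor height in metres
print(round(tilt_from_height(0.8, 30.0, h), 1))   # recovers 8.0
```

Each function fixes two of the three quantities and yields the third, mirroring the statement in the text; a downward-tilt case would use (VFOV/2 + tilt) instead.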
It should also be noted that, depending on the working environment of the autonomous mobile device, the bearing surface on which it is located can be implemented in different ways. The embodiments of this application do not limit the specific implementation of the bearing surface; any plane or non-planar surface capable of bearing the autonomous mobile device can serve as the bearing surface in the embodiments of this application. For example, for a device working on the floor, the floor is its bearing surface; for a device working on a desktop, the desktop is its bearing surface; for a device working inside a vehicle compartment, the floor of the compartment is its bearing surface; and for a device working on a roof, the roof is its bearing surface. This description of the bearing surface applies equally to the other embodiments of this application.
As shown in Fig. 3, in another exemplary embodiment of this application the autonomous mobile device 100 further includes a second area array laser sensor 107 mounted on the device body 101. The second area array laser sensor 107 is an area array laser sensor; for the structure, working principle and advantages of area array laser sensors, refer to the foregoing embodiments, which are not repeated here. In this embodiment, the second area array laser sensor 107 differs from the first area array laser sensor 106 in that it is mounted on the device body 101 horizontally in the vertical field-of-view direction; that is, the angle between the bisector of the vertical field angle of the second area array laser sensor 107 and the horizontal line is 0, or in other words the bisector is parallel to the horizontal line.
This embodiment does not limit the number of second area array laser sensors 107; there may be one or multiple.
This embodiment does not limit the installation positions or mounting heights of the first area array laser sensor 106 and the second area array laser sensor 107 on the device body 101. For example, the first area array laser sensor 106 and the second area array laser sensor 107 may be mounted at the same height on the device body 101. Alternatively, all first area array laser sensors 106 are mounted at the same height, and all second area array laser sensors 107 are mounted at the same height, but the height of the first sensors differs from that of the second sensors. The first and second area array laser sensors may be evenly distributed on the device body 101, or mounted on it at intervals. Alternatively, the first area array laser sensor 106 is mounted on the front side of the device body 101 and the second area array laser sensor 107 on the rear side; or the first on the front side and the second on a side surface; or the second on the front side and the first on a side surface.
In the embodiment shown in Fig. 3, the autonomous mobile device 100 includes multiple area array laser sensors, some mounted on the device body 101 obliquely and some horizontally. For ease of distinction and description, the area array laser sensors mounted obliquely on the device body 101 are called first area array laser sensors, and those mounted horizontally are called second area array laser sensors. In this embodiment, combining the horizontal mounting scheme with the oblique mounting scheme allows environmental information in the vertical direction to be collected more comprehensively, further improving the environment-perception capability of the autonomous mobile device 100.
In some embodiments, the lower boundary of the vertical field angle of the second area array laser sensor 107 generally intersects the bearing surface on which the autonomous mobile device is located. For ease of description and distinction, the position where this lower boundary intersects the bearing surface is recorded as the second intersection position. To facilitate collecting richer environmental information, a second distance threshold can be set in advance, and the distance from the second intersection position to the autonomous mobile device is required to be greater than the set second distance threshold.
The embodiments of this application do not limit the specific value of the second distance threshold; it can be set flexibly according to factors such as application requirements, the height of the autonomous mobile device, and the size of the vertical field angle of the second area array laser sensor. For example, the second distance threshold may range over 50-90 cm, for example 50 cm, 60 cm, 70 cm or 80 cm, but is not limited thereto. It should be noted that the second distance threshold and the first distance threshold may take the same value or different values. For area array laser sensors of the same type, if the first area array laser sensor is mounted on the device body tilted upward in the vertical field-of-view direction, the first distance threshold is greater than the second distance threshold; if tilted downward, the first distance threshold is less than the second distance threshold.
In this embodiment, once the second distance threshold is determined, the requirement that the distance from the second intersection position to the autonomous mobile device be greater than the second distance threshold can, to a certain extent, determine the range of mounting heights of the second area array laser sensor on the device body.
Fig. 4a is a schematic diagram of the hardware structure of yet another autonomous mobile device according to an exemplary embodiment of this application. As shown in Fig. 4a, the autonomous mobile device 400 includes a device body 401 on which one or more processors 402 and one or more memories 403 are arranged.
The one or more memories 403 are mainly used to store a computer program that can be executed by the one or more processors 402, causing the one or more processors 402 to control the autonomous mobile device 400 to perform corresponding tasks. In addition to the computer program, the one or more memories 403 may also be configured to store various other data to support operations on the autonomous mobile device 400. Examples of such data include instructions of any application or method operating on the autonomous mobile device 400, map data of the environment/scene in which the device is located, working modes, working parameters, and so on.
The one or more processors 402, which can be regarded as the control system of the autonomous mobile device 400, can execute the computer instructions stored in the one or more memories 403 to control the autonomous mobile device 400 to perform corresponding tasks.
Further, in addition to the one or more processors 402 and the one or more memories 403, some basic components of the autonomous mobile device 400 are arranged or mounted on the device body 401, for example a power supply component 404, a drive component 405, and so on. Optionally, the drive component 405 may include drive wheels, drive motors, universal wheels, etc.
It should be noted that the basic components included in different autonomous mobile devices 400, and the composition of those components, will differ; the embodiments of this application are only some examples. In addition, the environments in which different autonomous mobile devices 400 operate, the tasks they need to perform, and the functions they can implement will also differ.
In this embodiment, the autonomous mobile device 400 is not limited; it may be any mechanical device capable of moving through space with a high degree of autonomy in its environment, for example an unmanned vehicle, an unmanned aerial vehicle, a robot or an air purifier. The autonomous mobile device 400 may be any of various robots, such as a cleaning robot or another service robot. A cleaning robot is a robot that can autonomously perform cleaning tasks in its working environment, including sweeping robots, window-cleaning robots, and the like. Other service robots are robots that can move autonomously in their working environment and provide non-cleaning services, including home companion robots, greeting robots, warehouse carrier robots, and the like.
Of course, the shape of the autonomous mobile device 400 will differ depending on its implementation form, which this embodiment does not limit. Taking the outer contour as an example, it may be irregular or regular; for example, it may be a regular shape such as a circle, an ellipse, a square, a triangle, a drop shape or a D shape. Shapes other than regular shapes are called irregular shapes; for example, the outer contours of a humanoid robot, an unmanned vehicle and an unmanned aerial vehicle are irregular.
Whatever its form, in order to move autonomously in its environment, the autonomous mobile device 400 needs to perceive the external environment. In this embodiment, the autonomous mobile device 400 further includes a first area array laser sensor 406 and a second area array laser sensor 407. The first area array laser sensor 406 and the second area array laser sensor 407 are mounted on the device body 401 and mainly collect environmental information from the external environment of the autonomous mobile device 400, transmitting the collected information to the one or more processors 402. The one or more processors 402 can perceive the external environment from the collected external environment information and then exercise various controls over the autonomous mobile device 400 so that it performs corresponding tasks or implements corresponding functions.
Both the first area array laser sensor 406 and the second area array laser sensor 407 are area array laser sensors; for ease of distinction and description, "first" and "second" are prefixed to "area array laser sensor", where "first" and "second" denote neither quantity nor order. For the structure, working principle and advantages of area array laser sensors, refer to the foregoing embodiments; they are not repeated in this embodiment.
In this embodiment, using two area array laser sensors compensates for the limitation that a single area array laser sensor's observation data is rather monotonous. In addition, considering that the horizontal field angles of the first area array laser sensor 406 and the second area array laser sensor 407 have a certain range, in terms of installation position the two sensors are placed as close together as possible in the horizontal field-of-view direction, so that their horizontal field angles satisfy a continuity requirement. The continuity requirement on the horizontal field angles means that the horizontal observation ranges covered by the two sensors' horizontal field angles are continuous, or as continuous as possible, or reach a certain continuity degree. One possible definition of continuity degree is: when the horizontal observation ranges covered by the two sensors' horizontal field angles are discontinuous, the ratio of the area of the uncovered region between the two horizontal observation ranges to the sum of the areas of the two horizontal observation ranges. Reaching a certain continuity degree may mean requiring the continuity degree to exceed a set continuity-degree threshold. The definition of continuity degree is not limited to this one.
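The continuity-degree candidate definition above can be sketched directly. The areas below are illustrative, and the acceptance direction (whether the ratio should be small or should exceed a threshold) is left to the reader, since the text offers this only as one possible definition.

```python
def coverage_gap_ratio(area1_m2: float, area2_m2: float, gap_m2: float) -> float:
    """The text's candidate definition: area of the uncovered region between
    the two horizontal observation ranges, divided by the sum of the two
    ranges' areas."""
    return gap_m2 / (area1_m2 + area2_m2)

# Hypothetical numbers: two fan-shaped ranges of 12 and 10 square metres
# with a 1.1 square-metre uncovered wedge between them.
print(coverage_gap_ratio(12.0, 10.0, 1.1))  # 0.05
```

A ratio of 0 corresponds to ranges that touch or overlap with no gap; larger values indicate a wider uncovered wedge between the two sensors' coverage.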
如图4b所示,在本实施例中,第一面阵激光传感器406安装于设备本 体401的前侧,第二面阵激光传感器407安装于设备本体401的侧面,相较于第一面阵激光传感器406和第二面阵激光传感器407对称安装于设备本体401的前后两侧或左右两侧的方案,这种安装方式下两个面阵激光传感器之间的位置相对更近,一定程度上可保证两个面阵激光传感器的水平视场角所覆盖的水平观测范围满足连续性要求。关于前侧、侧面的定义可参见前述实施例,在此不再赘述。
需要说明的是,设备本体401的侧面实际上是一个区域范围,是设备本体401的前侧与后侧之间的区域范围,第二面阵激光传感器407具体安装在侧面区域的哪个位置,对此不做限定,以两个水平视场角满足连续性要求为准。可选地,如图4b所示,第一面阵激光传感器406和第二面阵激光传感器407的安装位置可以成直角关系,但不限于此。
这种前侧与侧面相结合的安装方式,既可以采集到自主移动设备400前方关键性的环境信息,对自主移动设备400的环境感知、路径规划、避障等提供信息上优势,又可以降低两个面阵激光传感器采集到的环境信息的重复度,提高环境信息的丰富度,解决单个面阵激光传感器信息丰富度不足的问题;再者,安装于设备本体401前侧的第一面阵激光传感器406,还可以为自主移动设备400起到前方缓冲的作用,减少对红外缓冲等前方缓冲装置的依赖。
其中,若自主移动设备400不支持沿边模式,则第二面阵激光传感器407可以安装于设备本体401的左侧面,也可以安装于设备本体401的右侧面。若自主移动设备400支持沿边模式,则可以结合自主移动设备400所支持的沿边模式,确定第二面阵激光传感器407是安装于设备本体401的左侧面或右侧面。
若自主移动设备400支持右沿边模式,第二面阵激光传感器407设置于设备本体401的左侧面;若自主移动设备400支持左沿边模式,第二面阵激光传感器407设置于设备本体401的右侧面。其中,在第二面阵激光传感器407在左右侧面的选择上,考虑到自主移动设备400的沿边模式, 选择与自主移动设备400支持的沿边模式相反的侧面,这样自主移动设备400工作在所支持的沿边模式时,第二面阵激光传感器407被挡住的角度相对较少,有利于为自主移动设备400提供尽可能更多的环境信息。
An edge-following mode is a mode in which the autonomous mobile device 400, upon encountering a fixed object such as a wall, cabinet, or wardrobe, continues executing its task along the edge of that fixed object. The left edge-following mode is a mode in which the autonomous mobile device 400 follows the edge of the fixed object with its left side; the right edge-following mode, with its right side.
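The opposite-side rule above can be captured in a one-line lookup. The mode labels and function name here are assumptions made for this sketch:

```python
def second_sensor_side(edge_following_mode):
    """Mounting side for the second area array laser sensor, chosen
    opposite to the supported edge-following mode so that the sensor
    is occluded over a smaller angle while the device follows an edge."""
    opposite = {"right": "left", "left": "right"}
    return opposite[edge_following_mode]
```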
In the embodiments of this application, the mounting positions of the first area array laser sensor 406 and the second area array laser sensor 407 can be brought as close together as their horizontal field-of-view ranges allow, so that the horizontal fields of view meet the continuity requirement.
However, if the mounting positions of the first area array laser sensor 406 and the second area array laser sensor 407 are too close, their horizontal fields of view overlap; and if the overlap exceeds a certain degree, the two area array laser sensors interfere with each other. Therefore, while bringing the mounting positions of the two sensors as close together as possible, mutual interference between the first area array laser sensor 406 and the second area array laser sensor 407 must also be avoided.
Based on the above considerations, during installation the mounting positions of the first area array laser sensor 406 and the second area array laser sensor 407 on the device body 401 can be determined in any of the following ways. In other words, after the first area array laser sensor 406 and the second area array laser sensor 407 have been mounted, their horizontal fields of view satisfy the requirement of any one of the following ways. These ways both ensure that the horizontal fields of view meet the continuity requirement and avoid the mutual interference between the two area array laser sensors that excessive overlap of the horizontal fields of view would cause.
Way 1: the far ends of the horizontal fields of view of the first area array laser sensor 406 and the second area array laser sensor 407 intersect, as shown in Fig. 4c. The environmental information collected by the two sensors within their respective horizontal fields of view then does not overlap, so they do not interfere with each other; moreover, since the far ends of the two horizontal fields of view intersect, the environmental information collected by the two sensors is continuous, which is advantageous for stitching and extracting environmental features.
Way 2: the boundaries of the horizontal fields of view of the first area array laser sensor 406 and the second area array laser sensor 407 are parallel, as shown in Fig. 4d. The environmental information collected by the two sensors within their respective horizontal fields of view again does not overlap, so they do not interfere with each other; and with parallel horizontal fields of view, the collected environmental information is continuous in the horizontal direction, which is advantageous for stitching and extracting environmental features.
Way 3: the boundaries of the horizontal fields of view of the first area array laser sensor 406 and the second area array laser sensor 407 intersect at a specified position, as shown in Fig. 4e. The specified position is determined by the minimum observation distance the two sensors must satisfy. To distinguish it from the far ends in Way 1, the specified position in Way 3 may be any position on the boundary lines of the horizontal fields of view other than the near end. That is, as long as the two horizontal fields of view do not overlap within the minimum observation distance the two sensors must satisfy, the requirement that the sensors not interfere with each other is met; meanwhile, the intersection of the two boundaries at the specified position gives the collected environmental information continuity in the horizontal direction, which is advantageous for stitching and extracting environmental features.
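In the horizontal plane, Ways 1-3 all reduce to where the adjacent field-of-view boundary rays of the two sensors meet. The following 2D sketch is an illustration under assumed conventions (sensor poses as points with direction angles, unit-length direction vectors so the ray parameter equals distance), not the embodiment's own formulation:

```python
import math

def boundary_meeting_distance(p1, angle1, p2, angle2):
    """Distance along sensor 1's horizontal FOV boundary ray (origin p1,
    direction angle1 in radians) to the point where it meets sensor 2's
    boundary ray. Returns math.inf for parallel boundaries (Way 2);
    a negative value means the rays diverge and never meet."""
    d1x, d1y = math.cos(angle1), math.sin(angle1)
    d2x, d2y = math.cos(angle2), math.sin(angle2)
    denom = d1x * d2y - d1y * d2x          # 2D cross product d1 x d2
    if abs(denom) < 1e-12:
        return math.inf                    # parallel boundaries
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    # Solve p1 + t*d1 == p2 + s*d2 for t; with unit d1, t is a distance.
    return (dx * d2y - dy * d2x) / denom

def no_interference_within(p1, angle1, p2, angle2, min_obs_distance):
    # Way 3: the boundaries may intersect, but only at or beyond the
    # minimum observation distance the two sensors must satisfy.
    return boundary_meeting_distance(p1, angle1, p2, angle2) >= min_obs_distance
```

For example, two boundary rays leaving (0, 0) at 45° and (1, 0) at 135° meet half a unit in front of the pair, so a minimum observation distance of 0.5 is satisfied while 1.0 is not.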
The embodiments of this application do not limit how the first area array laser sensor 406 and the second area array laser sensor 407 are mounted in the vertical field-of-view direction. Optionally, the first area array laser sensor 406 and/or the second area array laser sensor 407 may be mounted on the device body tilted in the vertical field-of-view direction.
If the first area array laser sensor 406 is mounted on the device body tilted in the vertical field-of-view direction, it is mounted tilted either upward or downward in that direction, depending on application requirements.
If the second area array laser sensor 407 is mounted on the device body tilted in the vertical field-of-view direction, it is likewise mounted tilted either upward or downward in that direction, depending on application requirements.
If both the first area array laser sensor 406 and the second area array laser sensor 407 are mounted on the device body tilted in the vertical field-of-view direction, then both may be tilted upward, both may be tilted downward, or one may be tilted upward and the other downward.
For the mounting heights of the first area array laser sensor 406 and the second area array laser sensor 407, refer to the foregoing embodiments; details are not repeated here.
In some embodiments of this application, the lower boundary of the vertical field of view of each of the first area array laser sensor 406 and the second area array laser sensor 407 generally intersects the bearing surface on which the autonomous mobile device rests. For ease of description and distinction, the position where the lower boundary of the vertical field of view of the first area array laser sensor 406 intersects the bearing surface is denoted the first intersection position, and the position where the lower boundary of the vertical field of view of the second area array laser sensor 407 intersects the bearing surface is denoted the second intersection position.
To collect richer environmental information, a first distance threshold can be preset, with the requirement that the distance from the first intersection position to the autonomous mobile device be greater than the set first distance threshold; likewise, a second distance threshold can be preset, with the requirement that the distance from the second intersection position to the autonomous mobile device be greater than the set second distance threshold.
The embodiments of this application do not limit the specific values of the first and second distance thresholds. For example, the first distance threshold can be set flexibly according to application requirements, the height of the autonomous mobile device, the size of the vertical field of view of the first area array laser sensor, and other factors; the second distance threshold can likewise be set flexibly according to application requirements, the height of the autonomous mobile device, the size of the vertical field of view of the second area array laser sensor, and other factors.
Note that the second distance threshold may or may not equal the first distance threshold. If the first and second area array laser sensors are of the same type and both are mounted horizontally in the vertical field-of-view direction, the two thresholds may take the same value. If the two sensors are of different types, the thresholds may take different values. Of course, even when the two sensors are of the same type, if they are mounted differently in the vertical field-of-view direction, the thresholds may also take different values.
For example, where the first and second area array laser sensors are of the same type, if the first area array laser sensor is mounted on the device body tilted upward in the vertical field-of-view direction, the first distance threshold is greater than the second; if the first area array laser sensor is mounted tilted downward, the first distance threshold is less than the second.
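The relation just stated follows from simple geometry: the lower boundary of the vertical field of view leaves the sensor at (FOV/2 − tilt) below the horizontal, so tilting the sensor upward pushes its intersection with the bearing surface farther out. A hedged sketch, with parameter names and the sign convention chosen for illustration:

```python
import math

def intersection_distance(mount_height_m, vertical_fov_deg, tilt_deg):
    """Horizontal distance from the sensor to the point where the lower
    boundary of its vertical field of view meets the bearing surface.
    tilt_deg > 0 tilts the FOV bisector upward, < 0 downward; returns
    math.inf if the lower boundary no longer points at the floor."""
    below_horizontal = math.radians(vertical_fov_deg / 2.0 - tilt_deg)
    if below_horizontal <= 0:
        return math.inf          # lower boundary horizontal or above
    return mount_height_m / math.tan(below_horizontal)
```

With identical sensors at the same mounting height, an upward-tilted first sensor thus yields a larger intersection distance than a horizontally mounted second sensor, which is why the first distance threshold can be chosen larger than the second in that case.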
It is noted that in the above embodiments of this application, besides the area array laser sensors, non-area-array laser sensors may also be disposed or mounted on the device body of the autonomous mobile device 100 or 400 to assist the area array laser sensors in collecting richer environmental information. Optionally, the non-area-array sensors may include one or any combination of ultrasonic sensors, infrared sensors, vision sensors, single-line laser sensors, and multi-line laser sensors. Combining the environmental information collected by the various sensors can further improve the accuracy and precision of environment perception, which in turn helps improve the precision of function control.
Those skilled in the art should understand that embodiments of the present invention may be provided as a method, a system, or a computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, and optical storage) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, so that a series of operational steps is executed on the computer or other programmable device to produce computer-implemented processing; the instructions executed on the computer or other programmable device thereby provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, a network interface, and memory.
The memory may include non-persistent storage in computer-readable media, random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash RAM. Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
It should also be noted that the terms "comprise", "include", or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or device that includes the element.
The above are merely embodiments of the present application and are not intended to limit it. Those skilled in the art may make various modifications and changes to the present application. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present application shall fall within the scope of the claims of the present application.

Claims (25)

  1. An autonomous mobile device, characterized by comprising: a device body and a first area array laser sensor mounted on the device body; the first area array laser sensor is mounted on the device body tilted in the vertical field-of-view direction.
  2. The device according to claim 1, wherein the angle between the bisector of the vertical field of view of the first area array laser sensor and the horizontal line is not 0.
  3. The device according to claim 2, wherein the first area array laser sensor is mounted on the device body tilted upward in the vertical field-of-view direction.
  4. The device according to claim 3, wherein the angle between the bisector of the vertical field of view of the first area array laser sensor and the horizontal line is greater than 0 and less than 1/2 of the vertical field of view.
  5. The device according to claim 4, wherein the angle between the bisector of the vertical field of view of the first area array laser sensor and the horizontal line is greater than or equal to 1/5 of the vertical field of view and less than or equal to 1/3 of the vertical field of view.
  6. The device according to claim 5, wherein the angle between the bisector of the vertical field of view of the first area array laser sensor and the horizontal line is 4/15 of the vertical field of view.
  7. The device according to claim 2, wherein the first area array laser sensor is mounted on the device body tilted downward in the vertical field-of-view direction.
  8. The device according to claim 7, wherein the angle between the bisector of the vertical field of view of the first area array laser sensor and the horizontal line is greater than 0 and less than 1/2 of the vertical field of view.
  9. The device according to claim 1, wherein there are a plurality of the first area array laser sensors.
  10. The device according to claim 9, wherein one first area array laser sensor is mounted on the front side of the device body, and the other first area array laser sensors are mounted in sequence at other positions on the device body according to the continuity requirement on the horizontal fields of view; the front side is the side the device body faces while the autonomous mobile device moves forward.
  11. The device according to claim 10, wherein the far ends of the horizontal fields of view of two adjacent first area array laser sensors intersect; or
    the boundaries of the horizontal fields of view of two adjacent first area array laser sensors are parallel; or
    the boundaries of the horizontal fields of view of two adjacent first area array laser sensors intersect at a specified position;
    wherein the specified position is determined by the minimum observation distance that the two adjacent first area array laser sensors must satisfy.
  12. The device according to any one of claims 1-11, wherein the distance from a first intersection position to the autonomous mobile device is greater than a set first distance threshold; the first intersection position is the position where the lower boundary of the vertical field of view of the first area array laser sensor intersects the bearing surface on which the autonomous mobile device rests.
  13. The device according to any one of claims 1-11, further comprising: a second area array laser sensor mounted on the device body, the second area array laser sensor being mounted horizontally on the device body in the vertical field-of-view direction.
  14. The device according to claim 13, wherein the bisector of the vertical field of view of the second area array laser sensor is parallel to the horizontal line.
  15. The device according to claim 14, wherein there are a plurality of the second area array laser sensors.
  16. The device according to claim 13, wherein the first area array laser sensor and the second area array laser sensor are mounted at the same height on the device body.
  17. The device according to claim 13, wherein the autonomous mobile device is a robot, an air purifier, an unmanned vehicle, or an unmanned aerial vehicle.
  18. The device according to claim 17, wherein the robot is a floor-sweeping robot, a window-cleaning robot, a greeting robot, a home-care robot, or a warehouse transfer robot.
  19. The device according to claim 12, wherein the distance from a second intersection position to the autonomous mobile device is greater than a set second distance threshold; the second intersection position is the position where the lower boundary of the vertical field of view of the second area array laser sensor intersects the bearing surface on which the autonomous mobile device rests.
  20. An autonomous mobile device, characterized by comprising: a device body, and a first area array laser sensor and a second area array laser sensor mounted on the device body; the first area array laser sensor is mounted on the front side of the device body, and the second area array laser sensor is mounted on a lateral side of the device body.
  21. The device according to claim 20, wherein, if the autonomous mobile device supports a right edge-following mode, the second area array laser sensor is disposed on the left side of the device body; or, if the autonomous mobile device supports a left edge-following mode, the second area array laser sensor is disposed on the right side of the device body.
  22. The device according to claim 20 or 21, wherein the far ends of the horizontal fields of view of the first area array laser sensor and the second area array laser sensor intersect; or
    the boundaries of the horizontal fields of view of the first area array laser sensor and the second area array laser sensor are parallel; or
    the boundaries of the horizontal fields of view of the first area array laser sensor and the second area array laser sensor intersect at a specified position; wherein the specified position is determined by the minimum observation distance that the first area array laser sensor and the second area array laser sensor must satisfy.
  23. The device according to claim 20 or 21, wherein the first area array laser sensor and/or the second area array laser sensor is mounted on the device body tilted in the vertical field-of-view direction.
  24. The device according to claim 20 or 21, wherein the distance from a first intersection position to the autonomous mobile device is greater than a set first distance threshold; the first intersection position is the position where the lower boundary of the vertical field of view of the first area array laser sensor intersects the bearing surface on which the autonomous mobile device rests.
  25. The device according to claim 20 or 21, wherein the distance from a second intersection position to the autonomous mobile device is greater than a set second distance threshold; the second intersection position is the position where the lower boundary of the vertical field of view of the second area array laser sensor intersects the bearing surface on which the autonomous mobile device rests.
PCT/CN2019/102789 2019-08-09 2019-08-27 Autonomous mobile device WO2021026965A1 (zh)


Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910735857.0A CN112338908B (zh) 2019-08-09 2019-08-09 自主移动设备
CN201910735857.0 2019-08-09


Country Status (4)

Country Link
US (1) US20210041884A1 (zh)
EP (1) EP4011566A4 (zh)
CN (2) CN112338908B (zh)
WO (1) WO2021026965A1 (zh)






