CN106826749B - Mobile robot - Google Patents


Info

Publication number
CN106826749B
CN106826749B (application CN201710045323.6A)
Authority
CN
China
Prior art keywords
camera
robot
mobile robot
main body
main
Prior art date
Legal status
Active
Application number
CN201710045323.6A
Other languages
Chinese (zh)
Other versions
CN106826749A (en)
Inventor
刘智成
宋章军
刘璐
Current Assignee
Shenzhen Flying Mouse Power Technology Co ltd
Original Assignee
Shenzhen Flying Mouse Power Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Flying Mouse Power Technology Co ltd filed Critical Shenzhen Flying Mouse Power Technology Co ltd
Priority to CN201710045323.6A priority Critical patent/CN106826749B/en
Publication of CN106826749A publication Critical patent/CN106826749A/en
Application granted granted Critical
Publication of CN106826749B publication Critical patent/CN106826749B/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J5/00 Manipulators mounted on wheels or on carriages
    • B25J5/007 Manipulators mounted on wheels or on carriages mounted on wheels
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 Sensing devices
    • B25J19/04 Viewing devices

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a mobile robot comprising a robot main body, a camera, and a central processing unit. The camera acquires images of the surrounding environment for the mobile robot. The central processing unit is connected to the robot main body and the camera; it processes the image data acquired by the camera and controls the traveling assembly of the robot main body to respond according to the processed image data. The optical center of the camera is far from the center of the robot main body, and the orthographic projection of the camera's main optical axis on the upper surface of the robot main body forms a first included angle greater than 0° with the orthographic projection on that surface of a straight line passing through the center of the robot main body and the principal point on the camera's image plane. The parallax of the camera is therefore larger when the mobile robot turns, the direction of the acquired images changes substantially, and positioning accuracy and scene recognition accuracy are improved. The invention also discloses another mobile robot.

Description

Mobile robot
Technical Field
The invention relates to the field of intelligent equipment, in particular to a mobile robot.
Background
A mobile robot is an intelligent device that can autonomously control its movement and automatically execute work. It can move on the ground or other surfaces, accept user commands, run pre-programmed routines, and act according to principles formulated with artificial intelligence techniques. With the progress of science and technology, mobile robots are being applied in fields such as production, military, and service; in the field of home service in particular, cleaning robots for vacuuming, sweeping, mopping, and glass wiping, as well as mowing robots, are increasingly widely used in homes and public places both domestically and abroad.
For a mobile robot to perform its tasks well, it generally needs positioning and navigation functions, so many manufacturers add cameras and SLAM (Simultaneous Localization And Mapping) modules so that the robot can automatically localize itself and build a map for navigation. In the prior art, however, the camera is arranged at or near the center of the mobile robot. When the mobile robot turns, the parallax formed in the camera by three-dimensional points in the environment is very small or even zero, so the robot cannot compute the depth (distance) of those points by triangulation. This causes positioning failures or scene recognition failures during turns, which in turn cause mapping or navigation errors and greatly degrade the robot's performance.
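The failure mode described above can be illustrated with a small geometric sketch (all values are hypothetical, not from the patent): a camera mounted at radius r from the rotation center sweeps out a translation, and therefore a stereo baseline, when the robot turns in place, while a camera at the center gains no baseline at all.

```python
import math

def camera_position(r, theta):
    """Position of a camera mounted at radius r from the robot's
    rotation center, after the robot has rotated by theta radians."""
    return (r * math.cos(theta), r * math.sin(theta))

def baseline(r, theta):
    """Translation (stereo baseline) the camera experiences when the
    robot turns in place by theta radians."""
    x0, y0 = camera_position(r, 0.0)
    x1, y1 = camera_position(r, theta)
    return math.hypot(x1 - x0, y1 - y0)

# A camera at the rotation center gains no baseline from a turn, so
# triangulation degenerates; an off-center camera gains a usable one.
turn = math.radians(30)
print(baseline(0.0, turn))   # centered camera: baseline is exactly 0
print(baseline(0.15, turn))  # camera 15 cm off-center: nonzero baseline
```

The chord length 2·r·sin(θ/2) is the baseline an off-center camera gains from a pure in-place turn, which is why moving the optical center away from the center C restores the triangulation geometry.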
Disclosure of Invention
The technical problem the invention aims to solve is to provide a mobile robot whose camera experiences larger parallax when the robot turns and whose acquired images change direction substantially. This improves positioning accuracy and scene recognition accuracy and avoids the situation in which the depth of a three-dimensional point cannot be computed during a turn, with the positioning or scene recognition failures that situation would cause.
In order to solve the technical problems, the invention adopts the following technical scheme:
in one aspect, an embodiment of the present invention provides a mobile robot including:
the robot main body is provided with a traveling assembly for driving the mobile robot to move on a traveling surface, and also provided with an upper surface and a lower surface which are opposite, wherein the lower surface faces the traveling surface when the mobile robot works normally;
the camera is arranged on the robot main body and used for collecting images of surrounding environment for the mobile robot;
the central processing unit is connected with the robot main body and the camera, and is used for processing the image data of the images acquired by the camera and controlling the traveling assembly of the robot main body to respond according to the processed image data;
the optical center of the camera is far from the center of the robot main body, and the orthographic projection of the camera's main optical axis on the upper surface of the robot main body forms a first included angle greater than 0° with the orthographic projection on the upper surface of a straight line passing through the center of the robot main body and the principal point on the camera's image plane.
Optionally, the robot body is substantially cylindrical or elliptic-cylindrical, and the orthographic projection of the camera's main optical axis on the upper surface of the robot body forms the first included angle, greater than 0°, with the orthographic projection on the upper surface of the diameter, major axis, or minor axis of the robot body that passes through the principal point of the camera.
Optionally, the camera is a fisheye camera.
Optionally, the camera is mounted obliquely upward on the robot main body, so that a second included angle greater than 0° is formed between the main optical axis of the camera and the upper surface of the robot main body.
Optionally, the central processor includes a SLAM module for locating and creating a map for the mobile robot according to the processed image data.
In another aspect, an embodiment of the present invention further provides another mobile robot, including:
the robot main body is provided with a traveling assembly for driving the mobile robot to move on a traveling surface, and also provided with an upper surface and a lower surface which are opposite, wherein the lower surface faces the traveling surface when the mobile robot works normally;
the first camera and the second camera are arranged on the robot main body and are used for collecting images of surrounding environment for the mobile robot;
the central processing unit is connected with the robot main body, the first camera, and the second camera, and is used for processing the image data acquired by the first camera and the second camera and controlling the traveling assembly of the robot main body to respond according to the processed image data;
the optical centers of the first camera and the second camera are both far from the center of the robot main body; the orthographic projection of the first camera's main optical axis on the upper surface of the robot main body forms a third included angle greater than 0° with the orthographic projection on the upper surface of a straight line passing through the center of the robot main body and the principal point on the first camera's image plane; and the orthographic projection of the second camera's main optical axis on the upper surface forms a fourth included angle greater than 0° with the orthographic projection on the upper surface of a straight line passing through the center of the robot main body and the principal point on the second camera's image plane.
Optionally, the robot body is substantially cylindrical or elliptic-cylindrical; the orthographic projection of the first camera's main optical axis on the upper surface of the robot body forms the third included angle, greater than 0°, with the orthographic projection on the upper surface of the diameter, major axis, or minor axis of the robot body that passes through the principal point of the first camera;
and the orthographic projection of the second camera's main optical axis on the upper surface of the robot body forms the fourth included angle, greater than 0°, with the orthographic projection on the upper surface of the diameter, major axis, or minor axis of the robot body that passes through the principal point of the second camera.
Optionally, the first camera and the second camera are disposed at the front part of the robot main body and face different directions, and the first camera and/or the second camera are fish-eye cameras.
Optionally, the first camera and the second camera are mounted obliquely upward on the robot main body, so that a fifth included angle greater than 0° is formed between the main optical axis of the first camera and the upper surface of the robot main body, and a sixth included angle greater than 0° is formed between the main optical axis of the second camera and the upper surface of the robot main body.
Optionally, the central processor includes a SLAM module for locating and creating a map for the mobile robot according to the processed image data.
Compared with the prior art, the technical scheme of the invention has at least the following beneficial effects:
in the embodiments of the invention, because the optical center of the camera is far from the center of the robot main body, the camera experiences a large displacement when the mobile robot turns, producing a large parallax, so that the depth (distance) of three-dimensional points or obstacles in the surrounding environment can be computed by triangulation. Moreover, because the orthographic projection of the camera's main optical axis on the upper surface of the robot main body forms a first included angle greater than 0° with the orthographic projection on the upper surface of the straight line passing through the center of the robot main body and the camera's principal point, the parallax is further increased. This ensures the parallax is large enough that depths computed by triangulation are more accurate, improving positioning accuracy and scene recognition capability.
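The triangulation this benefit relies on reduces, in the rectified two-view case, to Z = f·B/d: depth is focal length times baseline over disparity, so a larger baseline makes depth less sensitive to disparity error. A minimal sketch with illustrative numbers (not from the patent):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth of a 3-D point by triangulation from two rectified views
    separated by a known baseline: Z = f * B / d."""
    if disparity_px <= 0:
        # Zero disparity is the degenerate case the patent describes:
        # a centered camera turning in place produces no baseline.
        raise ValueError("zero or negative disparity: depth is undefined")
    return focal_px * baseline_m / disparity_px

# Illustrative values: 300 px focal length, ~7.8 cm baseline gained
# during a turn, 10 px observed disparity.
print(depth_from_disparity(300.0, 0.078, 10.0))  # depth in meters
```

With a near-zero baseline the disparity d tends to zero and Z is undefined, which is exactly the prior-art failure the off-center mounting avoids.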
In addition, in one embodiment, the included angle between the camera's main optical axis and the upper surface of the robot main body is greater than 0°, so the direction in which the camera collects images changes substantially as the robot moves and more images of different scenes can be captured, providing more information. This greatly improves the scene recognition success rate and hence the performance of the mobile robot.
Drawings
To illustrate the embodiments of the invention or the technical solutions of the prior art more clearly, the drawings used in their description are briefly introduced below. Obviously, the drawings described below show only some embodiments of the invention; other variants can be obtained from them by a person skilled in the art without inventive effort.
FIG. 1 is a schematic top view of a mobile robot in one embodiment of the invention; and
Fig. 2 is a schematic structural view of a mobile robot according to another embodiment of the present invention.
Detailed Description
The technical solutions of the embodiments of the present invention will be clearly described below with reference to the drawings in the embodiments of the present invention, and it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Unless defined otherwise, technical or scientific terms used herein should be taken as the ordinary meaning as understood by one of ordinary skill in the art to which this invention belongs. The terms "first," "second," "third," and "fourth," etc. as used herein, are used for distinguishing between different objects and not for describing a particular sequential, quantitative, or importance. Likewise, the terms "a," "an," or "the" and similar terms do not denote a limitation of quantity, but rather are used to denote the presence of at least one. The word "comprising" or "comprises", and the like, means that elements or items preceding the word are included in the element or item listed after the word and equivalents thereof, but does not exclude other elements or items. The terms "connected" or "connected," and the like, are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "upper", "lower", "left", "right", etc. are used merely to indicate relative positional relationships, which may also be changed when the absolute position of the object to be described is changed.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the invention. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
Referring to fig. 1, fig. 1 is a schematic structural diagram of a mobile robot according to an embodiment of the invention. In this embodiment, the mobile robot includes a robot main body 100, and a camera 200 and a central processing unit 300 disposed on the robot main body 100.
The robot main body 100 is provided with a traveling assembly 110 for driving the mobile robot to move on a traveling surface. The traveling assembly 110 is connected with the central processing unit 300, so that the central processing unit 300 can control the traveling assembly 110 at any time according to circumstances. The traveling assembly 110 may include wheels, a drive motor, and a drive circuit (not shown in fig. 1) through which it is communicatively coupled to the central processing unit 300. The robot body 100 also has opposite upper and lower surfaces, the lower surface facing the traveling surface when the mobile robot operates normally. The traveling surface is the surface on which the mobile robot moves during normal operation; for example, when a cleaning robot cleans a floor, the floor to be cleaned is the traveling surface, the lower surface of the cleaning robot is the surface facing that floor, and the surface opposite the lower surface is the upper surface.
The camera 200 is disposed on the robot main body 100 and collects images of the surrounding environment of the mobile robot. The acquired images are converted by a processing unit (not shown) of the camera 200 into image data (photoelectric signals) and transmitted to the central processing unit 300; it is understood that this processing unit may be integrated into the central processing unit 300.
The central processing unit 300 is connected to the robot main body 100 and the camera 200. It processes the image data of images acquired by the camera 200 and controls the traveling assembly of the robot main body to respond according to the processed image data; for example, when the central processing unit 300 determines from the image data that a three-dimensional point or an obstacle lies ahead, it controls the traveling assembly 110 of the robot main body 100 to steer, reverse, accelerate, or decelerate.
The optical center of the camera 200 is far from the center C of the robot main body 100, so that when the mobile robot turns, the camera 200 is displaced substantially, producing a larger parallax; the depth (distance) of points in the surrounding space can then be computed by triangulation for positioning, avoiding the positioning failures or errors that small parallax would cause during a turn. Further, the first included angle β1, formed by the orthographic projection of the main optical axis S of the camera 200 on the upper surface of the robot body 100 and the orthographic projection on that surface of the straight line passing through the center C of the robot body 100 and the principal point O of the camera 200, is greater than 0°. When the main optical axis S lies at the same height (the same horizontal plane) as the center C, this means the main optical axis S itself forms a first included angle β1 greater than 0° with the straight line through the center C and the principal point O; that is, the main optical axis S does not coincide with the straight line passing through the center C and the principal point O on the image plane of the camera 200, so the main optical axis S does not pass through the rotation center of the mobile robot. As a result, when the robot turns, the camera 200 captures three-dimensional points or obstacles in the environment from different directions, which increases their recognizability, produces a large parallax, improves positioning accuracy, and avoids positioning failures or errors during turns.
The principal point of the camera is the point at which the main optical axis intersects the image plane on the camera's sensor chip.
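Given the two orthographic projections on the upper-surface plane, the first included angle β1 is the angle between two 2-D vectors and can be computed with elementary vector math. The coordinates below are hypothetical, chosen only to show the two cases the patent contrasts:

```python
import math

def first_included_angle(center, principal_point, optical_axis_dir):
    """Angle (degrees) between the projection of the camera's main
    optical axis on the upper surface and the line through the robot
    center C and the camera principal point O (all given as 2-D
    projections onto the upper-surface plane)."""
    cx, cy = center
    ox, oy = principal_point
    radial = (ox - cx, oy - cy)          # line C -> O
    ax, ay = optical_axis_dir
    dot = radial[0] * ax + radial[1] * ay
    cos_a = dot / (math.hypot(*radial) * math.hypot(ax, ay))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))

# Axis aimed along the radial line (the arrangement the patent avoids):
# the axis passes through the rotation center, beta1 = 0.
print(round(first_included_angle((0.0, 0.0), (0.15, 0.0), (1.0, 0.0)), 6))  # 0.0
# Axis rotated away from the radial line: beta1 > 0 as claimed.
print(round(first_included_angle((0.0, 0.0), (0.15, 0.0), (1.0, 1.0)), 6))  # 45.0
```

A β1 of exactly 0° would make the projected optical axis pass through C, so the claim "greater than 0°" excludes precisely the first case.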
As long as the optical center of the camera 200 is far from the center C of the robot body 100, the camera 200 may be disposed at or near the edge of the robot body 100. As required, the camera 200 may be provided inside the robot body 100, on its upper surface, or on its lower surface. Preferably, the camera 200 is provided at a front portion of the robot body 100 near the edge.
The center C of the robot body 100 is the rotation center of the mobile robot; since the rotation center generally coincides with the geometric center of the robot body 100, the center C may also be taken as the geometric center. For example, if the robot body 100 is substantially cylindrical, its geometric center is the center of the circular disc; in this case the orthographic projection of the main optical axis S of the camera 200 on the upper surface of the robot body forms the first included angle, greater than 0°, with the orthographic projection on the upper surface of the diameter (or radius) of the robot body that passes through the principal point on the camera's image plane. As another example, if the robot body 100 is substantially an elliptic cylinder, the orthographic projection of the main optical axis S on the upper surface forms the first included angle β1, greater than 0°, with the orthographic projection on the upper surface of the major or minor axis of the robot body that passes through the camera's principal point. Here the major and minor axes are those of the bottom surface or cross-section of the elliptic cylinder (both substantially elliptical). It should be noted that although the robot body 100 shown in fig. 1 is cylindrical, this is only exemplary; the robot body, and hence the mobile robot, may have other shapes in this embodiment.
Further, the camera 200 may be a fisheye camera. A fisheye camera has a large field angle, which may be 180°, more than 180°, or less than 180°; the large field angle further improves the scene recognition capability of the mobile robot.
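The patent does not specify the projection model of the fisheye lens; under the common equidistant model r = f·θ, a ray 90° off-axis maps to a finite image radius, which is how a field angle of 180° or more fits on the sensor at all (a pinhole projection r = f·tan θ diverges at 90°). A small illustrative sketch:

```python
import math

def equidistant_fisheye_radius(f, theta):
    """Image radius of a ray at angle theta from the main optical axis
    under the equidistant fisheye model r = f * theta. This is one
    common model; the patent does not say which projection is used."""
    return f * theta

# Illustrative focal length in pixels.
f = 300.0
# A ray at 90 degrees off-axis (the edge of a 180-degree field of view)
# still lands at a finite radius on the sensor.
print(equidistant_fisheye_radius(f, math.radians(90)))
```

This finite mapping at extreme angles is what lets a single fisheye camera cover most of the hemisphere above the robot.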
The camera 200 may be mounted obliquely upward on the robot body, so that a second included angle greater than 0° is formed between the main optical axis of the camera 200 and the upper surface of the robot body; here "upward" means away from the traveling surface when the mobile robot operates normally.
The central processing unit includes a SLAM (Simultaneous Localization And Mapping) module for positioning the mobile robot and creating a map according to the processed image data; the SLAM module may adopt any existing scheme, which is not detailed here.
Referring to fig. 2, fig. 2 is a schematic structural diagram of a mobile robot according to another embodiment of the invention. The mobile robot in this embodiment includes a robot body 100, a first camera 210, a second camera 220, and a central processing unit 300. As shown in fig. 2, a front-rear axis F-R is defined by the front and rear of the robot body 100, and the direction in which F is directed is the front of the mobile robot, i.e., the direction in which the mobile robot advances.
The robot body 100 is provided with a traveling assembly 110 for driving the mobile robot to move on a traveling surface. The traveling assembly 110 is connected with the central processing unit 300, so that the central processing unit 300 can control the traveling assembly 110 at any time according to circumstances. The traveling assembly 110 may include wheels, a drive motor, and a drive circuit (not shown in fig. 2) through which it is communicatively coupled to the central processing unit 300. The robot body 100 also has opposite upper and lower surfaces, the lower surface facing the traveling surface when the mobile robot operates normally. The traveling surface is the surface on which the mobile robot moves during normal operation; for example, when a cleaning robot cleans a floor, the floor to be cleaned is the traveling surface, the lower surface of the cleaning robot is the surface facing that floor, and the surface opposite the lower surface is the upper surface.
In one implementation manner of this embodiment, the first camera 210 and the second camera 220 are respectively located at two sides of the front-rear axis F-R, and the directions of the first camera 210 and the second camera 220 are different, and the main optical axis S1 of the first camera 210 and the main optical axis S2 of the second camera 220 form an included angle α. The first camera 210 and the second camera 220 are used for capturing images of the surrounding environment, that is, capturing images of three-dimensional points or obstacles around the mobile robot.
The central processing unit 300 is connected to the robot main body 100, the first camera 210 and the second camera 220, and is configured to process image data collected by the first camera 210 and the second camera 220, and control the driving assembly 110 of the robot main body 100 to respond according to the processed image data. For example, when the cpu 300 calculates that a three-dimensional space point or an obstacle is located in the front of the environment from the image data, it controls the traveling module 110 of the robot main body 100 to perform steering, reversing, speed increasing, speed reducing, or the like.
Either the first camera 210 or the second camera 220 is a fisheye camera; it should be understood that both may be fisheye cameras, greatly increasing the field angle of the mobile robot. A traditional mobile robot has only one camera, usually used only for navigation, and does not use the collected images for scene recognition, so its global relocalization is weak (for example, if a sweeping robot is moved to a distant place mid-clean, it must rebuild the entire map and loses its memory of the area already swept); using dual fisheye cameras for scene recognition greatly improves the probability of successful global relocalization. Moreover, if a traditional mobile robot uses only a single top-view fisheye camera, the large field angle and the small image change during movement reduce positioning accuracy, and if a large moving object appears in the picture (such as a person or pet passing near the robot), the scene recognition success rate drops.
In this embodiment, the two cameras, the first camera 210 and the second camera 220, are mounted obliquely upward on the robot main body 100, so that a fifth included angle greater than 0° is formed between the main optical axis of the first camera 210 and the upper surface of the robot main body 100, and a sixth included angle greater than 0° is formed between the main optical axis of the second camera 220 and the upper surface. The fifth and sixth included angles may be set equal or unequal as required. Moreover, the two cameras face different directions, so the view cones of the two fisheye cameras point in different directions: the images captured by the two cameras change substantially during movement, which improves positioning accuracy, and the probability that both pictures are simultaneously occupied by a large moving object is greatly reduced, improving the robustness of the system.
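The fifth and sixth included angles can be computed from the 3-D direction of each main optical axis relative to the upper-surface plane; the two direction vectors below are illustrative only, not taken from the patent:

```python
import math

def axis_surface_angle(axis):
    """Included angle (degrees) between a camera's main optical axis
    (a 3-D direction vector, z pointing away from the upper surface)
    and the plane of the robot's upper surface."""
    x, y, z = axis
    horiz = math.hypot(x, y)          # component lying in the surface plane
    return math.degrees(math.atan2(z, horiz))

# Two obliquely-upward cameras with different tilts: the fifth and
# sixth included angles need not be equal (hypothetical directions).
s1 = (1.0, 0.3, 0.5)   # first camera's main optical axis
s2 = (1.0, -0.3, 0.7)  # second camera's main optical axis
print(axis_surface_angle(s1))  # fifth included angle, > 0
print(axis_surface_angle(s2))  # sixth included angle, > 0
```

Any axis with a positive z-component yields an angle greater than 0°, which is the condition the embodiment imposes on both cameras.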
Further, the optical centers of the first camera 210 and the second camera 220 are both far from the center C of the robot main body 100, so that when the mobile robot turns, both cameras are displaced substantially, producing a larger parallax; the depth (distance) of points in the surrounding space can then be computed by triangulation for accurate positioning, avoiding the positioning failures or errors that small parallax would cause during a turn. In addition, the orthographic projection of the main optical axis S1 of the first camera 210 on the upper surface of the robot body 100 forms a third included angle β3 greater than 0° with the orthographic projection on that surface of the straight line passing through the center C of the robot body 100 and the principal point O1 on the image plane of the first camera 210; and the orthographic projection of the main optical axis S2 of the second camera 220 on the upper surface forms a fourth included angle β4 greater than 0° with the orthographic projection on that surface of the straight line passing through the center C and the principal point O2 on the image plane of the second camera 220.
It should be understood that when the main optical axis S1 of the first camera 210 and/or the main optical axis S2 of the second camera 220 lie at the same height (the same horizontal plane) as the center C of the robot body 100, the main optical axis S1 forms a third included angle β3 greater than 0° with the straight line through the center C and the principal point O1 on the image plane of the first camera 210, and/or the main optical axis S2 forms a fourth included angle β4 greater than 0° with the straight line through the center C and the principal point O2 on the image plane of the second camera 220. That is, neither main optical axis coincides with the straight line through the center C and the corresponding principal point O1 or O2, so the main optical axes S1 and S2 do not pass through the rotation center C of the mobile robot. When the robot turns, the first camera 210 and the second camera 220 therefore capture three-dimensional points or obstacles in the environment from different directions, which greatly increases their recognizability, produces a larger parallax, improves positioning accuracy, avoids positioning failures or errors during turns, and further improves scene recognition capability.
The third included angle β3 and the fourth included angle β4 may be equal or unequal, as required.
As long as the optical centers of the first camera 210 and the second camera 220 are far from the center C of the robot body 100, the two cameras may be disposed at or near the edge of the robot body 100. They may be placed on the two sides of the front-rear axis F-R or on the same side of it, as required; likewise, they may be disposed at the front and rear of the robot body 100 respectively, both at the front, or both at the rear. Preferably, the first camera 210 and the second camera 220 are disposed on the two sides of the front-rear axis F-R. As required, the two cameras may be provided inside the robot body 100, on its upper surface, or on its lower surface; preferably, they are provided at a front portion of the robot body 100 near the edge.
The center C of the robot body 100 refers to the rotation center of the mobile robot. Since the rotation center generally coincides with the geometric center of the robot body 100, the center C can also be regarded as the geometric center of the robot body 100.
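The benefit of keeping a camera's optical center away from the rotation center C can be quantified with a short sketch (the radius and turn-angle values below are purely illustrative, not taken from the patent): a camera mounted at radius r from C travels along a chord of length 2·r·sin(θ/2) when the robot turns in place by θ, and it is this displacement that acts as a stereo baseline.

```python
import math

def turn_baseline(radius_m, turn_deg):
    """Chord length traveled by a camera mounted radius_m away from the
    rotation center C when the robot turns in place by turn_deg degrees.
    A camera at the rotation center (radius 0) gains no baseline at all."""
    theta = math.radians(turn_deg)
    return 2.0 * radius_m * math.sin(theta / 2.0)

# Illustrative values: camera near the rim of a ~0.34 m robot, 30-degree turn
b_edge = turn_baseline(0.17, 30.0)    # roughly 0.088 m of usable baseline
b_center = turn_baseline(0.0, 30.0)   # 0.0 m: no parallax from turning
```

This is why the text places the optical centers far from C: the farther out the camera sits, the longer the baseline generated by the same turn, and hence the larger the parallax.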
When the robot body 100 is substantially cylindrical, the orthographic projection of the main optical axis S1 of the first camera 210 on the upper surface of the robot body 100 forms the third included angle β3 greater than 0° with the orthographic projection, on the upper surface, of the diameter of the robot body 100 that passes through the principal point O1 on the image plane of the first camera 210; likewise, the orthographic projection of the main optical axis S2 of the second camera 220 on the upper surface forms the fourth included angle β4 greater than 0° with the orthographic projection, on the upper surface, of the diameter of the robot body 100 that passes through the principal point O2 on the image plane of the second camera 220.
When the robot body 100 is substantially an elliptic cylinder, the same holds with a major or minor axis of the robot body 100 in place of the diameter: the orthographic projection of the main optical axis S1 of the first camera 210 on the upper surface forms the third included angle β3 greater than 0° with the orthographic projection, on the upper surface, of the major or minor axis passing through the principal point O1 on the image plane of the first camera 210; and the orthographic projection of the main optical axis S2 of the second camera 220 on the upper surface forms the fourth included angle β4 greater than 0° with the orthographic projection, on the upper surface, of the major or minor axis passing through the principal point O2 on the image plane of the second camera 220.
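In the top view, the condition in the two paragraphs above reduces to a 2-D angle between the projected optical axis and the radial line through the principal point. A minimal sketch (the coordinates are hypothetical; C and O are just 2-D points on the upper surface):

```python
import math

def included_angle_deg(axis_dir, center, principal_pt):
    """Angle (degrees) between the top-view projection of a camera's main
    optical axis and the line through the robot center C and the camera's
    principal point O, all treated as 2-D data on the upper surface.
    The axis is undirected, so the result is folded into [0, 90]."""
    line_dir = (principal_pt[0] - center[0], principal_pt[1] - center[1])
    dot = axis_dir[0] * line_dir[0] + axis_dir[1] * line_dir[1]
    cos_a = dot / (math.hypot(*axis_dir) * math.hypot(*line_dir))
    ang = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
    return min(ang, 180.0 - ang)

# Principal point on the rim at (0.17, 0); optical axis skewed off the
# radial line by atan(0.3), so beta3 > 0 as the text requires:
beta3 = included_angle_deg((1.0, 0.3), (0.0, 0.0), (0.17, 0.0))
```

An axis aligned with the radial line would give 0°, i.e., an optical axis passing through the rotation center, which is exactly the configuration the embodiment avoids.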
It should be noted that although the robot body 100 shown in fig. 2 is cylindrical, this is merely exemplary; the robot body in this embodiment, and thus the mobile robot, may have other shapes.
In this embodiment, because the optical center of the camera of the mobile robot is far from the center of the robot main body, the camera undergoes a larger displacement when the mobile robot turns, forming a larger parallax, so that the depth (distance) of three-dimensional space points or obstacles in the surroundings of the mobile robot can be calculated by triangulation. Moreover, because the orthographic projection of the main optical axis of the camera on the upper surface of the robot main body forms a first included angle greater than 0° with the orthographic projection, on the upper surface, of the straight line passing through the center of the robot main body and the principal point of the camera, the parallax is increased further. This guarantees a sufficiently large parallax, makes the triangulated depth (distance) of three-dimensional space points or obstacles more accurate, and improves the positioning accuracy and scene recognition capability.
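The triangulation step mentioned above can be illustrated with the standard rectified-stereo relation Z = f·B/d (the focal length, baseline, and disparity values below are purely illustrative, not from the patent):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Classic stereo triangulation for a rectified image pair: Z = f * B / d.
    The larger the baseline B (and hence the disparity d for a given point),
    the smaller the relative depth error."""
    if disparity_px <= 0:
        raise ValueError("no parallax: depth is unobservable")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers only: f = 500 px, baseline 0.09 m, disparity 15 px
z = depth_from_disparity(500.0, 0.09, 15.0)  # -> 3.0 m
```

The guard clause reflects the failure mode the embodiment is designed to avoid: if a turn produces no parallax (d = 0), depth simply cannot be recovered.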
In addition, the main optical axis of the camera forms an included angle greater than 0° with the upper surface of the robot main body, so the direction in which the camera captures images changes substantially, more images of different scenes can be captured, and the second camera provides additional information. The scene recognition success rate, and with it the performance of the mobile robot, is therefore greatly improved.
It should be appreciated that the central processor 300 also includes a SLAM (simultaneous localization and mapping) module for locating the mobile robot and creating a map based on the processed image data.
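As a rough illustration of what such a module does (this is a toy sketch, not the patented implementation; real visual SLAM also involves feature matching, loop closure, and optimization), a planar pose can be integrated from motion estimates while landmark observations are accumulated into a map:

```python
import math

class SlamModule:
    """Minimal placeholder for a SLAM module: integrates planar odometry
    into a world pose and places range/bearing landmark observations
    into a world-frame map."""

    def __init__(self):
        self.x, self.y, self.heading = 0.0, 0.0, 0.0  # world pose (m, m, rad)
        self.landmarks = {}  # id -> (x, y) in the world frame

    def predict(self, forward_m, turn_rad):
        """Apply a turn-then-translate motion increment."""
        self.heading += turn_rad
        self.x += forward_m * math.cos(self.heading)
        self.y += forward_m * math.sin(self.heading)

    def observe(self, landmark_id, range_m, bearing_rad):
        """Record a landmark seen at (range, bearing) from the current pose."""
        a = self.heading + bearing_rad
        self.landmarks[landmark_id] = (self.x + range_m * math.cos(a),
                                       self.y + range_m * math.sin(a))

slam = SlamModule()
slam.predict(1.0, 0.0)           # drive 1 m straight ahead
slam.observe("door", 2.0, 0.0)   # landmark 2 m dead ahead -> map point (3, 0)
```

The names `SlamModule`, `predict`, and `observe` are hypothetical; they only stand in for the locate-and-map role the text assigns to the module.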
In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
The above-described embodiments do not limit the scope of the present invention. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the above embodiments should be included in the scope of the present invention.

Claims (10)

1. A mobile robot, comprising:
a robot main body provided with a traveling assembly for driving the mobile robot to move on a traveling surface, the robot main body further having an upper surface and a lower surface opposite to each other, the lower surface facing the traveling surface when the mobile robot operates normally; a camera arranged on the robot main body and used for collecting images of the surrounding environment for the mobile robot; and a central processing unit connected with the robot main body and the camera, the central processing unit being used for processing image data of the images acquired by the camera and controlling the traveling assembly of the robot main body to respond according to the processed image data; wherein the optical center of the camera is far from the center of the robot main body, the orthographic projection of the main optical axis of the camera on the upper surface of the robot main body forms a first included angle greater than 0° with the orthographic projection, on the upper surface, of the straight line passing through the center of the robot main body and the principal point of the camera, and the orthographic projection of the main optical axis of the camera on the upper surface forms an included angle with the advancing direction of the mobile robot.
2. The mobile robot of claim 1, wherein the robot body is generally cylindrical or elliptic-cylindrical, and the orthographic projection of the main optical axis of the camera on the upper surface of the robot body forms the first included angle greater than 0° with the orthographic projection, on the upper surface, of the diameter, major axis, or minor axis of the robot body passing through the principal point of the camera.
3. The mobile robot of claim 1 or 2, wherein the camera is a fish-eye camera.
4. The mobile robot of claim 1 or 2, wherein the camera is mounted on the robot body obliquely upward such that the main optical axis of the camera forms a second included angle greater than 0° with the upper surface of the robot body.
5. The mobile robot of claim 1 or 2, wherein the central processor comprises a SLAM module for locating and creating a map for the mobile robot from the processed image data.
6. A mobile robot, characterized by comprising: a robot main body provided with a traveling assembly for driving the mobile robot to move on a traveling surface; a first camera and a second camera arranged on the robot main body and used for collecting images of the surrounding environment for the mobile robot; and a central processing unit connected with the robot main body, the first camera, and the second camera, the central processing unit being used for processing the image data acquired by the first camera and the second camera and controlling the traveling assembly of the robot main body to respond according to the processed image data; wherein the orthographic projection of the main optical axis of the first camera on the upper surface of the robot main body forms a third included angle greater than 0° with the orthographic projection, on the upper surface, of the straight line passing through the center of the robot main body and the principal point of the first camera, the orthographic projection of the main optical axis of the second camera on the upper surface forms a fourth included angle greater than 0° with the orthographic projection, on the upper surface, of the straight line passing through the center of the robot main body and the principal point of the second camera, and the orthographic projections of the main optical axes of the first camera and the second camera on the upper surface form included angles with the advancing direction of the mobile robot.
7. The mobile robot of claim 6, wherein the robot body is generally cylindrical or elliptic-cylindrical, the orthographic projection of the main optical axis of the first camera on the upper surface of the robot body forms the third included angle greater than 0° with the orthographic projection, on the upper surface, of the diameter, major axis, or minor axis of the robot body passing through the principal point of the first camera; and the orthographic projection of the main optical axis of the second camera on the upper surface forms the fourth included angle greater than 0° with the orthographic projection, on the upper surface, of the diameter, major axis, or minor axis of the robot body passing through the principal point of the second camera.
8. The mobile robot of claim 6 or 7, wherein the first camera and the second camera are arranged at the front of the robot body and are directed in different directions, and the first camera and/or the second camera is a fisheye camera.
9. The mobile robot of claim 6 or 7, wherein the first camera and the second camera are mounted on the robot body obliquely upward such that the main optical axis of the first camera forms a fifth included angle greater than 0° with the upper surface of the robot body, and the main optical axis of the second camera forms a sixth included angle greater than 0° with the upper surface of the robot body.
10. The mobile robot of claim 6 or 7, wherein the central processor includes a SLAM module for locating and creating a map for the mobile robot based on the processed image data.
CN201710045323.6A 2017-01-22 2017-01-22 Mobile robot Active CN106826749B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710045323.6A CN106826749B (en) 2017-01-22 2017-01-22 Mobile robot


Publications (2)

Publication Number Publication Date
CN106826749A CN106826749A (en) 2017-06-13
CN106826749B true CN106826749B (en) 2024-01-26

Family

ID=59120195

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710045323.6A Active CN106826749B (en) 2017-01-22 2017-01-22 Mobile robot

Country Status (1)

Country Link
CN (1) CN106826749B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112506181A (en) * 2017-12-15 2021-03-16 珊口(上海)智能科技有限公司 Mobile robot and control method and control system thereof
CN210704858U (en) * 2018-09-28 2020-06-09 成都家有为力机器人技术有限公司 Cleaning robot with binocular camera
KR20220101140A (en) 2019-11-12 2022-07-19 넥스트브이피유 (상하이) 코포레이트 리미티드 mobile robot
CN210998737U (en) * 2019-11-12 2020-07-14 上海肇观电子科技有限公司 Mobile robot
ES2939332A1 (en) * 2023-02-03 2023-04-20 Ostirion S L U PROCEDURE AND CONTROL EQUIPMENT FOR MOBILE ROBOTS WITHOUT A COMPUTER OR SENSORS ON BOARD (Machine-translation by Google Translate, not legally binding)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006134218A (en) * 2004-11-09 2006-05-25 Funai Electric Co Ltd Cleaning robot with intruder repelling function
CN104079916A (en) * 2014-06-16 2014-10-01 深圳市德赛微电子技术有限公司 Panoramic three-dimensional visual sensor and using method
CN204618113U (en) * 2015-04-30 2015-09-09 李东梅 Monitoring clean robot
JP2016120168A (en) * 2014-12-25 2016-07-07 株式会社東芝 Vacuum cleaner
CN206484535U (en) * 2017-01-22 2017-09-12 深圳悉罗机器人有限公司 Mobile robot




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 518110, Longhua New District incubator building, 1301 sightseeing Road, Guanlan District, Longhua New District, Guangdong, Shenzhen province C901

Applicant after: Shenzhen flying mouse Power Technology Co.,Ltd.

Address before: 518110, Longhua New District incubator building, 1301 sightseeing Road, Guanlan District, Longhua New District, Guangdong, Shenzhen province C901

Applicant before: Shenzhen Xiluo Robot Co.,Ltd.

GR01 Patent grant