CN113805591A - Visual system and method for synchronous positioning and mapping and mobile robot - Google Patents


Info

Publication number
CN113805591A
CN113805591A (application CN202111117088.1A)
Authority
CN
China
Prior art keywords
camera
angle
mapping
positioning
reflecting device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111117088.1A
Other languages
Chinese (zh)
Inventor
郭先清
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Congsi Microelectronics Technology Co ltd
Original Assignee
Shenzhen Congsi Microelectronics Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Congsi Microelectronics Technology Co ltd filed Critical Shenzhen Congsi Microelectronics Technology Co ltd
Priority to CN202111117088.1A priority Critical patent/CN113805591A/en
Publication of CN113805591A publication Critical patent/CN113805591A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to a vision system and method for synchronous positioning and mapping, and to a mobile robot. A camera and a reflecting device are arranged, with the reflecting device above the camera; the camera receives visual information of the external environment through the reflecting device to form an image. The inclination angle of the reflecting device is adjusted to obtain images in different viewing-angle ranges, and positioning or modeling is performed based on feature information in the images. The optical axis of the camera in the optical system does not face the advancing direction of the mobile robot; instead, it forms an obtuse or right angle with the advancing direction of the machine body relative to the top plane of the body. The optical system is arranged inside the mobile robot close to the structural shell, does not protrude above the top plane of the robot body, requires no sunken structure, and is flexible to install. By adjusting the inclination angles of the camera and the reflecting device, the optical system obtains a larger, adjustable field angle. Binocular vision is obtained with a single camera, realizing binocular positioning.

Description

Visual system and method for synchronous positioning and mapping and mobile robot
Technical Field
The invention relates to the technical field of machine vision, and in particular to a vision system and method for synchronous positioning and mapping, and to a mobile robot.
Background
To better collect effective environmental feature data, in some applications such as sweeping robots, the robot's synchronous positioning and mapping (SLAM) vision system collects environmental information at heights of 0.8-2.5 m in a room and extracts key features, which serve as feature markers of the indoor space. An auxiliary system searches and compares these feature markers to determine the specific position of the robot in the room and complete path-planning navigation.
In existing sweeping robots, the vision module is a sunken structure lower than the top plane of the robot body, located near the geometric center of the body. The camera is obliquely installed in the sunken structure, with its optical axis aligned with the advancing direction of the robot and pointing toward the horizontal plane, forming an acute angle of 30°-40° with the top plane of the body. The vertical field angle of the camera is 45°-65°. Because this design is built in near the geometric center of the body and is a sunken structure, the field of view is strongly occluded by the body; in particular, the region below the top plane of the body cannot be observed.
Disclosure of Invention
In view of the limitations of existing vision-system designs, the invention provides a vision system and method for synchronous positioning and mapping, and a mobile robot. The installation position and angle of the camera can be flexibly adjusted, and a larger field of view can be covered; the optical system can be widely applied to mobile robots and flexibly installed and implemented.
In order to achieve the above object, the present invention provides a vision system for synchronous positioning and mapping, comprising a reflection device, a camera, a first control mechanism and a processor;
the camera receives visual information of an external environment through the reflecting device to form an image;
the processor controls the first control mechanism to adjust the angle of the reflecting device, so that the camera can acquire images in different visual angle ranges; the processor obtains images of different view angle ranges for positioning or modeling.
Further, the system also comprises a second control mechanism capable of adjusting the angle of the camera.
Further, an included angle θ formed by the optical axis of the camera and the advancing direction of the machine body satisfies 90° ≤ θ ≤ 180°, and the reflecting device is located above the camera and reflects visual information of the external environment in the advancing direction to the camera.
Further, the first control mechanism drives the reflecting device to adjust the inclination angle at a step interval of 0.1-5 degrees.
Further, the included angle between the central visual axis of the vision system for synchronous positioning and mapping and the horizontal plane is 25°-45°, and the field angle in the vertical direction is 40°-70°.
A second aspect provides a mobile robot employing the above vision system for synchronous positioning and mapping.
Further, the vision system for synchronous positioning and mapping is installed at the front end of the mobile robot, and the external environment is observed through a window on the front end face of the mobile robot. Further, the external environment includes environmental information at heights of 0.8-2.5 meters.
A third aspect provides a method for synchronously positioning and mapping, including:
arranging a camera and a reflecting device, wherein the reflecting device is positioned above the camera, and the camera receives visual information of an external environment through the reflecting device to form an image;
adjusting the installation angle of the camera and the inclination angle of the reflecting device to enable the included angle between the central visual axis of the synchronous positioning and mapping visual system and the horizontal plane to be 25-45 degrees, and the angle of field in the vertical direction to be 40-70 degrees;
adjusting the inclination angle of the reflecting device to obtain images in different viewing angle ranges;
the localization or modeling is based on feature information in the image.
Further, when the effective feature information in the collected visual information of the external environment is insufficient to realize positioning or mapping, the angle of the reflecting device is adjusted, images in different viewing-angle ranges are obtained, and positioning or mapping is performed again. Further, the included angle θ formed by the optical axis of the camera and the advancing direction of the machine body is adjusted, with 90° ≤ θ ≤ 180°.
Further, the positioning or modeling based on feature information in the image includes: acquiring an image corresponding to a first inclination angle of the reflecting device and an image corresponding to a second inclination angle of the reflecting device, performing binocular positioning, and determining the distance to the target object.
Further, the coordinates (X, Z) of the object P in the camera coordinate system are taken as:
X = (t1/f)*(f*sinα - x1'*cosα)
Z = (t1/f)*(f*cosα + x1'*sinα)
where α is the rotation angle, f is the focal length of the camera lens, and d1'(x1', z1') is the imaging position of the object P in the virtual image C1' after rotation; D1 is the distance from the object P to the optical axis of the virtual image C1', with foot point q1, and t1 is the distance from C1' to q1.
Further, D1 and t1 are obtained by solving the following equations:
(f*sinα - x1'*cosα)*t1 + (f*sinα + x2'*cosα)*t2 = 2*f*r*sinα
(f*cosα + x1'*sinα)*t1 - (f*cosα - x2'*sinα)*t2 = 0
D1 = (|x1'|/f)*t1
where r is the distance from the rotation center O to the camera, and x2' is the corresponding imaging offset of P in the virtual image C2'.
The technical scheme of the invention has the following beneficial technical effects:
(1) The optical axis of the camera in the optical system of the invention does not face the advancing direction of the mobile robot; instead, it forms an obtuse or right angle with the advancing direction of the machine body relative to the top plane of the body. The camera collects visual information of the external environment through a reflecting device (e.g., a mirror or prism) in the optical system.
(2) The optical system is arranged inside the mobile robot close to the structural shell, does not protrude above the top plane of the robot body, requires no sunken structure, and is flexible to install.
(3) By adjusting the inclination angles of the camera and the reflecting device, the optical system observes the surrounding environment with a central angle 25°-45° above the horizontal plane, relative to the advancing direction of the sweeping robot, and a vertical field angle of 40°-70°, collecting visual information; the field angle is larger and adjustable.
(4) The inclination angle of the reflecting device is adjusted by the angle control mechanism according to algorithm requirements, at intervals of 0.1°-5°. The invention directly adjusts the optical system, enlarging the visual range, realizing a binocular-vision effect with a monocular camera, and observing and positioning the surrounding environment.
Drawings
FIG. 1 is a schematic view of an imaging angle and a field angle of a vision system at a first angle;
FIG. 2 is a schematic view of an imaging angle and a field angle of a second angle of the vision system;
FIG. 3 is a schematic view of a vision system with viewing angles and field angles;
FIG. 4 is a schematic of a vision system installation location;
FIG. 5 is a flow chart of the use of the vision system for simultaneous positioning and mapping;
FIG. 6 is a schematic view of binocular vision;
FIG. 7 is a schematic diagram of the binocular positioning calculation.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in further detail with reference to the accompanying drawings in conjunction with the following detailed description. It should be understood that the description is intended to be exemplary only, and is not intended to limit the scope of the present invention. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present invention.
The invention provides a vision system for synchronous positioning and mapping; with reference to FIG. 1, it comprises a reflecting device, a camera, a first control mechanism, a second control mechanism, and a processor.
The camera receives visual information of an external environment through the reflecting device to form an image.
The second control mechanism adjusts the angle of the camera during installation so that the vertical field angle β of the camera is 40°-70°.
The first control mechanism adjusts the angle of the reflecting device.
The processor adjusts the angle of the reflecting device through the first control mechanism to obtain images in different view angle ranges; the localization or modeling is based on feature information in the image.
An included angle formed by the optical axis of the camera and the advancing direction of the machine body is greater than or equal to 90° and less than or equal to 180°; the reflecting device is located above the camera and reflects visual information of the external environment in the advancing direction to the camera.
Further, the first control mechanism drives the reflecting device to adjust the inclination angle at a step interval of 0.1-5 degrees.
The camera does not face the advancing direction; it obtains environmental information of the advancing direction through the reflecting device.
α: the central viewing angle of the vision system; it points in the advancing direction of the machine body, above the horizontal plane, forming an included angle of 25°-45° with the horizontal plane.
β: the vertical field angle of the camera, 40°-70°.
γ: the included angle between the optical axis of the camera and the advancing direction of the machine body.
θ: the included angle between the reflecting device and the horizontal plane along the advancing direction of the machine body.
In one embodiment, as shown in FIG. 1, the camera is mounted vertically: γ is a right angle, and the optical axis of the camera is perpendicular to the horizontal plane (and to the top plane of the body).
The camera can be installed vertically or obliquely. In another embodiment, as shown in FIG. 2, the optical axis of the camera is directed opposite to the advancing direction of the body, forming an obtuse angle of less than 180° with the horizontal plane (or the top plane) relative to the advancing direction. Environmental information of the advancing direction is obtained by adjusting the reflection angle of the reflecting device. The installation angle of the camera can therefore be adjusted according to actual installation requirements; within the obtuse-angle range, the camera can be installed at any inclination, with the angle θ of the reflecting device increased correspondingly.
In summary, the camera and the reflecting device can be installed and adjusted with the included angle between the optical axis of the camera and the advancing direction of the body anywhere from 90° to 180° (inclusive), obtaining the same visual effect as the prior art, while the inclination angle of the reflecting device can be adjusted to obtain visual information of the external environment over a larger range.
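As a rough numerical check of the mounting geometry described above, the direction of the central viewing ray can be computed from the camera angle and the mirror tilt using the law of reflection. The following is a minimal 2-D sketch in the robot's forward/vertical plane; the function name and the vertical-camera default are illustrative assumptions, not taken from the patent.

```python
import math

def reflected_elevation(theta_deg, camera_elev_deg=90.0):
    """Elevation (degrees above horizontal) of the central viewing ray after
    reflecting the camera's optical axis off a mirror tilted theta_deg from
    the horizontal.  2-D model in the forward (x) / vertical (y) plane."""
    t = math.radians(theta_deg)
    c = math.radians(camera_elev_deg)
    d = (math.cos(c), math.sin(c))            # optical-axis direction (unit)
    n = (math.sin(t), -math.cos(t))           # mirror normal (unit)
    dot = d[0] * n[0] + d[1] * n[1]
    # law of reflection: r = d - 2 (d . n) n
    r = (d[0] - 2 * dot * n[0], d[1] - 2 * dot * n[1])
    return math.degrees(math.atan2(r[1], r[0]))
```

For a vertically mounted camera (γ = 90°) this reduces to an elevation of 2θ - 90°, so mirror tilts of 57.5°-67.5° produce the central viewing angles of 25°-45° described above.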
When the external information collected by the vision system is processed by the algorithm and the amount of effective feature information is insufficient for the robot system to make a positioning or mapping decision (for example, when the collected pictures are all white wall), the angle control mechanism of the reflecting device, such as a motor, can be started, and the α angle is increased or decreased at step intervals of 0.1°-5°, so that a larger visual range is covered when collecting environmental feature information, allowing the system to make positioning or mapping decisions more effectively.
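The fallback behaviour described above (too few features triggers a stepped sweep of the mirror) can be sketched as a simple control loop. All names (`capture`, `count_features`) and the thresholds are hypothetical stand-ins for the robot's actual vision pipeline, which the patent does not specify.

```python
def adjust_view_until_features(capture, count_features, alpha_deg,
                               step_deg=1.0, lo=25.0, hi=45.0, min_feat=20):
    """Step the mirror within [lo, hi] degrees until the captured image
    yields enough features for a positioning/mapping decision.
    Returns (angle, image) on success, or None if no angle suffices."""
    direction = 1.0
    for _ in range(int((hi - lo) / step_deg) * 2):
        image = capture(alpha_deg)
        if count_features(image) >= min_feat:
            return alpha_deg, image
        alpha_deg += direction * step_deg
        if alpha_deg > hi or alpha_deg < lo:   # reverse at the range limits
            direction = -direction
            alpha_deg = max(lo, min(hi, alpha_deg))
    return None
```

In practice `capture` would command the first control mechanism and grab a frame, and `count_features` would run the feature extractor of the SLAM front end.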
Further, binocular positioning can also be realized through images at different angles.
Further, in one embodiment an event-driven camera (also called a dynamic image sensor) is adopted to directly acquire environmental feature values for synchronous positioning, mapping, and obstacle avoidance of the mobile robot. The viewing angle of the vision module is controlled and adjusted based on the environmental features acquired from the event-driven camera.
In another aspect, a mobile robot is provided, employing the above vision system for synchronous positioning and mapping. As shown in FIGS. 3-4, the vision system is installed at the front end of the mobile robot, and the external environment is observed through a window on the front end face of the mobile robot. α is the central viewing angle, forming an included angle of 25°-45° with the horizontal plane; within this range it can be adjusted by the processor through the control mechanism according to the requirements of the algorithm. β is the field angle in the vertical direction, 40°-70°. Further, the external environment includes environmental information at heights of 0.8-2.5 meters; for example, an image of the wall at a height of 0.8-2.5 m is acquired to obtain environmental feature information.
When the mobile robot system starts, the vision system is started and initialized, and the reflecting-device control mechanism adjusts the reflecting device to a preset angle so that the system viewing angle takes a certain value within 25°-45°. The vision system then collects data as the robot moves; after processing by the vision algorithm, the SLAM algorithm performs positioning, mapping, and obstacle-avoidance decisions. When the effective feature information in the collected external information is insufficient for the robot system to make a positioning, mapping, or obstacle-avoidance decision (for example, the collected picture is a white wall), the robot stops moving, the angle control mechanism of the reflecting device (such as a motor) is started, and the reflecting device is adjusted at step intervals of 0.1°-5° so that the central viewing angle α of the vision system swings up and down within 25°-45°. Visual information is collected during the swing, forming a binocular-vision effect and completing the positioning of the mobile robot. At the same time, because the central viewing angle moves up and down, the vision system covers a larger surrounding environment.
The invention provides a method for synchronous positioning and mapping; with reference to FIG. 5, it comprises the following steps:
the camera receives visual information of an external environment through the reflecting device to form an image. The reflecting device is positioned above the camera and reflects the visual information of the external environment in the advancing direction to the camera.
The installation angle of the camera and the inclination angle of the reflecting device are adjusted so that the included angle between the central visual axis of the vision system for synchronous positioning and mapping and the horizontal plane is 25°-45°, and the field angle in the vertical direction is 40°-70°. Further, the included angle θ formed by the optical axis of the camera and the advancing direction of the machine body is adjusted, with 90° ≤ θ ≤ 180°.
And adjusting the inclination angle of the reflecting device to obtain images in different viewing angle ranges.
The localization or modeling is based on feature information in the image.
Further, when the effective feature information in the collected visual information of the external environment is insufficient to realize positioning or mapping, the angle of the reflecting device is adjusted, images in different viewing-angle ranges are obtained, feature information is extracted, and the SLAM algorithm performs positioning, mapping, or obstacle avoidance.
Further, the positioning or modeling based on feature information in the image includes: acquiring an image corresponding to a first inclination angle of the reflecting device and an image corresponding to a second inclination angle of the reflecting device, performing binocular positioning, and determining the distance to the target object.
FIG. 6 is a schematic view of binocular vision. C1' and C2' are the virtual images formed by the camera C before and after the mirror rotates, O is the rotation center of the mirror, and α is the rotation angle. With reference to FIG. 7, P is the object actually observed, the angle C1'OC2' is 2α, OP is the distance from the object P to the rotation center O of the mirror, and r is the distance from the rotation center O to the camera, with r = C1'O = C2'O. f is the focal length of the camera lens, i.e. f = C1'b1. C1'q1 lies along the optical axis of the virtual image C1', and C2'q2 along the optical axis of the virtual image C2'. d1' and d2' are the positions of the object P on the imaging planes of the virtual images C1' and C2'. Taking the intersection points of the optical axes of C1' and C2' with the focal planes as coordinate origins, the coordinates are d1'(x1', y1') and d2'(x2', y2'). The distance from d1' to the optical axis of C1' is L1 = |x1'|, and the distance from d2' to the optical axis of C2' is L2 = |x2'|.
Let C1'q1 = t1, Pq1 = D1, C2'q2 = t2, and Pq2 = D2. Solving for t1, t2, D1, and D2 gives the coordinates of point P.
Take the straight line through the virtual images C1' and C2' as the X axis, with the direction from C1' to C2' as the positive X direction; through C1' construct the Z axis perpendicular to the X axis, with the direction toward the object P as positive, forming the XZ coordinate system.
T is the baseline, T = 2*r*sinα. The coordinates of d1' in the XZ coordinate system are d1(x1, z1), and the coordinates of d2' are d2(x2, z2). The calculation process is as follows:
(1) x1 = f*sinα - x1'*cosα
(2) z1 = f*cosα + x1'*sinα
(3) x2 = T - (f*sinα + x2'*cosα) = 2*r*sinα - f*sinα - x2'*cosα
(4) z2 = f*cosα - x2'*sinα
(5) X = (t1/f)*x1
(6) Z = (t1/f)*z1
(7) X - T = (t2/f)*(x2 - T)
(8) Z = (t2/f)*z2
(9) L1 = |x1'|
(10) L2 = |x2'|
A system of equations is obtained:
(t1/f)*x1 = T + (t2/f)*(x2 - T)
(t1/f)*z1 = (t2/f)*z2
The system of equations is collated to obtain:
x1*t1 - (x2 - T)*t2 = f*T
z1*t1 - z2*t2 = 0
solving for t1、t2And then solving to obtain D1、D2. The P point coordinates (X, Z) are:
X1/X=Z1/Z=L1/D1
Figure BDA0003275881450000087
Figure BDA0003275881450000088
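The derivation above can be checked numerically. The sketch below implements equations (1)-(4), the collated linear system, and the back-substitution for (X, Z) in plain Python; the function name `triangulate` and the argument ordering are illustrative choices, not taken from the patent.

```python
import math

def triangulate(x1p, x2p, f, r, alpha_deg):
    """Recover the planar coordinates (X, Z) of point P from its image
    offsets x1', x2' in the two virtual cameras C1', C2', which are
    separated by mirror rotation angle alpha about center O (r = C1'O)."""
    a = math.radians(alpha_deg)
    sa, ca = math.sin(a), math.cos(a)
    T = 2 * r * sa                        # baseline
    x1 = f * sa - x1p * ca                # eq. (1)
    z1 = f * ca + x1p * sa                # eq. (2)
    x2 = T - (f * sa + x2p * ca)          # eq. (3)
    z2 = f * ca - x2p * sa                # eq. (4)
    # collated system: x1*t1 - (x2 - T)*t2 = f*T ; z1*t1 - z2*t2 = 0
    det = x1 * z2 - (x2 - T) * z1         # assumed non-zero (rays not parallel)
    t1 = f * T * z2 / det
    X = t1 * x1 / f                       # scale d1 along the ray C1' -> P
    Z = t1 * z1 / f
    return X, Z
```

A quick consistency test is to project a known point P into both virtual cameras with the forward model implied by (1)-(4) and verify that `triangulate` recovers it.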
the invention relates to a vision system, a method and a mobile robot for synchronous positioning and mapping, wherein a camera and a reflecting device are arranged, the reflecting device is positioned above the camera, and the camera receives visual information of an external environment through the reflecting device to form an image; adjusting the inclination angle of the reflecting device to obtain images in different viewing angle ranges; the localization or modeling is based on feature information in the image. The optical axis of the camera in the optical system of the invention does not face the advancing direction of the mobile robot, but forms an obtuse angle or a right angle with the advancing direction of the machine body relative to the optical axis of the camera on the top plane of the machine body. The optical system is arranged at a position close to the structure shell in the mobile robot, does not protrude out of the top plane of the robot body, does not need to be provided with a sinking structure, and is flexible to install. By adjusting the inclination angles of the camera and the reflecting device, the optical system can obtain a larger field angle, and the field angle can be adjusted. Binocular vision is obtained by using one camera, and binocular positioning is realized.
It is to be understood that the above-described embodiments of the present invention are merely illustrative of or explaining the principles of the invention and are not to be construed as limiting the invention. Therefore, any modification, equivalent replacement, improvement and the like made without departing from the spirit and scope of the present invention should be included in the protection scope of the present invention. Further, it is intended that the appended claims cover all such variations and modifications as fall within the scope and boundaries of the appended claims or the equivalents of such scope and boundaries.

Claims (10)

1. A visual system for synchronous positioning and mapping is characterized by comprising a reflecting device, a camera, a first control mechanism and a processor;
the camera receives visual information of an external environment through the reflecting device to form an image;
the processor controls the first control mechanism to adjust the angle of the reflecting device, so that the camera can acquire images in different visual angle ranges; the processor obtains images of different view angle ranges for positioning or modeling.
2. The vision system for synchronized positioning and mapping of claim 1, further comprising a second control mechanism capable of adjusting an angle of said camera.
3. The visual system for synchronously positioning and mapping as claimed in claim 1 or 2, wherein an included angle θ formed by an optical axis of said camera and a forward direction of the machine body is larger than or equal to 90 ° and smaller than or equal to 180 °, and said reflection device is located above said camera and reflects visual information of an external environment in the forward direction to the camera.
4. The vision system for synchronous positioning and mapping of claim 3, wherein said first control mechanism drives said reflection device to adjust the tilt angle at stepped intervals of 0.1 ° to 5 °.
5. The vision system for synchronous positioning and mapping as claimed in claim 1 or 2, wherein the angle between the central visual axis of the vision system for synchronous positioning and mapping and the horizontal plane is 25 ° to 45 °, and the angle of view in the vertical direction is 40 ° to 70 °.
6. A mobile robot, characterized in that a vision system for simultaneous localization and mapping as claimed in any one of claims 1 to 5 is used.
7. The mobile robot of claim 6, wherein the vision system for synchronized positioning and mapping is mounted on the front end of the mobile robot, and the external environment is captured by a window on the front end of the mobile robot. Further, the external environment includes environmental information at an altitude of 0.8 to 2.5 meters.
8. A method for synchronously positioning and mapping is characterized by comprising the following steps:
arranging a camera and a reflecting device, wherein the reflecting device is positioned above the camera, and the camera receives visual information of an external environment through the reflecting device to form an image;
adjusting the installation angle of the camera and the inclination angle of the reflecting device to enable the included angle between the central visual axis of the synchronous positioning and mapping visual system and the horizontal plane to be 25-45 degrees, and the angle of field in the vertical direction to be 40-70 degrees;
adjusting the inclination angle of the reflecting device to obtain images in different viewing angle ranges;
the localization or modeling is based on feature information in the image.
9. The method of synchronized positioning and mapping of claim 8, wherein: when the effective feature information in the collected visual information of the external environment is insufficient to realize positioning or mapping, the angle of the reflecting device is adjusted, images in different viewing-angle ranges are obtained, and positioning or mapping is performed again. Further, the included angle θ formed by the optical axis of the camera and the advancing direction of the machine body is adjusted, with 90° ≤ θ ≤ 180°.
10. The method for synchronously positioning and mapping as claimed in claim 8, wherein the positioning or modeling based on feature information in the image comprises: acquiring an image corresponding to a first inclination angle of the reflecting device and an image corresponding to a second inclination angle of the reflecting device, performing binocular positioning, and determining the distance to the target object.
Further, the coordinates (X, Z) of the object P in the camera coordinate system are taken as:
X = (t1/f)*(f*sinα - x1'*cosα)
Z = (t1/f)*(f*cosα + x1'*sinα)
where α is the rotation angle, f is the focal length of the camera lens, and d1'(x1', z1') is the imaging position of the object P in the virtual image C1' after rotation; D1 is the distance from the object P to the optical axis of the virtual image C1', with foot point q1, and t1 is the distance from C1' to q1.
Further, D1 and t1 are obtained by solving the following equations:
(f*sinα - x1'*cosα)*t1 + (f*sinα + x2'*cosα)*t2 = 2*f*r*sinα
(f*cosα + x1'*sinα)*t1 - (f*cosα - x2'*sinα)*t2 = 0
D1 = (|x1'|/f)*t1
where r is the distance from the rotation center O to the camera, and x2' is the corresponding imaging offset of P in the virtual image C2'.
CN202111117088.1A 2021-09-23 2021-09-23 Visual system and method for synchronous positioning and mapping and mobile robot Pending CN113805591A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111117088.1A CN113805591A (en) 2021-09-23 2021-09-23 Visual system and method for synchronous positioning and mapping and mobile robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111117088.1A CN113805591A (en) 2021-09-23 2021-09-23 Visual system and method for synchronous positioning and mapping and mobile robot

Publications (1)

Publication Number Publication Date
CN113805591A true CN113805591A (en) 2021-12-17

Family

ID=78940209

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111117088.1A Pending CN113805591A (en) 2021-09-23 2021-09-23 Visual system and method for synchronous positioning and mapping and mobile robot

Country Status (1)

Country Link
CN (1) CN113805591A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07322104A (en) * 1994-05-24 1995-12-08 A S I Kk Monitoring device
JPH08194809A (en) * 1995-01-20 1996-07-30 Nippon Avionics Co Ltd 360° monitoring system
JP2002166380A (en) * 2000-12-01 2002-06-11 Mitsubishi Heavy Ind Ltd Robot visual device
JP2002218294A (en) * 2001-01-17 2002-08-02 Nec Corp Method and device for picking up image over wide field angle range
JP2004353281A (en) * 2003-05-29 2004-12-16 Hitachi Constr Mach Co Ltd Visual field expansion device for construction machine
CN103294057A (en) * 2012-02-24 2013-09-11 三星电子株式会社 Sensor assembly and robot cleaner having the same
CN110324571A (en) * 2018-03-29 2019-10-11 株式会社日立制作所 Moving body photographic device and moving body image capture method
CN209486459U (en) * 2019-03-22 2019-10-11 厦门集长新材料科技有限公司 Single camera panorama shooting device
CN111077915A (en) * 2019-12-27 2020-04-28 成都英飞睿技术有限公司 Panoramic monitoring control method, device and equipment and readable storage medium

Similar Documents

Publication Publication Date Title
EP2973414B1 (en) Apparatus for generation of a room model
CN108227914B (en) Transparent display device, control method using the same, and controller thereof
CN108140235B (en) System and method for generating visual display of image
JP5588812B2 (en) Image processing apparatus and imaging apparatus using the same
US11397245B2 (en) Surveying instrument for scanning an object and for projection of information
EP3246660A1 (en) System and method for referencing a displaying device relative to a surveying instrument
US20140160012A1 (en) Automatic correction device of vehicle display system and method thereof
JP2014529727A (en) Automatic scene calibration
WO2020053936A1 (en) Camera installation support device and method, installation angle calculation method, program, and recording medium
US20190236847A1 (en) Method and system for aligning digital display of images on augmented reality glasses with physical surrounds
JP4132068B2 (en) Image processing apparatus, three-dimensional measuring apparatus, and program for image processing apparatus
EP3821780A1 (en) Mobile robot
WO2020151268A1 (en) Generation method for 3d asteroid dynamic map and portable terminal
JP2018139084A (en) Device, moving object device and method
US20160037154A1 (en) Image processing system and method
WO2021258251A1 (en) Surveying and mapping method for movable platform, and movable platform and storage medium
CN111811462A (en) Large-component portable visual ranging system and method in extreme environment
CN110750153A (en) Dynamic virtualization device of unmanned vehicle
JP2023505891A (en) Methods for measuring environmental topography
CN111664839A (en) Vehicle-mounted head-up display virtual image distance measuring method
CN113805591A (en) Visual system and method for synchronous positioning and mapping and mobile robot
CN113496503B (en) Point cloud data generation and real-time display method, device, equipment and medium
Gehrig et al. 6D vision goes fisheye for intersection assistance
KR20090047145A (en) Method for detecting invisible obstacle of robot
Iguchi et al. Omni-directional 3D measurement using double fish-eye stereo vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination