WO2019105009A1 - Method and system for synchronized scanning of space - Google Patents

Method and system for synchronized scanning of space

Info

Publication number
WO2019105009A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
scanning devices
scanning
capturing module
space
Application number
PCT/CN2018/091583
Other languages
French (fr)
Inventor
Seng Fook LEE
Original Assignee
Guangdong Kang Yun Technologies Limited
Priority date
2017-12-03
Filing date
2018-06-15
Publication date
2019-06-06
Application filed by Guangdong Kang Yun Technologies Limited
Publication of WO2019105009A1

Classifications

    • G: PHYSICS
        • G01: MEASURING; TESTING
            • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
                • G01C15/00: Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00
                    • G01C15/002: Active optical surveying means
                • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
                    • G01C21/005: Navigation with correlation of navigation data from several sources, e.g. map or contour matching
                • G01C22/00: Measuring distance traversed on the ground by vehicles, persons, animals or other moving solid bodies, e.g. using odometers, using pedometers
        • G06: COMPUTING; CALCULATING OR COUNTING
            • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T3/00: Geometric image transformations in the plane of the image
                    • G06T3/40: Scaling of whole images or parts thereof, e.g. expanding or contracting
                        • G06T3/4038: Image mosaicing, e.g. composing plane images from plane sub-images
                • G06T15/00: 3D [Three Dimensional] image rendering
                • G06T19/00: Manipulating 3D models or images for computer graphics

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Automation & Control Theory (AREA)
  • Toys (AREA)
  • Image Processing (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A system and a method for synchronized scanning of a space are disclosed. The system includes a plurality of scanning devices, each comprising a first, a second, and a third data capturing module. The data captured by the first and second data capturing modules is used to generate geometrical proximity data, which in turn defines the coverage area of each of the plurality of scanning devices for synchronized scanning. Further, the data captured by the third data capturing module of each of the plurality of scanning devices is stitched and combined in real time to generate a 3-dimensional view of the space being scanned.

Description

[Title established by the ISA under Rule 37.2] METHOD AND SYSTEM FOR SYNCHRONIZED SCANNING OF SPACE
FIELD OF INVENTION
The present invention relates to the scanning of an environment. More particularly, the present invention relates to synchronizing the scanning of an indoor environment and generating complete information about the indoor environment in real time using multiple scanning devices.
Summary
According to an embodiment of the invention, there is provided a method for synchronized scanning of a space by a plurality of scanning devices. The method includes the steps of collecting, by each of the plurality of scanning devices, surrounding environment information using a first data capturing module. Further, the method includes collecting, by each of the plurality of scanning devices, movement data of that scanning device using a second data capturing module. The surrounding environment data and the movement data are combined by a processor to generate geometrical proximity data. Furthermore, the method includes generating, by each of the plurality of scanning devices, point cloud data, wherein the point cloud data is created from pictorial information of the space surrounding the scanning device captured by a third data capturing module. Further, the method includes determining the coverage area of each of the plurality of scanning devices based on the geometrical proximity data of each scanning device, and stitching the point cloud data of each of the plurality of scanning devices in real time to generate a 3-dimensional mapping image of the space.
According to another embodiment of the invention, there is provided a system for synchronized scanning of a space. The system includes a plurality of scanning devices, wherein each of the plurality of scanning devices comprises a first data capturing module configured to capture surrounding environment information, a second data capturing module configured to capture movement data of the scanning device, and a third data capturing module configured to capture pictorial information of the surrounding environment. The system further includes a server, connected to each of the plurality of scanning devices, configured to receive the surrounding environment data, movement data, and pictorial information from each of the plurality of scanning devices; combine the surrounding environment data and the movement data to generate geometrical proximity data; generate point cloud data from the pictorial information; determine the coverage area of each of the plurality of scanning devices based on the geometrical proximity data of each scanning device; stitch the point cloud data of each of the plurality of scanning devices in real time; and generate a 3-dimensional mapping image of the space.
An object of the presently disclosed subject matter having been stated herein above, and which is achieved in whole or in part by the presently disclosed subject matter, other objects will become evident as the description proceeds when taken in connection with the accompanying drawings as best described herein below.
Brief Description of the Drawings
The foregoing summary, as well as the following detailed description of various embodiments, is better understood when read in conjunction with the appended drawings. For the purposes of illustration, there is shown in the drawings exemplary embodiments; however, the presently disclosed subject matter is not limited to the specific methods and instrumentalities disclosed. In the drawings:
FIG. 1 is a block diagram depicting a scanning device for scanning a space, in accordance with an embodiment of the invention;
FIG. 2 is a block diagram of an autonomous robot and its various internal components, in accordance with another embodiment of the invention;
FIG. 3A is a block diagram depicting a space to be scanned and synchronization, in accordance with an embodiment of the invention;
FIG. 3B is a block diagram depicting a space to be scanned and synchronization, in accordance with another embodiment of the invention;
FIG. 4 is a block diagram depicting a server and its internal components, in accordance with an embodiment of the invention; and
FIG. 5 is a flow chart depicting a synchronized scanning method, in accordance with an embodiment of the invention.
Detailed Description
The disclosure set forth above may encompass multiple distinct inventions with independent utility. Although each of these inventions has been disclosed in its preferred form(s), the specific embodiments thereof as disclosed and illustrated herein are not to be considered in a limiting sense, because numerous variations are possible. The subject matter of the inventions includes all novel and nonobvious combinations and subcombinations of the various elements, features, functions, and/or properties disclosed herein. The following claims particularly point out certain combinations and subcombinations regarded as novel and nonobvious. Inventions embodied in other combinations and subcombinations of features, functions, elements, and/or properties may be claimed in applications claiming priority from this or a related application. Such claims, whether directed to a different invention or to the same invention, and whether broader, narrower, equal, or different in scope to the original claims, also are regarded as included within the subject matter of the inventions of the present disclosure.
Now referring to FIG. 1, illustrating a scanning device 100 (autonomous robot, used interchangeably) and its various parts, in accordance with an embodiment of the invention. The robot includes a main frame 110 and a plurality of legs 112. The main frame 110 supports the third data capturing module, that is, the plurality of cameras 106A-N (cumulatively referred to as cameras 106). The main frame may be made of any one or a combination of wood, metal, alloy, plastic, rubber, or fiber. The legs 112 provide a reachable and scannable height to the robot 100. In an embodiment of the invention, the plurality of cameras may include fisheye lenses in order to capture a spherical view of the corresponding region. The main frame 110 includes the first data capturing module 102, the second data capturing module 104, and the third data capturing module 106. In an aspect of the invention, each leg of the plurality of legs 112 of the robot 100 may include at least one second data capturing module 104. The placement may be strategic so as to closely map the movement behavior of the robot 100.
In another embodiment of the invention, the plurality of legs may include means for movement, such as wheels, that allow the robot 100 to be maneuvered, or to maneuver itself, in any direction without hassle.
In another embodiment of the invention, the supporting frame 110 may be of any shape and size. The shape depicted in the figure is for illustration purposes and should not be considered restrictive for the invention in any manner.
In another embodiment of the invention, the first data capturing module 102 may be mounted on the top of the robot 100.
Now referring to FIG. 2, a block diagram of a scanning device 100 and its internal components for scanning a space, in accordance with an embodiment of the invention. The system 100 may be an autonomous robot 100; the two terms are used interchangeably in the following detailed description. The autonomous robot 100 may be manually controlled, autonomous, or a combination of both. The autonomous robot 100 may also be controlled using a mobile application. The autonomous robot 100 includes a first data capturing module 202, a second data capturing module 204, a third data capturing module 206, and a processor 210.
According to an aspect of the invention, the first data capturing module 202 may be a stereo vision system or a LiDAR. For carrying out the invention in the best mode, LiDAR is more suited because of its ability to identify and map objects very close to the robot 100. The LiDAR may be placed on the top of the robot 100 in order to enable the robot 100 to scan the entire environment.
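The specification does not fix a data format for the LiDAR returns. As a minimal illustrative sketch, assuming the first data capturing module delivers a planar scan as one range reading per beam (all names below are hypothetical, not from the patent), the surrounding environment information can be converted into Cartesian points in the robot's local frame as follows:

```python
import math

def lidar_scan_to_points(ranges, angle_min, angle_increment):
    """Convert a planar LiDAR scan (one range per beam) into 2-D points
    in the robot's local frame. Beams with no return (inf/NaN) are skipped."""
    points = []
    for i, r in enumerate(ranges):
        if not math.isfinite(r):
            continue  # no return for this beam
        theta = angle_min + i * angle_increment
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

# Example: a 4-beam scan sweeping 90 degrees starting at 0 rad
print(lidar_scan_to_points([1.0, 2.0, float("inf"), 1.5], 0.0, math.pi / 6))
```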
According to another aspect of the invention, the second data capturing module 204 may be odometer sensors that capture movement data of the autonomous robot 100. The odometer sensors identify how far the robot 100 has moved and the distance it has traversed, and are able to identify the movement of the robot 100 accurately.
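The patent only states that the odometer sensors report how far the robot has moved. A common way to turn such increments into usable movement data is planar dead reckoning, sketched below under the assumption (not stated in the specification) that each odometry update carries a distance travelled and a heading change:

```python
import math

class OdometryTracker:
    """Accumulate (distance, heading-change) increments into a 2-D pose."""

    def __init__(self):
        self.x = self.y = self.theta = 0.0  # pose in the starting frame

    def update(self, distance, delta_theta):
        # Apply the heading change first, then move along the new heading.
        self.theta += delta_theta
        self.x += distance * math.cos(self.theta)
        self.y += distance * math.sin(self.theta)
        return self.x, self.y, self.theta

tracker = OdometryTracker()
tracker.update(1.0, 0.0)                  # forward 1 m
print(tracker.update(1.0, math.pi / 2))   # turn left, forward 1 m
```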
According to another aspect of the invention, the third data capturing module 206 may be a plurality of cameras. There may be at least two cameras to capture pictorial information of the environment surrounding the robot 100. The cameras may include fisheye lenses that capture the data in a spherical manner. According to another aspect, there may be four cameras, one facing each direction, to capture pictorial information in 360 degrees at an instant.
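The patent does not prescribe how pictorial information from two or more cameras becomes depth data for a point cloud. One standard technique is stereo triangulation; the sketch below assumes a rectified stereo pair with a known focal length and baseline (the function and its parameters are illustrative assumptions, not the patent's method):

```python
def disparity_to_point(u, v, disparity, fx, fy, cx, cy, baseline):
    """Back-project a pixel with a known disparity into a 3-D point
    in the left camera's frame (pinhole model, rectified stereo)."""
    if disparity <= 0:
        raise ValueError("disparity must be positive")
    z = fx * baseline / disparity        # depth from triangulation
    x = (u - cx) * z / fx                # lateral offset
    y = (v - cy) * z / fy                # vertical offset
    return (x, y, z)

# Example: 640x480 camera, 0.12 m baseline, 8-pixel disparity
print(disparity_to_point(400, 240, 8.0, fx=525.0, fy=525.0,
                         cx=320.0, cy=240.0, baseline=0.12))
```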
The processor 210 may be a standalone processor with various internal hardware processing components, or the internal components may be implemented in software. The processing module 210 may also be a collection of processors functioning together to achieve functionality equivalent to that of a standalone processor.
Now referring to FIG. 3A, a block diagram depicting an environment 300 wherein a space 302 is scanned and the scanned information is synchronized, in accordance with an embodiment of the invention. The space 302 may be an indoor exhibition hall or a similar space. The space 302 may include multiple areas to be scanned, such as areas 302A, 302B, 302C, and 302D. Each of the multiple areas may be scanned using at least one of the plurality of scanning devices 100A, 100B, 100C, and 100D. Each of the scanning devices 100A, 100B, 100C, and 100D may be connected to a central server 304. The server 304 may receive information from each of the plurality of scanning devices 100A, 100B, 100C, and 100D. Details of the server and its functioning are explained later in conjunction with the detailed description of FIG. 4.
Now referring to FIG. 3B, a block diagram depicting an environment 300 wherein the space 302 is scanned and synchronization of the scanned information is performed using swarm intelligence algorithms, in accordance with another embodiment of the invention. In this embodiment, the scanning devices 100A, 100B, 100C, and 100D communicate with each other using swarm intelligence. This helps each of the scanning devices 100A, 100B, 100C, and 100D to be aware of each other's work and area of work, making synchronization of the data easy. The scanning devices 100A, 100B, 100C, and 100D may be connected to each other through wired or wireless networks. Wired networks may be local area network connections, whereas for wireless networks the scanning devices 100A, 100B, 100C, and 100D may use wireless protocols such as Wi-Fi, Bluetooth, ZigBee, Near Field Communication (NFC), and so on. They may share data with each other about their specific placements, the area scanned, the next area for coverage, and so on. Swarm intelligence lets the scanning devices 100A, 100B, 100C, and 100D interact with the surrounding environment and with each other to synchronize the scanning procedures. The interaction may follow a particular pattern or may be completely random, allowing the scanning devices 100A, 100B, 100C, and 100D to share information with each other so that every scanning device stays up to date on which areas have and have not been covered, adds this information to its own database, and thereby enables a complete 3-D image of the surrounding environment to be built.
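The patent leaves the swarm-intelligence exchange abstract. A minimal sketch of the idea, each device broadcasting the areas it has covered so peers can pick an uncovered area next, might look like the following (the message format and the fixed area list are assumptions for illustration, not part of the specification):

```python
class SwarmScanner:
    """Toy model of a scanning device that shares coverage with peers."""

    def __init__(self, name, all_areas):
        self.name = name
        self.known_covered = set()     # areas any device reports as done
        self.all_areas = set(all_areas)
        self.peers = []

    def broadcast(self, area):
        """Mark an area covered locally and inform every peer."""
        self.known_covered.add(area)
        for peer in self.peers:
            peer.known_covered.add(area)

    def next_area(self):
        """Pick the next uncovered area, or None when scanning is done."""
        remaining = sorted(self.all_areas - self.known_covered)
        return remaining[0] if remaining else None

areas = ["302A", "302B", "302C", "302D"]
a, b = SwarmScanner("100A", areas), SwarmScanner("100B", areas)
a.peers, b.peers = [b], [a]
a.broadcast(a.next_area())   # 100A takes 302A and tells 100B
print(b.next_area())         # 100B skips 302A and picks 302B
```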
Now referring to FIG. 4, a block diagram of the server 304 and its internal components is illustrated, configured to receive data from each of the plurality of scanning devices 100A, 100B, 100C, and 100D, in accordance with an embodiment of the invention.
The server 304 includes a data acquisition module 3042, a geometrical proximity data generator 3044, a point cloud data generator 3046, a coverage area identifier 3048, and a composite image generator 3050.
The data acquisition module 3042 is configured to receive data from the plurality of scanning devices 100A, 100B, 100C, and 100D. The data received from each of the scanning devices 100A, 100B, 100C, and 100D may also be stored in a memory (not shown in the figure), segregated by data type. The data acquisition module 3042 formats the received data into a format readable by the server 304 and its other modules. The geometrical proximity data generator 3044 receives the data of the first data capturing module 102 and the second data capturing module 104. Both sets of data are combined to generate geometrical proximity data. The geometrical proximity data defines the internal mapping of the space and various distance details. Further, the point cloud data generator 3046 converts the pictorial data captured by the third data capturing module 106 into point cloud data so that various objects within the scanned space 302 can be identified efficiently. The coverage area identifier 3048 uses the generated geometrical proximity data to identify the area scanned by each scanning device. The coverage area identified may be tagged as covered by a specific scanning device. This data may then be stored in the memory, and other data may be compared against it to identify overlapping areas and the like.
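The specification does not define "geometrical proximity data" formally. One plausible reading, combining each device's environment scan with its movement data so that scanned points land in a shared map frame, can be sketched as a rigid 2-D transform (the pose representation and names are assumptions):

```python
import math

def scan_to_map_frame(points, pose):
    """Place local scan points into the map frame using the device pose.

    points -- iterable of (x, y) in the device's local frame
    pose   -- (x, y, theta): device position and heading in the map frame
    """
    px, py, theta = pose
    c, s = math.cos(theta), math.sin(theta)
    # Rotate each point by theta, then translate by the device position.
    return [(px + c * x - s * y, py + s * x + c * y) for x, y in points]

# A point 1 m ahead of a device sitting at (2, 3) and facing +y
print(scan_to_map_frame([(1.0, 0.0)], (2.0, 3.0, math.pi / 2)))  # ~(2.0, 4.0)
```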
Further, the composite image generator 3050 combines the point cloud data generated from the data collected by the plurality of scanning devices 100A, 100B, 100C, and 100D. The combined data provides a 3-dimensional image of the space 302. The 3-dimensional image may then be forwarded to a display 306 of a user device (not shown in the figure).
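The stitching step in the composite image generator is likewise unspecified. Assuming each device's cloud has already been expressed in the shared map frame (for example with the transform sketched above), the simplest merge is concatenation plus a voxel-style de-duplication, shown here with a hypothetical grid size:

```python
def stitch_point_clouds(clouds, voxel=0.05):
    """Merge per-device point clouds, keeping one point per voxel cell
    to suppress duplicates where coverage areas overlap."""
    seen, merged = set(), []
    for cloud in clouds:
        for p in cloud:
            cell = tuple(round(c / voxel) for c in p)
            if cell not in seen:
                seen.add(cell)
                merged.append(p)
    return merged

cloud_a = [(0.00, 0.00, 0.0), (1.00, 0.00, 0.0)]
cloud_b = [(0.01, 0.01, 0.0), (2.00, 0.00, 0.0)]  # first point overlaps A
print(stitch_point_clouds([cloud_a, cloud_b]))    # 3 points survive
```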
Now referring to FIG. 5, illustrating a flow chart depicting a synchronized scanning method 500 for scanning the space 302 by the autonomous robot 100, in accordance with an embodiment of the invention. The method may include, but is not limited to, the steps mentioned below.
The method 500 initiates at step 502, wherein surrounding environment information is mapped by the first data capturing module 102. Further, the method 500 (at step 504) captures movement data of the robot 100 performing the scanning function.
At step 506, the surrounding environment data and the movement data are combined to form geometrical proximity data. Further, at step 508, pictorial information of the surrounding environment is captured. At step 510, the pictorial information is used to generate point cloud data.
At step 512, the coverage area of the space 302 being scanned is determined using the generated geometrical proximity data. At step 514, the point cloud data is stitched together. Further, at step 516, the stitched point cloud data is used to generate, in real time, a single 3-dimensional or 2-dimensional view that may be transmitted to a display of the user device.
This process is repeated at regular intervals until the scanning is completed and there is no change in the updated information for a certain number of data capture cycles. The scanning may also be stopped manually from the remote device, as described above.
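The termination rule, repeating until the map stops changing for a certain number of capture cycles, can be made concrete with a small loop. The cycle function, the stability threshold, and the cycle cap below are placeholders rather than values from the patent:

```python
def run_until_stable(capture_cycle, stable_cycles_required=3, max_cycles=1000):
    """Repeat capture cycles until the map stops changing for
    `stable_cycles_required` consecutive cycles (or a cap is hit)."""
    previous, stable = None, 0
    for _ in range(max_cycles):
        snapshot = capture_cycle()            # one full scan/stitch pass
        stable = stable + 1 if snapshot == previous else 0
        if stable >= stable_cycles_required:
            return snapshot                   # scan complete: map converged
        previous = snapshot
    return previous                           # safety stop, like a manual abort

# Toy cycle: the "map" grows until it reaches 5 points, then stays fixed
state = []
def cycle():
    if len(state) < 5:
        state.append(len(state))
    return tuple(state)

print(run_until_stable(cycle))  # (0, 1, 2, 3, 4)
```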
In an embodiment of the invention, the processor 210 may be configured to generate the geometrical proximity data and the point cloud data and send them to the server 304 for further processing, that is, 3-dimensional view generation in real time.

Claims (10)

  1. A method for synchronized scanning of a space by a plurality of scanning devices, comprising:
    collecting, by each of the plurality of scanning devices, surrounding environment information using a first data capturing module;
    collecting, by each of the plurality of scanning devices, movement data of that scanning device using a second data capturing module;
    combining, by a processor, the surrounding environment data and the movement data to generate geometrical proximity data;
    generating, by each of the plurality of scanning devices, point cloud data, wherein the point cloud data is generated using pictorial information of the surroundings;
    determining, by a server, a coverage area of each of the plurality of scanning devices based on the geometrical proximity data of each of the plurality of scanning devices;
    stitching, by the server, the point cloud data of each of the plurality of scanning devices in real-time; and
    generating a 3-dimensional mapping image of the space.
  2. The method of claim 1, wherein each of the plurality of scanning devices are autonomous robots.
  3. The method of claim 2, wherein each of the autonomous robots receives manual or automatic instructions.
  4. The method of claim 3, wherein the instructions are used to synchronize scanning of the space.
  5. A system for synchronized scanning of a space, comprising:
    a plurality of scanning devices, wherein each of the plurality of scanning devices further comprises:
    a first data capturing module configured to capture surrounding environment information;
    a second data capturing module configured to capture movement data of the scanning device; and
    a third data capturing module configured to capture pictorial information of the surrounding environment; and
    a server, connected to each of the plurality of scanning devices, configured to:
    receive the surrounding environment data, movement data, and pictorial information from each of the plurality of scanning devices;
    combine the surrounding environment data and the movement data to generate geometrical proximity data;
    generate point cloud data from the pictorial information;
    determine a coverage area of each of the plurality of scanning devices based on the geometrical proximity data of each of the plurality of scanning devices;
    stitch the point cloud data of each of the plurality of scanning devices in real-time; and
    generate a 3-dimensional mapping image of the space.
  6. The system of claim 5, wherein the server further coordinates communication between each of the plurality of scanning devices.
  7. The system of claim 6, wherein the communication is initiated manually or automatically.
  8. The system of claim 5, wherein the first data capturing module is a LiDAR sensor.
  9. The system of claim 5, wherein the second data capturing module is a plurality of odometer sensors.
  10. The system of claim 5, wherein the third data capturing module is a camera or an RGB-D camera.
PCT/CN2018/091583 2017-12-03 2018-06-15 Method and system for synchronized scanning of space WO2019105009A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762593953P 2017-12-03 2017-12-03
US62/593,953 2017-12-03

Publications (1)

Publication Number Publication Date
WO2019105009A1 (en) 2019-06-06

Family

Family ID: 63007250

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/091583 WO2019105009A1 (en) 2017-12-03 2018-06-15 Method and system for synchronized scanning of space

Country Status (2)

Country Link
CN (2) CN108364340A (en)
WO (1) WO2019105009A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102019206393A1 (en) * 2019-05-03 2020-11-05 BSH Hausgeräte GmbH Management of a building
CN113607051B (en) * 2021-07-24 2023-12-12 全图通位置网络有限公司 Acquisition method, system and storage medium of non-exposure space digital data
CN114777671A (en) * 2022-04-25 2022-07-22 武汉中观自动化科技有限公司 Workpiece model processing method, server, front-end equipment and three-dimensional scanning system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120215354A1 (en) * 2009-10-27 2012-08-23 Battelle Memorial Institute Semi-Autonomous Multi-Use Robot System and Method of Operation
CN102679959A (en) * 2012-05-03 2012-09-19 浙江工业大学 Omnibearing 3D (Three-Dimensional) modeling system based on initiative omnidirectional vision sensor
CN102831646A (en) * 2012-08-13 2012-12-19 东南大学 Scanning laser based large-scale three-dimensional terrain modeling method
CN103123727A (en) * 2011-11-21 2013-05-29 联想(北京)有限公司 Method and device for simultaneous positioning and map building
US20140232717A1 (en) * 2013-02-20 2014-08-21 Google Inc. Merging Three-Dimensional Models of Varying Resolution

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8457790B2 (en) * 2007-09-14 2013-06-04 Zimmer, Inc. Robotic calibration method
JP6181388B2 (en) * 2013-03-08 2017-08-16 株式会社トプコン Measuring device
CN104715469A (en) * 2013-12-13 2015-06-17 联想(北京)有限公司 Data processing method and electronic device
CN106325268A (en) * 2015-06-30 2017-01-11 芋头科技(杭州)有限公司 Mobile control device and mobile control method
CN107121062A (en) * 2016-12-07 2017-09-01 苏州笛卡测试技术有限公司 A kind of robot three-dimensional scanning means and method
CN106959697B (en) * 2017-05-16 2023-05-23 电子科技大学中山学院 Automatic indoor map construction system for long straight corridor environment

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120215354A1 (en) * 2009-10-27 2012-08-23 Battelle Memorial Institute Semi-Autonomous Multi-Use Robot System and Method of Operation
CN103123727A (en) * 2011-11-21 2013-05-29 联想(北京)有限公司 Method and device for simultaneous positioning and map building
CN102679959A (en) * 2012-05-03 2012-09-19 浙江工业大学 Omnibearing 3D (Three-Dimensional) modeling system based on initiative omnidirectional vision sensor
CN102831646A (en) * 2012-08-13 2012-12-19 东南大学 Scanning laser based large-scale three-dimensional terrain modeling method
US20140232717A1 (en) * 2013-02-20 2014-08-21 Google Inc. Merging Three-Dimensional Models of Varying Resolution

Also Published As

Publication number Publication date
CN108364340A (en) 2018-08-03
CN208751479U (en) 2019-04-16

Similar Documents

Publication Publication Date Title
CN107113415B (en) The method and apparatus for obtaining and merging for more technology depth maps
US20180056801A1 (en) Robotic Charger Alignment
CN112270754B (en) Local grid map construction method and device, readable medium and electronic equipment
WO2019105009A1 (en) Method and system for synchronized scanning of space
CN107533376A (en) It is couple to the consumer cameras of the privacy-sensitive of augmented reality system
CN108789421B (en) Cloud robot interaction method based on cloud platform, cloud robot and cloud platform
TWI610250B (en) Monitor system and operation method thereof
CN110230983A (en) Antivibration formula optical 3-dimensional localization method and device
WO2017027338A1 (en) Apparatus and method for supporting interactive augmented reality functionalities
US12033355B2 (en) Client/server distributed camera calibration
CN103901884A (en) Information processing method and information processing device
CN106780629A (en) A kind of three-dimensional panorama data acquisition, modeling method
CN102831816B (en) Device for providing real-time scene graph
CN109668545A (en) Localization method, locator and positioning system for head-mounted display apparatus
CN207067803U (en) A kind of mobile electronic device for being used to handle the task of mission area
CN108446596A (en) Iris 3D 4 D datas acquisition system based on Visible Light Camera matrix and method
KR20240117577A (en) Scanning data processing methods, devices, electronic devices and media
CN110275179A (en) A kind of building merged based on laser radar and vision ground drawing method
CN116958146B (en) Acquisition method and device of 3D point cloud and electronic device
WO2021068070A1 (en) Electronic tracking device for camera and related system for controlling image output of the camera
JPWO2019065260A1 (en) Information processing equipment, information processing methods, programs, and interchangeable lenses
Wang et al. ISPRS benchmark on multisensory indoor mapping and positioning
JP2022057771A (en) Communication management device, image communication system, communication management method, and program
CN114777772A (en) Indoor positioning system based on infrared camera and high accuracy IMU
CN110122958A (en) Three-dimensional scanner and application method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18884760

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 11.09.2020)

122 Ep: pct application non-entry in european phase

Ref document number: 18884760

Country of ref document: EP

Kind code of ref document: A1