CN111328017A - Map transmission method and device - Google Patents
- Publication number
- CN111328017A CN111328017A CN202010099669.6A CN202010099669A CN111328017A CN 111328017 A CN111328017 A CN 111328017A CN 202010099669 A CN202010099669 A CN 202010099669A CN 111328017 A CN111328017 A CN 111328017A
- Authority
- CN
- China
- Prior art keywords
- map data
- real-time
- wifi module
- transmission method
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/80—Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
Abstract
The invention provides a map data transmission method and device, wherein the method comprises the following steps: acquiring real-time map data and storing the real-time map data in a cache area of a WiFi module, wherein the cache area comprises a volatile cache area and a non-volatile cache area; controlling the WiFi module to send the real-time map data to a cloud; acquiring a visual image or video and trajectory data of the current position and judging whether the real-time map data are accurate, and if not, correcting the real-time map data to obtain corrected map data and storing the corrected map data in the cache area of the WiFi module; and controlling the WiFi module to send the corrected map data to the cloud. By providing a non-volatile cache area on the WiFi module and improving the data transmission scheme, the terminal is ensured to obtain a complete and accurate map.
Description
Technical Field
The invention relates to the technical field of robots, in particular to a map transmission method and device.
Background
At present, household sweeping robots are increasingly popular. They include laser robots, visual robots and gyroscope robots; a laser robot can obtain an accurate sweeping map, but its cost is relatively high. An accurate sweeping map makes it convenient to subsequently set designated cleaning locations, virtual walls, designated cleaning areas and designated no-clean areas.
In the prior art, the WiFi transmission module of a sweeping robot has a storage function and can be used to store sweeping record maps. The stored data can be uploaded to the cloud through the WiFi transmission module, and the user can then download the sweeping record maps from the cloud through an APP and view them.
However, when the sweeping robot slips, or touches an obstacle and is held in place, its wheels still rotate, the wheel odometer continues normal data acquisition, and the constructed map therefore deviates. Calibration of the sweeping record is thus also required. In addition, the bandwidth and memory capacity that the cloud and the robot can bear are limited, and map image data are generally large, which makes recording and user-friendly display of the sweeping record map difficult.
At present, the sweeping robot transmits sweeping map data to the cloud through a WiFi module. Limited by the CPU processing capacity, data transmission rate and memory of the WiFi module, the module generally cannot transmit complete map data to the cloud at one time; the map data must be transmitted in multiple batches or after compression. With multi-batch transmission, data loss easily occurs under poor network conditions, and the data obtained by the cloud are incomplete. Moreover, as the amount of transmitted data keeps growing, the limited storage capacity of the sweeping robot means that large-scale data transmission cannot be achieved by simple compression and uploading.
The above background disclosure is only for the purpose of assisting understanding of the concept and technical solution of the present invention and does not necessarily belong to the prior art of the present patent application, and should not be used for evaluating the novelty and inventive step of the present application in the case that there is no clear evidence that the above content is disclosed at the filing date of the present patent application.
Disclosure of Invention
The invention provides a map transmission method and a map transmission device for solving the existing problems.
In order to solve the above problems, the technical solution adopted by the present invention is as follows:
a map data transmission method comprises the following steps: s1: acquiring real-time map data and storing the real-time map data in a cache area of a WiFi module, wherein the cache area comprises a volatile cache area and a non-volatile cache area; s2: controlling the WiFi module to send the real-time map data to a cloud end; s3: acquiring a visual image or video and track data of a current position and judging whether the real-time map data is accurate or not, if not, correcting the real-time map data to obtain corrected map data, and storing the corrected map data in a cache area of the WiFi module; s4: and controlling the WiFi module to send the corrected map data to the cloud.
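Steps S1-S4 above can be sketched, in highly simplified form, as the following Python outline. The class and method names are illustrative assumptions, not from the patent; the sketch only shows the data flow between the two cache areas and the cloud.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of steps S1-S4. Lists stand in for the volatile
# cache, the non-volatile cache, and the cloud-side store.
@dataclass
class MapTransmitter:
    volatile_cache: list = field(default_factory=list)     # S1: RAM buffer
    nonvolatile_cache: list = field(default_factory=list)  # S1: flash backup
    cloud: list = field(default_factory=list)

    def store(self, map_data):
        # S1: real-time map data go into both cache areas.
        self.volatile_cache.append(map_data)
        self.nonvolatile_cache.append(map_data)

    def send_realtime(self):
        # S2: the WiFi module forwards buffered data to the cloud.
        self.cloud.extend(self.volatile_cache)
        self.volatile_cache.clear()

    def correct_and_send(self, map_data, accurate, corrected):
        # S3 + S4: if the real-time data are judged inaccurate, store the
        # corrected data in the cache and send them to the cloud.
        if not accurate:
            self.store(corrected)
            self.send_realtime()
            return corrected
        return map_data
```

The accuracy judgment itself (comparing visual and trajectory data) is detailed later in the description; here it is just a boolean input.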
Preferably, the method further comprises the following step: S5: controlling the cache area of the WiFi module to delete invalid map data, wherein invalid map data are map data that have already been sent to the cloud and that the cloud has correctly combined into complete map data.
Preferably, the real-time map data is acquired according to the trajectory data of the walking trajectory.
Preferably, the step of judging whether the real-time map data is accurate comprises the following steps: s31: comparing the visual image of the current position with the visual image before the current moment, or comparing each frame of image in the video of the current position, and judging whether the position changes or not to obtain a position change result; s32: judging whether the position change result is consistent with the track data: if the map data are consistent, the real-time map data are accurate; and if the map data are inconsistent, the real-time map data are inaccurate.
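Steps S31-S32 can be sketched as a consistency check between the pose change estimated from images and the pose change reported by the trajectory data. The function name, the `(dx, dy, dtheta)` representation and the tolerance are illustrative assumptions, not specified by the patent.

```python
def map_data_is_accurate(visual_displacement, odometry_displacement,
                         tol=0.05):
    """S31/S32 sketch: the real-time map data are judged accurate only if
    the position change seen in the visual images agrees with the
    trajectory (odometry) data within a tolerance.

    Both displacements are (dx, dy, dtheta) tuples; tol is an assumed
    per-component threshold."""
    return all(abs(v - o) <= tol
               for v, o in zip(visual_displacement, odometry_displacement))
```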
Preferably, the change in position comprises a change in distance and/or a change in angle.
Preferably, the method further comprises compressing the real-time map data and/or the modified map data.
Preferably, the real-time map data and/or the modified map data are compressed by converting each grid within the effective area into a bitmap.
Preferably, the real-time map data or the corrected map data are sent to the cloud in blocks.
Preferably, the WiFi module is controlled to send the corrected map data to the cloud end according to a preset time interval; or when the real-time map data are judged to be inaccurate, the WiFi module is controlled to send the corrected map data to the cloud.
Preferably, the real-time map data or the corrected map data include the sweeping area and its shape, the complete movement trajectory, and obstacle information.
The invention also provides a computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of the above.
The invention also provides a map data transmission method, which comprises the following steps: t1: receiving real-time map data which are sent by a WiFi module and stored in a cache area of the WiFi module; t2: receiving correction map data which are sent by a WiFi module and stored in a cache area of the WiFi module; t3: deleting map data which do not accord with the corrected map data, and combining complete map data according to the corrected map data; t4: and transmitting the complete map data to a terminal.
Preferably, the WiFi module transmits the real-time map data or the corrected map data in blocks; the deleted map data that do not match the corrected map data include trajectory data.
Preferably, the cloud and the WiFi module perform data transmission through a first communication protocol; the cloud and the terminal perform data transmission through a second communication protocol; the first communication protocol is different from the second communication protocol.
The present invention further provides a map data transmission apparatus, comprising: a first obtaining module: acquiring real-time map data and storing the map data in a cache area of the WiFi module; a first control module: controlling the WiFi module to send the real-time map data to a cloud end; a second obtaining module: acquiring a visual image or video and track data of a current position and judging whether the real-time map data is accurate or not, if not, correcting the real-time map data to obtain corrected map data, and storing the corrected map data in a cache area of the WiFi module; a second control module: and controlling the WiFi module to send the corrected map data to the cloud.
The invention has the following beneficial effects. A map transmission method and device are provided in which a non-volatile cache area is arranged on the WiFi module; it receives the sweeping map data transmitted by the robot and stores a backup. Under good network conditions, the WiFi module continuously sends the received data to the cloud, and the cloud then recombines the data into a complete map and transmits it to the terminal. The map data stored in the non-volatile cache area are always the latest complete map data, so the terminal is ensured to obtain a complete and accurate map.
Drawings
Fig. 1 is a schematic structural diagram of a visual sweeping robot according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of a map data transmission method according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of a method for determining whether real-time map data is accurate according to an embodiment of the present invention.
Fig. 4 is a schematic diagram of another map data transmission method according to an embodiment of the present invention.
Fig. 5 is a diagram illustrating a still another method for transmitting map data according to an embodiment of the present invention.
Fig. 6 is a schematic diagram of a map data transmission apparatus according to an embodiment of the present invention.
FIG. 7 is a diagram illustrating an actual floor plan in an embodiment of the present invention.
Fig. 8 is a map schematic diagram of the laser sweeping robot in the embodiment of the present invention.
Fig. 9 is a schematic diagram of a real-time map of the visual sweeping robot according to the embodiment of the present invention.
Fig. 10 is a schematic diagram of a corrected map of the visual sweeping robot according to the embodiment of the present invention.
Detailed Description
In order to make the technical problems, technical solutions and advantageous effects to be solved by the embodiments of the present invention more clearly apparent, the present invention is further described in detail below with reference to the accompanying drawings and the embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
It will be understood that when an element is referred to as being "secured to" or "disposed on" another element, it can be directly on the other element or be indirectly on the other element. When an element is referred to as being "connected to" another element, it can be directly connected to the other element or be indirectly connected to the other element. In addition, the connection may be for either a fixing function or a circuit connection function.
It is to be understood that the terms "length," "width," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like are used in an orientation or positional relationship indicated in the drawings for convenience in describing the embodiments of the present invention and to simplify the description, and are not intended to indicate or imply that the referenced device or element must have a particular orientation, be constructed in a particular orientation, and be in any way limiting of the present invention.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present invention, "a plurality" means two or more unless specifically limited otherwise.
The floor sweeping robot is also called an automatic sweeper, an intelligent dust collector, a robot dust collector and the like, is one of intelligent household appliances, and can automatically complete floor cleaning work in a room by means of certain artificial intelligence.
The operation map is a map drawn by the sweeping robot using the triangulation principle.
A virtual wall is a boundary that confines the sweeping robot to a spatial area. It may be a physical virtual wall, such as a magnetic strip affixed to the floor, or an electronic virtual wall, such as a line segment drawn by the user on an electronic operation map.
The no-clean area, also called the no-sweep area, refers to an area where the sweeping robot does not perform cleaning.
The traversable area, also called the sweeping area, refers to the area where the sweeping robot performs cleaning.
In the prior art, sweeping robots have begun to use machine vision technology to achieve autonomous obstacle avoidance and route planning. A sweeping robot based on machine vision acquires images through a camera and, using the acquired images and suitable algorithms, plans a path and avoids obstacles. Sweeping robots currently on the market adopt simultaneous localization and mapping (SLAM) technology: a camera observes the space being swept, landmark objects and main features in that space are identified, and a room map is drawn for navigation by the triangulation principle, so that the robot's position in the swept space, the swept area, the unswept area and so on are determined.
At present, sweeping robots are mainly used in homes, which are full of everyday objects (such as tables, chairs, stools, trash cans, door sills, computer cables and children's toys) that greatly obstruct the robot's autonomous movement. The wheel sets of a sweeping robot are small and its obstacle-crossing ability is poor (obstacles higher than about 3 cm are generally difficult to cross), so the robot often gets caught on obstacles or trapped in narrow spaces, cannot free itself, and needs manual rescue. To avoid the robot becoming trapped again after being rescued, the user usually sets a forbidden area or a virtual wall on the robot's operation map through the robot's client application.
When the wheel odometer indicates that the robot is moving, the robot's position may in fact not have changed; for example, when the robot hits a wall, its wheels still rotate and the wheel odometer continues normal data acquisition. If closed-loop detection is then performed on the visual and laser data and the robot is localized and the map constructed from the closed-loop detection result, the localization is inaccurate, the constructed map does not match the actual scene, and map accuracy drops. Although a laser sweeping robot can currently obtain a relatively accurate map, its cost is higher. The aim of the invention is to obtain a complete and correct map using a lower-cost visual sweeping robot.
As shown in fig. 1, the visual sweeping robot 1 includes a robot main body 11 and a communication module 12. The visual sweeping robot adopts visual simultaneous localization and mapping (VSLAM) and includes a vision device 13 and a trajectory recording device 14.
In one embodiment of the invention, the vision device 13 includes a camera assembly for capturing images or video. In an embodiment, the vision sensor includes a camera arranged in the traveling direction of the robot; the camera may be any one of a main camera, a depth camera, a wide-angle camera and a telephoto camera. The trajectory recording device 14 includes a gyro sensor and an odometer. The vision device 13 transmits video of the cleaning situation and the cleaning environment to the main body, or to the main body and the terminal simultaneously; the camera angle can be adjusted up and down through the terminal, providing the dual functions of observing the cleaning effect and monitoring the indoor environment. In one embodiment, the odometer is a wheel odometer that records the number of turns of the robot's left and right wheels and the robot's rotation angle, since the robot's odometry data can be determined from these. The gyro sensor detects the orientation and rotation angle of the body.
In an embodiment of the present invention, the sweeping robot may further include: the acceleration sensor can detect the acceleration on three coordinate axes of a coordinate system established by the robot; and the optical sensor is used for acquiring the intensity of ambient light.
The visual sweeping robot is equipped with a vision device 13. Compared with a gyroscope robot, it corrects the map data more, but the corrections very frequently generate large amounts of data, while only kilobyte-scale sub-packets can be transmitted from the robot to the cloud; moreover, the network condition is unstable and the transmission rate is not guaranteed. With multi-batch transmission, data loss easily occurs under poor network conditions, and the data obtained by the cloud are incomplete. As the amount of transmitted data keeps growing, the limited storage capacity of the sweeping robot means that large-scale data transmission cannot be achieved by simple compression and uploading.
Based on the above scenario, the present invention provides a map data transmission method.
As shown in fig. 2, the present invention provides a map data transmission method, which includes the following steps:
s1: acquiring real-time map data and storing the real-time map data in a cache area of a WiFi module, wherein the cache area comprises a volatile cache area and a non-volatile cache area;
it can be understood that the cache area includes a memory and a hard disk, and data does not need to be saved after being sent to the cloud. The map data, also referred to as map information data, includes robot position coordinates, a cleaning route, and the like of the cleaning robot.
The sweeping robot can acquire the walking track of the sweeping robot through the track recording device, and real-time map data are acquired according to the track data of the walking track. When the visual sweeping robot works, the track recording device can record the walking track of the sweeping robot in a coordinate graph form to form a sweeping record graph; and simultaneously positioning And Mapping (SLAM).
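The trajectory data recorded by a wheel odometer can be turned into poses by standard differential-drive dead reckoning. The following sketch is an assumption about how such trajectory data could be accumulated; the function name and the `wheel_base` value are illustrative, not from the patent.

```python
import math

def update_pose(pose, left_dist, right_dist, wheel_base=0.23):
    """Differential-drive dead reckoning: update the robot pose
    (x, y, theta) from the distances travelled by the left and right
    wheels, as recorded by a wheel odometer. wheel_base (metres) is an
    illustrative value."""
    x, y, theta = pose
    d = (left_dist + right_dist) / 2.0            # mean forward travel
    dtheta = (right_dist - left_dist) / wheel_base  # heading change
    # Integrate at the midpoint heading for a second-order update.
    x += d * math.cos(theta + dtheta / 2.0)
    y += d * math.sin(theta + dtheta / 2.0)
    return (x, y, theta + dtheta)
```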
S2: controlling the WiFi module to send the real-time map data to a cloud end;
s3: acquiring a visual image or video and track data of a current position and judging whether the real-time map data is accurate or not, if not, correcting the real-time map data to obtain corrected map data, and storing the corrected map data in a cache area of the WiFi module;
s4: and controlling the WiFi module to send the corrected map data to the cloud.
In an embodiment of the invention, the real-time map data or the corrected map data are sent to the cloud in blocks, owing to the limits of the WiFi module's CPU processing capacity, data transmission rate, memory, and so on. In a specific embodiment, one set of real-time map data is transmitted as several blocks or packets, the first packet corresponding to a portion of the map data and the last packet completing the entirety of the map data.
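Block transmission with cloud-side reassembly can be sketched as follows. The packet format (sequence number plus total count) and the packet size are illustrative assumptions; the patent only states that data are sent in blocks and that the cloud must be able to detect incomplete data.

```python
def split_into_packets(map_bytes, packet_size=1024):
    """Split serialized map data into fixed-size packets for the WiFi
    module. Each packet carries its index and the total count so the
    cloud can detect loss and reassemble. packet_size is illustrative."""
    chunks = [map_bytes[i:i + packet_size]
              for i in range(0, len(map_bytes), packet_size)]
    total = len(chunks)
    return [{"seq": i, "total": total, "payload": c}
            for i, c in enumerate(chunks)]

def reassemble(packets):
    """Cloud side: recombine the map only when every packet has arrived;
    otherwise report incompleteness so the data can be re-uploaded from
    the non-volatile cache after the network recovers."""
    if not packets or len(packets) != packets[0]["total"]:
        return None  # incomplete — keep waiting / re-request
    return b"".join(p["payload"]
                    for p in sorted(packets, key=lambda p: p["seq"]))
```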
The WiFi module belongs to the transmission layer of the Internet of Things; its function is to convert a serial port or TTL level into an embedded module conforming to the WiFi wireless communication standard, with a built-in IEEE 802.11 b/g/n wireless protocol stack and a TCP/IP protocol stack. A traditional hardware device with an embedded WiFi module can directly access the Internet, which is an important component of wireless smart home, M2M and other IoT applications. The WiFi module here adopts an ESP32-series module. The advantage of the ESP32 as a wireless module is that it provides low-power schemes for different environments and is a single chip integrating 2.4 GHz WiFi and Bluetooth, so a separate Bluetooth module is not needed. Two pins of the ESP32 are connected to ground (GND) and the positive supply voltage (VDD) respectively, followed in turn by the data line (SDA) and the clock line (SCL).
In one embodiment of the invention, the process of constructing the visual SLAM map by the visual robot comprises the following steps:
the origin and coordinate system of the visual SLAM map to be created are predetermined. The selection rule of the origin may be factory set, for example. Wherein, this origin can be for example the position that the electric pile of charging of robot is located.
The robot moves indoors and takes pictures of the surrounding environment through the camera. After the origin and the coordinate system are determined, the robot can acquire the coordinates of the current position of the robot based on the sensor of the robot in the moving process of the robot.
The robot extracts the feature points of the shot image according to a preset feature extraction algorithm and acquires the positions of the feature points relative to the robot. The preset feature extraction algorithm may be a SIFT algorithm or the like. SIFT, Scale-invariant feature transform (SIFT), is a description used in the field of image processing. This description has scale invariance and keypoints can be detected in the image.
And the robot calculates the coordinates of the feature points in the coordinate system of the visual SLAM map according to the positions of the feature points relative to the robot and the current coordinates of the robot in the coordinate system of the visual SLAM map.
The robot draws the visual SLAM map at the coordinates in the coordinate system of the visual SLAM map by moving indoors and continuously acquiring the coordinates of the surrounding feature points.
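The coordinate computation described above — combining the robot's pose in the map frame with a feature point's position relative to the robot — is a plain 2-D rigid transform. The following sketch is an assumption about how that step could look; the names are illustrative.

```python
import math

def feature_world_coords(robot_pose, feature_rel):
    """Transform a feature point's position relative to the robot into
    the visual-SLAM map frame.

    robot_pose  = (x, y, theta): robot pose in the map coordinate system.
    feature_rel = (fx, fy): feature position in the robot's own frame.
    """
    x, y, theta = robot_pose
    fx, fy = feature_rel
    # Rotate by the robot heading, then translate by the robot position.
    wx = x + fx * math.cos(theta) - fy * math.sin(theta)
    wy = y + fx * math.sin(theta) + fy * math.cos(theta)
    return (wx, wy)
```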
The visual robot of the invention can form a real-time map using the above method, or using other methods in the prior art.
As shown in fig. 3, determining whether the real-time map data is accurate includes the following steps:
s31: comparing the visual image of the current position with the visual image before the current moment, or comparing each frame of image in the video of the current position, and judging whether the position changes or not to obtain a position change result;
s32: judging whether the position change result is consistent with the track data:
if the map data are consistent, the real-time map data are accurate;
and if the map data are inconsistent, the real-time map data are inaccurate.
In an embodiment of the invention, the vision device may be employed to obtain visual images or video of the robot's surroundings. To compare the visual image of the current position with a visual image from before the current moment, feature information can be extracted from the images and compared, for example to judge whether the position of a target in the image has changed or whether the targets are the same. When the robot walks slowly and the interval between images is short, differences between images are hard to distinguish; in that case, visual images with a larger time difference can be selected, or images captured before and after the robot has walked a certain distance along the trajectory from the trajectory recording device can be compared. If the feature information in the images is identical, the robot's position is judged not to have changed; if it differs, the position is judged to have changed. A position change includes a distance change and/or an angle change; for example, the robot's coordinates on the map may change along the x-axis or the y-axis, or its angle may change. The robot's angle is the angle through which it rotates while moving; for example, if the robot turns 180 degrees in place, its angle is 180 degrees.
Here, the comparison of the feature information of the visual image may be performed by a method in the prior art, and is not limited herein.
The trajectory recording device collects inertial measurement unit (IMU) information and odometer information between two frames of visual images. Using the IMU information and a pre-integration method, the relative position and relative attitude of the sweeping robot between the two frames are computed to obtain the robot's state, which is compared with the state obtained from the visual images or video. For example, if the visual data indicate that the robot's position has not changed, but the IMU and odometer information show that the robot has been moving the whole time, the robot may have hit an obstacle or otherwise be turning in place; the position-change result is then inconsistent with the trajectory data, and the map data uploaded during this period are wrong and need correction.
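The slip case described above — odometry reporting motion while the camera sees none — can be sketched as a simple predicate. The function name and the distance threshold are illustrative assumptions.

```python
def wheels_slipping(visual_moved, odom_distance, min_move=0.01):
    """Return True when the odometer reports motion but the visual data
    show no position change, i.e. the robot is likely stuck against an
    obstacle with its wheels spinning, and the map segment uploaded over
    this interval needs correction.

    visual_moved:  bool, did the visual comparison detect movement?
    odom_distance: distance (metres) reported by the wheel odometer.
    min_move:      assumed threshold below which odometry noise is ignored.
    """
    return (not visual_moved) and odom_distance > min_move
```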
In an embodiment of the invention, the WiFi module may be controlled to send the corrected map data to the cloud at a preset time interval; the interval can be chosen so that data integrity is preserved and large amounts of data do not accumulate in the WiFi module. Alternatively, when the sweeping robot judges that the acquired real-time map data are inaccurate, the WiFi module is controlled to send the corrected map data to the cloud. The two control methods can be used separately or together.
According to the method of the invention, the WiFi module of the sweeping robot is provided with a non-volatile cache area, which receives the sweeping map data transmitted by the robot and stores a backup. Under good network conditions, the WiFi module continuously sends the received data to the cloud, which recombines them; once the cloud has confirmed that it has correctly combined complete map data, the non-volatile cache area on the WiFi module deletes its backup of those data. When the network condition is poor, the cloud may not receive complete data and therefore cannot recombine a complete map, but the non-volatile cache area on the WiFi module stores the received data first; after the network recovers, the complete map data can be read from the non-volatile cache area, uploaded to the cloud, and recombined there. This guarantees that complete map data are stored on both the cloud and the WiFi module, and that at least one of them holds the latest complete map data. Throughout any period in which the network is poor and data cannot be transmitted normally, the WiFi module still receives the sweeping map data from the robot as usual, stores them in the non-volatile cache area, and continuously replaces the old data stored earlier, ensuring that the map data in the non-volatile cache area are always the latest and complete. If data written to the non-volatile cache area have not been uploaded to the cloud before the next cleaning run starts, they are erased.
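The retention policy of the non-volatile cache area described above can be sketched as follows. The class and method names are illustrative assumptions; for simplicity a single backup slot stands in for the cache.

```python
class NonvolatileCache:
    """Sketch of the backup policy: keep the latest complete map in
    flash, let new data replace the old backup, and erase only after the
    cloud confirms a correct full reassembly or before the next cleaning
    run begins."""

    def __init__(self):
        self.backup = None

    def write(self, map_data):
        self.backup = map_data   # newest data replaces the old backup

    def on_cloud_confirmed(self):
        self.backup = None       # safe to delete after confirmation

    def on_new_cleaning_run(self):
        self.backup = None       # stale un-uploaded data are erased

    def pending(self):
        return self.backup       # re-upload this when the network recovers
```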
The cloud recombines the data transmitted piecemeal by the robot into a sweeping map with complete data, which can then be transmitted to the terminal as a single complete data packet, so the terminal need not perform data combination. Because the terminal's CPU processing capacity and storage capacity are strong and the communication between the terminal and the cloud is generally guaranteed, transmitting complete map data between them in one pass presents no technical difficulty. As long as a complete sweeping map is guaranteed on the WiFi module and the cloud, the terminal is guaranteed to receive a complete sweeping map when it starts and the robot APP is opened.
With this scheme, the sweeping robot can transmit larger map data, including the sweeping area and its shape, the complete movement trajectory, obstacle information, and so on. This effectively solves the problem that complete map data could not be uploaded due to the limited data processing capacity and bandwidth of the WiFi module on the robot, and also prevents data loss caused by poor network conditions.
As shown in fig. 4, the map data transmission method of the present invention further includes:
s5: controlling the cache area of the WiFi module to delete invalid map data, the invalid map data being data that has already been sent to the cloud and that the cloud has correctly combined into complete map data.
In one embodiment of the invention, the real-time map data and/or the corrected map data are compressed by converting each grid within the effective area of the map into a bitmap: each grid is represented by 2 bits (00 for an unknown area, 01 for an already cleaned area, 10 for an obstacle), and, starting from (start_x, start_y), every 4 consecutive grids in the x direction are converted into one byte. The compressed map occupies less storage space, making it easy to store more data, and avoids the situation in which the data occupies ever more memory, making subsequent computation complex and the robot difficult to operate.
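The 2-bit encoding above can be sketched as follows. The patent fixes only "4 grids per byte"; the bit order within a byte (first grid in the high bits) and the function names are assumptions made for the illustration:

```python
# Grid states, 2 bits each, as described in the embodiment.
UNKNOWN, CLEANED, OBSTACLE = 0b00, 0b01, 0b10

def pack_row(cells):
    """Pack one row of grid states (values 0-3) into bytes, 4 cells per byte."""
    padded = cells + [UNKNOWN] * (-len(cells) % 4)   # pad row to a multiple of 4
    out = bytearray()
    for i in range(0, len(padded), 4):
        byte = 0
        for cell in padded[i:i + 4]:
            byte = (byte << 2) | cell                # first cell lands in the high bits
        out.append(byte)
    return bytes(out)

def unpack_row(data, n_cells):
    """Inverse of pack_row: recover the first n_cells grid states."""
    cells = []
    for byte in data:
        for shift in (6, 4, 2, 0):
            cells.append((byte >> shift) & 0b11)
    return cells[:n_cells]
```

Compared with one byte per grid, this packing cuts storage to a quarter, which matches the stated goal of keeping the map's memory footprint small.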
As shown in fig. 5, the method for transmitting map data according to the present invention is described from the cloud, and includes the following steps:
t1: receiving real-time map data which are sent by a WiFi module and stored in a cache area of the WiFi module;
t2: receiving correction map data which are sent by a WiFi module and stored in a cache area of the WiFi module;
t3: deleting map data which do not accord with the corrected map data, and combining complete map data according to the corrected map data;
t4: and transmitting the complete map data to a terminal.
The WiFi module transmits the real-time map data or the corrected map data in blocks. The map data that does not match the corrected map data is deleted together with its trajectory data; that is, when map data is deleted, its trajectory data is deleted as well, but deleting trajectory data does not delete the map data.
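The cloud-side steps T1-T4 and the deletion rule above can be sketched as follows. This is a hedged illustration; the block/sequence representation and all names (`CloudMapAssembler`, `seq`, etc.) are invented, since the patent does not specify a wire format:

```python
class CloudMapAssembler:
    """Collects real-time blocks (T1), applies corrections (T2/T3), merges (T4)."""

    def __init__(self):
        self.blocks = {}   # seq -> map data block
        self.tracks = {}   # seq -> trajectory data tied to that block

    def on_realtime_block(self, seq, block, track):
        self.blocks[seq] = block
        self.tracks[seq] = track

    def on_correction_block(self, seq, block):
        # T3: the correction supersedes the inaccurate block; deleting the
        # map data also deletes its trajectory data (the reverse does not hold).
        self.tracks.pop(seq, None)
        self.blocks[seq] = block

    def complete_map(self, expected_seqs):
        # T4: combine and return the map only once every expected block arrived;
        # otherwise signal that reassembly is not yet possible.
        if set(self.blocks) >= set(expected_seqs):
            return b"".join(self.blocks[s] for s in sorted(expected_seqs))
        return None
```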
In the robot's map data transmission, the cloud is in communication connection with the robot and with the terminal, respectively. The robot and the cloud exchange data via a first communication protocol, the cloud and the terminal via a second communication protocol, and the two protocols may differ. When the robot or the terminal is upgraded, the protocol supported by the robot may no longer match the protocol supported by the terminal; with this arrangement, the two need not be kept consistent, and it is only necessary to ensure that the cloud can adapt to each updated protocol. Version compatibility is thus taken into account, solving the prior-art problem that the robot and the terminal can no longer recognize each other after one of them is upgraded.
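The decoupling described above can be sketched as a cloud-side relay. All message formats and version numbers here are invented for the illustration; the point is only that the cloud decodes the robot-side (first) protocol and re-encodes for the terminal-side (second) protocol, so neither end must stay in lockstep with the other:

```python
def decode_robot_v1(frame: bytes) -> dict:
    # First communication protocol, original version: raw map payload.
    return {"map": frame}

def decode_robot_v2(frame: bytes) -> dict:
    # Hypothetical upgraded robot protocol that adds a 2-byte header.
    return {"map": frame[2:]}

def encode_terminal_v1(msg: dict) -> bytes:
    # Second communication protocol toward the terminal (invented framing).
    return b"MAP:" + msg["map"]

ROBOT_DECODERS = {1: decode_robot_v1, 2: decode_robot_v2}

def relay(robot_ver: int, frame: bytes) -> bytes:
    # Only the cloud needs to know both protocols; a robot firmware upgrade
    # just registers a new decoder without touching the terminal APP.
    return encode_terminal_v1(ROBOT_DECODERS[robot_ver](frame))
```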
The embodiment of the present application also provides a computer-readable storage medium, in which a computer program is stored, and when the computer program is executed by a processor, the map data transmission method in the embodiment shown above can be implemented.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be completed by hardware, or by a program instructing the relevant hardware; the program may be stored in a computer-readable storage medium, such as a read-only memory, a hard disk, or an optical disk.
As shown in fig. 6, the present invention also provides a map data transmission apparatus, including:
a first obtaining module: acquiring real-time map data and storing the map data in a cache area of the WiFi module;
a first control module: controlling the WiFi module to send the real-time map data to a cloud end;
a second obtaining module: acquiring a visual image or video and track data of a current position and judging whether the real-time map data is accurate or not, if not, correcting the real-time map data to obtain corrected map data, and storing the corrected map data in a cache area of the WiFi module;
a second control module: and controlling the WiFi module to send the corrected map data to the cloud.
The vision sweeping robot of the present invention further includes a processor and a memory. The processor may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, and the like; a general-purpose processor may be a microprocessor, or any conventional processor. The memory may store the computer programs and/or modules, and the processor implements the various functions of the robot by running the computer programs and/or modules stored in the memory and calling the data stored in the memory. The memory may mainly include a program storage area and a data storage area: the program storage area may store an operating system and the application programs required by at least one function (such as a sound playing function or an image playing function); the data storage area may store data created according to use of the device (such as audio data or a phonebook). In addition, the memory may include high-speed random access memory, and may also include non-volatile memory such as a hard disk, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, a Flash Card, at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The terminal runs a client program for controlling the vision sweeping robot; the client program is an APP or a WeChat applet. The terminal is also called User Equipment (UE), a device that provides voice and/or data connectivity to a user, such as a handheld device or a vehicle-mounted device with a wireless connection function. Common terminals include mobile phones, tablet computers, notebook computers, palmtop computers, Mobile Internet Devices (MID), and the like. The terminal can control the sweeping robot through the server. The sweeping robot may also be replaced by another electronic device or a smart wearable device that has functions similar to those of the robot described in the embodiments of the present application and can be controlled by the terminal. The cloud includes a server, a server cluster, a dedicated router, a wireless modem, or a relay station.
After purchasing the sweeping robot, the user can download the APP for controlling the robot onto the terminal. The APP associates the terminal with the robot and uploads the association relationship to the cloud, where it is stored. One robot may be associated with one terminal, or with two or more terminals. By associating the terminal with the robot and storing the association relationship, the robot can be controlled by any terminal associated with it.
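A minimal sketch of the cloud-side association store described above (class and identifier names are invented; the patent only specifies that the relationship is stored in the cloud and that one robot may bind to multiple terminals):

```python
class AssociationRegistry:
    """Cloud-side record of which terminals may control which robot."""

    def __init__(self):
        self.robot_terminals = {}   # robot_id -> set of associated terminal_ids

    def associate(self, terminal_id, robot_id):
        # Uploaded by the APP when the user binds the robot to a terminal.
        self.robot_terminals.setdefault(robot_id, set()).add(terminal_id)

    def may_control(self, terminal_id, robot_id):
        # Only a terminal associated with the robot may control it.
        return terminal_id in self.robot_terminals.get(robot_id, set())
```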
Through the terminal, the user can see the outline of the current cleaning area in which the vision sweeping robot is located and can operate on the map with the distinguished outline, for example specifying a cleaning spot, setting a virtual wall, designating a cleaning area, or designating a no-clean area.
Fig. 7 is a real house type diagram.
As shown in fig. 8, this map is obtained after cleaning with a laser sweeping robot; the map is essentially consistent with the real environment, and effects desired by the user, such as virtual walls and forbidden zones, can be set on it.
As shown in fig. 9, this real-time map is formed by the vision sweeping robot transmitting map data according to the method of the present invention. The method continuously uploads the corrected data to the cloud, where it is displayed, and the user can set a cleaning strategy (such as a virtual wall or a forbidden zone) according to the displayed map.
As shown in fig. 10, the vision sweeping robot performs map data transmission using the method of the present invention to form a corrected map, which mainly corrects the boundary obstacles, perfects the display of obstacles, and has a clear outline.
The vision sweeping robot can quickly perceive the entire house environment, position itself accurately, and build the map in real time, updating and acquiring an accurate and complete map. It can then realize intelligent navigation and path planning, helping the robot to perform autonomous computation and reasoning, motion planning, and real-time control in the home environment, and to establish visual landmarks on the map for position tracking, so that the cleaned and uncleaned areas are known and the sweeping work is completed efficiently and quickly.
In one embodiment of the invention, the sweeper further includes an infrared or ultrasonic detection device arranged at the front of the sweeper in its direction of travel, so that the direction can be adjusted before the sweeper contacts an obstacle; a downward-looking sensor at the bottom of the sweeper prevents it from falling at stairs and steps. The sweeper has six cleaning modes (left spiral, right spiral, zigzag, wall-following, column-circling, and free cleaning) and can switch among them automatically according to the floor environment. A positioning component may also be included for locating the current geographic position to implement navigation or LBS (Location Based Service); it may be GPS (Global Positioning System), the Beidou system, the GLONASS system, or the Galileo system. A power source may also be included to supply the various components; it may be AC or DC, disposable or rechargeable, and it will be understood that when the power source includes a rechargeable battery, the battery supports wired or wireless charging as well as fast charging.
In an embodiment of the invention, the execution body of the map generation and transmission method of the sweeping robot is a WiFi communication module arranged on the sweeping robot; in other embodiments, the execution body may take other forms, such as a 5G communication module or the robot's control mainboard (e.g., an ARM board).
The foregoing is a further detailed description of the invention in connection with specific preferred embodiments, and the specific implementation of the invention shall not be considered limited to these descriptions. For those of ordinary skill in the art to which the invention pertains, several equivalent substitutions or obvious modifications may be made without departing from the concept of the invention, and all of them shall be deemed to fall within the scope of protection of the invention.
Claims (15)
1. A map data transmission method is characterized by comprising the following steps:
s1: acquiring real-time map data and storing the real-time map data in a cache area of a WiFi module, wherein the cache area comprises a volatile cache area and a non-volatile cache area;
s2: controlling the WiFi module to send the real-time map data to a cloud end;
s3: acquiring a visual image or video and track data of a current position and judging whether the real-time map data is accurate or not, if not, correcting the real-time map data to obtain corrected map data, and storing the corrected map data in a cache area of the WiFi module;
s4: and controlling the WiFi module to send the corrected map data to the cloud.
2. The map data transmission method according to claim 1, further comprising:
s5: controlling the cache area of the WiFi module to delete invalid map data, the invalid map data being data that has already been sent to the cloud and that the cloud has correctly combined into complete map data.
3. The map data transmission method according to claim 1, wherein the real-time map data is acquired based on trajectory data of a walking trajectory.
4. The map data transmission method according to claim 1, wherein the judging whether the real-time map data is accurate comprises the steps of:
s31: comparing the visual image of the current position with the visual image before the current moment, or comparing each frame of image in the video of the current position, and judging whether the position changes or not to obtain a position change result;
s32: judging whether the position change result is consistent with the track data:
if the map data are consistent, the real-time map data are accurate;
and if the map data are inconsistent, the real-time map data are inaccurate.
5. The map data transmission method according to claim 4, wherein the change in position includes a change in distance and/or a change in angle.
6. The map data transmission method according to any one of claims 1 to 5, further comprising compressing the real-time map data and/or the correction map data.
7. The map data transmission method according to claim 6, wherein the real-time map data and/or the modified map data are compressed by converting each grid within an effective area range into a bitmap.
8. The map data transmission method of any one of claims 1-5, wherein the real-time map data or the modified map data are sent to the cloud in blocks.
9. The map data transmission method according to any one of claims 1 to 5, wherein the WiFi module is controlled to transmit the modified map data to the cloud end at preset time intervals;
or when the real-time map data are judged to be inaccurate, the WiFi module is controlled to send the corrected map data to the cloud.
10. The map data transmission method according to any one of claims 1 to 5, wherein the real-time map data or the corrected map data includes a sweeping area and shape, a complete movement trajectory, and obstacle information.
11. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 10.
12. A map data transmission method is characterized by comprising the following steps:
t1: receiving real-time map data which are sent by a WiFi module and stored in a cache area of the WiFi module;
t2: receiving correction map data which are sent by a WiFi module and stored in a cache area of the WiFi module;
t3: deleting map data which do not accord with the corrected map data, and combining complete map data according to the corrected map data;
t4: and transmitting the complete map data to a terminal.
13. The map data transmission method according to claim 12, wherein the WiFi module transmits the real-time map data or the corrected map data in blocks; and the map data that does not match the corrected map data is deleted together with its trajectory data.
14. The map data transmission method of claim 12, wherein the cloud and the WiFi module perform data transmission via a first communication protocol; the cloud and the terminal perform data transmission through a second communication protocol; the first communication protocol is different from the second communication protocol.
15. A map data transmission apparatus, comprising:
a first obtaining module: acquiring real-time map data and storing the map data in a cache area of the WiFi module;
a first control module: controlling the WiFi module to send the real-time map data to a cloud end;
a second obtaining module: acquiring a visual image or video and track data of a current position and judging whether the real-time map data is accurate or not, if not, correcting the real-time map data to obtain corrected map data, and storing the corrected map data in a cache area of the WiFi module;
a second control module: and controlling the WiFi module to send the corrected map data to the cloud.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010099669.6A CN111328017B (en) | 2020-02-18 | 2020-02-18 | Map transmission method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010099669.6A CN111328017B (en) | 2020-02-18 | 2020-02-18 | Map transmission method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111328017A true CN111328017A (en) | 2020-06-23 |
CN111328017B CN111328017B (en) | 2021-05-14 |
Family
ID=71171097
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010099669.6A Active CN111328017B (en) | 2020-02-18 | 2020-02-18 | Map transmission method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111328017B (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112153713A (en) * | 2020-10-23 | 2020-12-29 | 珠海格力电器股份有限公司 | Obstacle determination method and apparatus, storage medium, and electronic apparatus |
CN112666943A (en) * | 2020-12-17 | 2021-04-16 | 珠海市一微半导体有限公司 | Cleaning map storage method and system for intelligent terminal, cleaning robot and system |
CN112833912A (en) * | 2020-12-31 | 2021-05-25 | 杭州海康机器人技术有限公司 | V-SLAM map checking method, device and equipment |
CN113749564A (en) * | 2021-09-01 | 2021-12-07 | 深圳市云鼠科技开发有限公司 | Map data drawing method, module, equipment and medium for sweeping robot |
CN114466088A (en) * | 2022-01-07 | 2022-05-10 | 上海黑眸智能科技有限责任公司 | Data transmission method and device for sweeping robot, storage medium and terminal |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103884330A (en) * | 2012-12-21 | 2014-06-25 | 联想(北京)有限公司 | Information processing method, mobile electronic device, guidance device, and server |
CN103901890A (en) * | 2014-04-09 | 2014-07-02 | 中国科学院深圳先进技术研究院 | Outdoor automatic walking device based on family courtyard and system and method for controlling outdoor automatic walking device based on family courtyard |
CN106168805A (en) * | 2016-09-26 | 2016-11-30 | 湖南晖龙股份有限公司 | The method of robot autonomous walking based on cloud computing |
CN107659659A (en) * | 2017-10-14 | 2018-02-02 | 广东宝乐机器人股份有限公司 | A kind of mobile robot and its data transmission method |
CN109528095A (en) * | 2018-12-28 | 2019-03-29 | 深圳市愚公科技有限公司 | It sweeps the floor calibration method, sweeping robot and the mobile terminal of record figure |
CN110260857A (en) * | 2019-07-02 | 2019-09-20 | 北京百度网讯科技有限公司 | Calibration method, device and the storage medium of vision map |
CN110754998A (en) * | 2018-07-26 | 2020-02-07 | 深圳市愚公科技有限公司 | Sweeping record graph transmission method, sweeping robot and mobile terminal |
- 2020-02-18: CN application CN202010099669.6A filed; granted as CN111328017B (status: Active)
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103884330A (en) * | 2012-12-21 | 2014-06-25 | 联想(北京)有限公司 | Information processing method, mobile electronic device, guidance device, and server |
CN103901890A (en) * | 2014-04-09 | 2014-07-02 | 中国科学院深圳先进技术研究院 | Outdoor automatic walking device based on family courtyard and system and method for controlling outdoor automatic walking device based on family courtyard |
CN106168805A (en) * | 2016-09-26 | 2016-11-30 | 湖南晖龙股份有限公司 | The method of robot autonomous walking based on cloud computing |
CN107659659A (en) * | 2017-10-14 | 2018-02-02 | 广东宝乐机器人股份有限公司 | A kind of mobile robot and its data transmission method |
CN110754998A (en) * | 2018-07-26 | 2020-02-07 | 深圳市愚公科技有限公司 | Sweeping record graph transmission method, sweeping robot and mobile terminal |
CN109528095A (en) * | 2018-12-28 | 2019-03-29 | 深圳市愚公科技有限公司 | It sweeps the floor calibration method, sweeping robot and the mobile terminal of record figure |
CN110260857A (en) * | 2019-07-02 | 2019-09-20 | 北京百度网讯科技有限公司 | Calibration method, device and the storage medium of vision map |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112153713A (en) * | 2020-10-23 | 2020-12-29 | 珠海格力电器股份有限公司 | Obstacle determination method and apparatus, storage medium, and electronic apparatus |
CN112666943A (en) * | 2020-12-17 | 2021-04-16 | 珠海市一微半导体有限公司 | Cleaning map storage method and system for intelligent terminal, cleaning robot and system |
CN112833912A (en) * | 2020-12-31 | 2021-05-25 | 杭州海康机器人技术有限公司 | V-SLAM map checking method, device and equipment |
CN112833912B (en) * | 2020-12-31 | 2024-03-05 | 杭州海康机器人股份有限公司 | V-SLAM map verification method, device and equipment |
CN113749564A (en) * | 2021-09-01 | 2021-12-07 | 深圳市云鼠科技开发有限公司 | Map data drawing method, module, equipment and medium for sweeping robot |
CN114466088A (en) * | 2022-01-07 | 2022-05-10 | 上海黑眸智能科技有限责任公司 | Data transmission method and device for sweeping robot, storage medium and terminal |
CN114466088B (en) * | 2022-01-07 | 2023-12-08 | 深圳华芯信息技术股份有限公司 | Data transmission method and device of sweeping robot, storage medium and terminal |
Also Published As
Publication number | Publication date |
---|---|
CN111328017B (en) | 2021-05-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111328017B (en) | Map transmission method and device | |
CN111202472B (en) | Terminal map construction method of sweeping robot, terminal equipment and sweeping system | |
CN114847803B (en) | Positioning method and device of robot, electronic equipment and storage medium | |
WO2021212926A1 (en) | Obstacle avoidance method and apparatus for self-walking robot, robot, and storage medium | |
AU2019208265B2 (en) | Moving robot, method for controlling the same, and terminal | |
CN110376934B (en) | Cleaning robot, cleaning robot control method and terminal control method | |
CN104858871B (en) | Robot system and self-built map thereof and the method for navigation | |
CN113296495B (en) | Path forming method and device of self-mobile equipment and automatic working system | |
WO2023016188A1 (en) | Map drawing method and apparatus, floor sweeper, storage medium, and electronic apparatus | |
CN104887155A (en) | Intelligent sweeper | |
JP2020077372A (en) | Data collection method and system therefor | |
CN105204505A (en) | Positioning video acquiring and drawing system and method based on sweeping robot | |
CN111220148A (en) | Mobile robot positioning method, system and device and mobile robot | |
WO2020010841A1 (en) | Autonomous vacuum cleaner positioning method and device employing gyroscope calibration based on visual loop closure detection | |
CN211022482U (en) | Cleaning robot | |
CN112204345A (en) | Indoor positioning method of mobile equipment, mobile equipment and control system | |
CN107229274B (en) | Position indication method, terminal device, self-propelled device, and program | |
WO2023025028A1 (en) | Charging method, charging apparatus, and robot | |
CN110134117A (en) | Mobile robot repositioning method, mobile robot and electronic equipment | |
CN108594823A (en) | Control method and control system of sweeping robot | |
CN110895334A (en) | Unmanned sweeper calibration device and method based on laser radar and GPS fusion virtual wall | |
CN111199677B (en) | Automatic work map establishing method and device for outdoor area, storage medium and working equipment | |
WO2023005377A1 (en) | Map building method for robot, and robot | |
CN113503877A (en) | Robot partition map establishing method and device and robot | |
CN113475977A (en) | Robot path planning method and device and robot |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||