CN111177295A - Image-building ghost eliminating method and device, computer-readable storage medium and robot - Google Patents


Publication number
CN111177295A
CN111177295A (application CN201911384602.0A)
Authority
CN
China
Prior art keywords
loop
point
key frame
detected
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911384602.0A
Other languages
Chinese (zh)
Inventor
刘洪剑
刘志超
张思民
赵云
赵文恺
庞建新
熊友军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ubtech Robotics Corp
Original Assignee
Ubtech Robotics Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ubtech Robotics Corp filed Critical Ubtech Robotics Corp
Priority to CN201911384602.0A priority Critical patent/CN111177295A/en
Publication of CN111177295A publication Critical patent/CN111177295A/en
Pending legal-status Critical Current

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 — Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20 — Information retrieval of structured data, e.g. relational data
    • G06F 16/29 — Geographical information databases

Abstract

The application belongs to the technical field of robots, and particularly relates to a map-building ghost elimination method and device, a computer-readable storage medium, and a robot. In the method, loop point detection is performed while the robot moves to build the map, a loop point being a position point on a repeated path that the robot has traversed; with the first detected loop point as the starting point of reverse detection, the loop points corresponding to each historical keyframe on the repeated path are detected in turn; and a loop-closure operation is performed on each detected loop point to obtain a map with the ghosting eliminated. In the embodiments of the application, after the first loop point is detected, the loop-closure operation is not performed on it directly as in the prior art; instead, that loop point serves as the starting point of a reverse detection that finds the loop points corresponding to all the historical keyframes on the repeated path, and the loop-closure operation is finally performed on each detected loop point, effectively eliminating the ghosting that may appear on the repeated path.

Description

Image-building ghost eliminating method and device, computer-readable storage medium and robot
Technical Field
The application belongs to the technical field of robots, and particularly relates to a map-building ghost elimination method and device, a computer-readable storage medium, and a robot.
Background
Building a grid map is one of the essential skills of a mobile robot: the map is a precondition for the robot's autonomous positioning, path planning and navigation. A good map lets the robot localize more accurately, plan paths more reasonably and navigate more safely. The most basic criterion for judging the quality of a map is the presence or absence of ghosting. Ghosting usually appears in ring-shaped maps at the point where the map loop is closed: because loop closure lags, the lidar may scan the same environment twice, and if the two scanned maps cannot establish a correct position constraint relationship, they are superimposed together and produce severe ghosting.
In the ring map shown in fig. 1, A is the starting point of map building. When the robot travels around the entire ring and returns to the starting point, its actual position is a because of accumulated error, although A and a are really the same point. To close the ring, the accumulated error must be eliminated by a loop-closure adjustment, so for autonomous map building the robot must be able to detect that it has reached a loop point and perform the loop-closure operation. However, since loop detection needs a certain amount of computation time and a confirmation process, the robot usually cannot detect the loop point immediately at a and must keep moving forward for some distance; the distance it covers before the loop-closure operation is performed is the repeated path, i.e. the segment from a to B in the figure. The robot detects the loop at point B and performs the loop-closure operation there.
FIG. 2 illustrates the loop-closure logic of the prior art: the real-time sub-map acquired by the robot at B is matched against the historical sub-map previously acquired at the same place; if matching succeeds, the optimal position constraint at B is obtained, which guarantees that the two sub-maps overlap almost perfectly, without ghosting, when superimposed. But this only guarantees "seamless" superposition of the maps at the loop point B itself; it cannot guarantee a "seamless" connection along the history path that has already been traveled (the a-B segment). As a result, ghosting may still appear on the a-B repeated path after the loop-closure operation, and the longer the a-B segment, the more visible the ghosting.
Disclosure of Invention
In view of this, embodiments of the present application provide a map-building ghost elimination method and apparatus, a computer-readable storage medium, and a robot, so as to solve the problem that ghosting may remain on the repeated path of a map built with existing loop-closure methods.
A first aspect of an embodiment of the present application provides a method for eliminating image ghosting, which may include:
in the process of moving the robot to construct the image, detecting a looping point, wherein the looping point is a position point on a repeated path passed by the robot;
respectively detecting loop points corresponding to all historical key frames on the repeated path by taking the detected first loop point as a starting point of reverse detection;
and respectively carrying out loop back operation on each detected loop back point to obtain the map with the ghost eliminated.
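The three steps above can be orchestrated as in the following minimal Python sketch. It is an illustration under stated assumptions, not the patent's implementation: the detection and loop-closure routines are abstracted as callables, and all names are hypothetical.

```python
def eliminate_ghosts(first_loop_point, repeated_path_keyframes, detect_fn, close_loop_fn):
    """Sketch of the claimed method: starting from the first detected loop point,
    reverse-detect a loop point for each historical keyframe on the repeated path,
    then perform a loop-closure operation on every loop point found.
    detect_fn(reference_loop_point, target_keyframe) returns a loop point or None.
    """
    loop_points = [first_loop_point]
    # keyframes are ordered along the path; reverse detection walks them backwards
    for keyframe in reversed(repeated_path_keyframes):
        found = detect_fn(loop_points[-1], keyframe)
        if found is None:          # no loop point detected: end the reverse detection
            break
        loop_points.append(found)
    for point in loop_points:      # close the loop at every detected point
        close_loop_fn(point)
    return loop_points
```

With a toy `detect_fn` that simply counts down and gives up at 1, the call returns every detected loop point and closes each one in turn.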
Further, the detecting loop points corresponding to the respective historical keyframes on the repeated path, with the detected first loop point as a starting point of the reverse detection, includes:
detecting, with the detected nth loop point as a reference, the loop point corresponding to a target historical keyframe, and taking the detected loop point corresponding to the target historical keyframe as the (n+1)th loop point, wherein the target historical keyframe is the historical keyframe immediately preceding the one corresponding to the nth loop point, and n is a positive integer.
Further, the detecting the loop point corresponding to the target historical key frame with the detected nth loop point as a reference includes:
calculating a position difference between a first position point and a second position point, wherein the first position point is a position point determined by a history key frame corresponding to the nth loop point, and the second position point is a position point determined by the target history key frame;
determining the estimated position of the loop point corresponding to the target historical key frame according to the nth loop point and the position difference;
and determining a search range according to the estimated position and a preset deviation threshold, and detecting a loop point corresponding to the target historical key frame in the search range.
Further, the determining the estimated position of the looping point corresponding to the target historical keyframe according to the nth looping point and the position difference includes:
calculating the estimated position of the loop point corresponding to the target historical key frame according to the following formula:
CirclePos_{n+1}' = CirclePos_n + Delta_{n,n+1}
wherein CirclePos_n is the position of the nth loop point, Delta_{n,n+1} is the position difference, and CirclePos_{n+1}' is the estimated position.
Further, the determining a search range according to the estimated position and a preset deviation threshold value comprises:
and determining an area with the estimated position as the center and the deviation threshold as the radius as the search range.
Further, the image ghosting elimination method may further include:
and if the loop point corresponding to the target historical key frame is not detected, ending the reverse detection process.
A second aspect of an embodiment of the present application provides an image ghosting elimination apparatus, which may include:
the robot image construction system comprises a first detection module, a second detection module and a third detection module, wherein the first detection module is used for executing loop point detection in the process of moving a robot to construct an image, and the loop point is a position point on a repeated path passed by the robot;
the second detection module is used for respectively detecting loop points corresponding to all historical key frames on the repeated path by taking the detected first loop point as a starting point of reverse detection;
and the loop returning operation module is used for respectively performing loop returning operation on each detected loop returning point to obtain the map with the ghost eliminated.
Further, the second detection module may include:
a reverse detection unit, configured to detect, with the detected nth loop point as a reference, the loop point corresponding to a target historical keyframe, and to take the detected loop point as the (n+1)th loop point, wherein the target historical keyframe is the historical keyframe immediately preceding the one corresponding to the nth loop point, and n is a positive integer.
Further, the reverse direction detection unit may include:
a position difference calculating subunit, configured to calculate a position difference between a first position point and a second position point, where the first position point is a position point determined by a history key frame corresponding to the nth loop point, and the second position point is a position point determined by the target history key frame;
an estimated position determining subunit, configured to determine, according to the nth looping point and the position difference, an estimated position of a looping point corresponding to the target historical keyframe;
the search range determining subunit is used for determining a search range according to the estimated position and a preset deviation threshold;
and the detection subunit is used for detecting the loop point corresponding to the target historical key frame in the search range.
Further, the estimated position determining subunit is specifically configured to calculate the estimated position of the loop point corresponding to the target historical keyframe according to the following expression:
CirclePos_{n+1}' = CirclePos_n + Delta_{n,n+1}
wherein CirclePos_n is the position of the nth loop point, Delta_{n,n+1} is the position difference, and CirclePos_{n+1}' is the estimated position.
Further, the search range determination subunit is specifically configured to determine, as the search range, an area centered on the estimated position and having the deviation threshold as a radius.
Further, the second detection module may include:
and the detection ending unit is used for ending the reverse detection process if the loop point corresponding to the target historical key frame is not detected.
A third aspect of the embodiments of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of any of the image ghosting elimination methods described above.
A fourth aspect of the embodiments of the present application provides a robot, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of any one of the above-mentioned image ghosting elimination methods when executing the computer program.
A fifth aspect of the embodiments of the present application provides a computer program product which, when run on a robot, causes the robot to perform the steps of any of the image ghosting elimination methods described above.
Compared with the prior art, the embodiments of the application have the following advantage: loop point detection is performed while the robot moves to build the map, a loop point being a position point on a repeated path that the robot has traversed; with the first detected loop point as the starting point of reverse detection, the loop points corresponding to each historical keyframe on the repeated path are detected in turn; and a loop-closure operation is performed on each detected loop point to obtain a map with the ghosting eliminated. After the first loop point is detected, the loop-closure operation is not performed on it directly as in the prior art; instead, that loop point serves as the starting point of a reverse detection that finds the loop points corresponding to all historical keyframes on the repeated path, and the loop-closure operation is finally performed on each detected loop point, so that a seamless connection of the entire repeated path is guaranteed to the greatest extent and the ghosting that may appear on the repeated path is effectively eliminated.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the embodiments or the prior art descriptions will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise.
FIG. 1 is a schematic illustration of ghosting occurring on a repeated path;
FIG. 2 is a schematic diagram of a cause of ghost generation;
FIG. 3 is a flowchart of an embodiment of a method for ghost elimination in an embodiment of the present application;
FIG. 4 is a schematic diagram of ghost elimination on a repeat path;
FIG. 5 is a schematic flow chart of detecting a looping point corresponding to a target historical keyframe;
FIG. 6 is a schematic diagram of a mechanism for eliminating ghosts;
FIG. 7 is a block diagram of an embodiment of a ghosting elimination apparatus in an embodiment of the present application;
fig. 8 is a schematic block diagram of a robot in an embodiment of the present application.
Detailed Description
In order to make the objects, features and advantages of the present invention more apparent and understandable, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the embodiments described below are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon" or "in response to a determination" or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [ described condition or event ] is detected" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon detecting [ described condition or event ]" or "in response to detecting [ described condition or event ]".
In addition, in the description of the present application, the terms "first," "second," "third," and the like are used solely to distinguish one from another and are not to be construed as indicating or implying relative importance.
Referring to fig. 3, an embodiment of a method for ghost image cancellation in the present application may include:
and S301, executing loop point detection in the process of drawing construction of the robot.
A loop point is a position point on a repeated path that the robot has traversed, and the loop point detection in this step may use any existing loop point detection method, which is not described again in this embodiment of the present application.
Step S302, using the detected first loop point as a starting point of reverse detection, and respectively detecting loop points corresponding to each historical keyframe on the repeated path.
After the first looping point is detected, the embodiment of the application not only adds the position constraint at the looping point, but also further considers each historical key frame on the repeated path to establish the position constraint relationship of the historical key frame.
Taking fig. 4 as an example, the robot keeps detecting loop points while traveling on the repeated path. Suppose the loop point D is found at d, and a-b-c-d is the repeated path traveled so far; the loop-closure algorithm then establishes a position constraint between d and D. Next, position constraints are established between a and A, b and B, and c and C, ensuring that the keyframes a-b-c-d and A-B-C-D overlap without ghosting when superimposed.
In a specific implementation of the embodiment of the present application, the loop point corresponding to a target historical keyframe is detected with the detected nth loop point as a reference, and the detected loop point is taken as the (n+1)th loop point, where the target historical keyframe is the historical keyframe immediately preceding the one corresponding to the nth loop point and n is a positive integer with an initial value of 1. After the (n+1)th loop point has been detected with the nth loop point as the reference, n is increased by one counting unit, i.e. n = n+1, and the process is executed again. The iteration repeats: with the detected 1st loop point as the reference, reverse detection yields the 2nd loop point; with the 2nd loop point as the reference, reverse detection yields the 3rd loop point; with the 3rd loop point as the reference, reverse detection yields the 4th loop point; and so on, until no loop point corresponding to the target historical keyframe is detected, at which point the reverse detection process ends.
Regarding the keyframes mentioned in the embodiments of the present application, take a robot that builds the map with a lidar sensor as an example. After the robot starts navigating, it may use the first laser data frame as a keyframe. When a new laser data frame is subsequently acquired, it is compared with the latest keyframe: if the position difference between the two frames is greater than a preset position difference threshold, or the angle difference between them is greater than a preset angle threshold, the new laser data frame is determined to be a new keyframe; otherwise, if the position difference is less than or equal to the position difference threshold and the angle difference is less than or equal to the angle threshold, it is not a new keyframe. The specific values of the position difference threshold and the angle threshold may be set according to the actual situation, and the embodiment of the present application does not specifically limit them. Preferably, if the time difference between the acquisition time of the new laser data frame and that of the latest keyframe is greater than a preset time threshold, the new laser data frame may also be determined to be a new keyframe; the specific value of the time threshold may likewise be set according to the actual situation. Repeating these processes yields each keyframe of the map-building process in turn.
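The keyframe selection rule just described can be sketched as follows. This is a minimal illustration, not the patent's code: poses are simplified to (x, y, theta, t) tuples and the threshold values are placeholders, since the patent leaves them to be set according to the actual situation.

```python
import math

def is_new_keyframe(frame, last_keyframe,
                    pos_threshold=0.3,                 # metres, illustrative value
                    angle_threshold=math.radians(10.0),
                    time_threshold=5.0):               # seconds, illustrative value
    """frame / last_keyframe: (x, y, theta, t) tuples.
    A new laser frame becomes a keyframe if its position difference OR its angle
    difference from the latest keyframe exceeds the corresponding threshold, or
    (optionally) if too much time has passed since the latest keyframe."""
    x, y, theta, t = frame
    kx, ky, ktheta, kt = last_keyframe
    pos_diff = math.hypot(x - kx, y - ky)
    # wrap the angle difference into [-pi, pi] before taking its magnitude
    angle_diff = abs(math.atan2(math.sin(theta - ktheta), math.cos(theta - ktheta)))
    time_diff = t - kt
    return (pos_diff > pos_threshold
            or angle_diff > angle_threshold
            or time_diff > time_threshold)
```

A frame that has barely moved, barely rotated and arrived soon after the latest keyframe is rejected; exceeding any one threshold is enough to accept it.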
In a specific implementation of the embodiment of the present application, the process of detecting a loop point corresponding to a target historical key frame by using the detected nth loop point as a reference may specifically include the steps shown in fig. 5:
step S3021, calculating a position difference between the first position point and the second position point.
The first position point is a position point determined by a history key frame corresponding to the nth loop point, and the second position point is a position point determined by the target history key frame.
And step S3022, determining the estimated position of the loop point corresponding to the target historical key frame according to the nth loop point and the position difference.
In a specific implementation of the embodiment of the present application, the estimated position of the loop point corresponding to the target historical keyframe may be calculated according to the following formula:
CirclePos_{n+1}' = CirclePos_n + Delta_{n,n+1}
wherein CirclePos_n is the position of the nth loop point, Delta_{n,n+1} is the position difference, and CirclePos_{n+1}' is the estimated position.
Step S3023, determining a search range according to the estimated position and a preset deviation threshold, and detecting a loop point corresponding to the target historical key frame in the search range.
In a specific implementation of the embodiment of the present application, an area with the estimated position as a center and the deviation threshold as a radius may be determined as the search range, and a loop point corresponding to the target historical keyframe is detected in the search range, where the deviation threshold may be set according to an actual situation, for example, it may be set to 1 meter, 2 meters, or another value, and preferably, the deviation threshold may be set to 0.5 meter.
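Steps S3021 to S3023 can be sketched as follows. This is an illustrative Python sketch that treats positions as 2-D points; the function and parameter names are assumptions, and only the formula and the circular search range come from the description above.

```python
import math

def estimate_loop_position(circle_pos_n, first_point, second_point):
    """CirclePos_{n+1}' = CirclePos_n + Delta_{n,n+1}, where Delta_{n,n+1} is the
    position difference between the keyframe of the nth loop point (first_point)
    and the target historical keyframe (second_point)."""
    delta = (second_point[0] - first_point[0], second_point[1] - first_point[1])
    return (circle_pos_n[0] + delta[0], circle_pos_n[1] + delta[1])

def in_search_range(candidate, estimated_pos, deviation_threshold=0.5):
    """The search range is the circle centred on the estimated position, with the
    deviation threshold (0.5 m is the preferred value in the text) as its radius."""
    dx = candidate[0] - estimated_pos[0]
    dy = candidate[1] - estimated_pos[1]
    return math.hypot(dx, dy) <= deviation_threshold
```

For example, if the nth loop point is at (10, 2) and the target keyframe lies one metre behind the nth keyframe along x, the (n+1)th loop point is estimated at (9, 2), and only candidates within 0.5 m of that estimate are matched.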
Still taking fig. 4 as an example, at the loop point the robot may, because of accumulated error, be relatively far, e.g. 4 meters, from the true loop position D; this is represented in fig. 4 by the distance between d and D. To find the loop point D, the robot has to search a wide range of positions, which is computationally quite expensive; if the historical keyframes corresponding to a, b and c all searched for their loop points this way, it would amount to performing multiple full loop searches and make the loop-closure process last far too long. To find the loop positions for a, b and c quickly, the embodiments of the present application make full use of the relative position relationships between keyframes. Taking keyframe c as an example, the relative position delta_cd between c and d is calculated first. The loop search determines that d coincides with D after loop closure, and since delta_cd is a fixed relative position between c and d, the approximate position of c after loop closure can be predicted as c' = D + delta_cd.
It should be noted that c' obtained by the above process only predicts the possible position of c after loop closure from the positional relationship between c and d; it does not exactly match the true position C, and if c' were taken directly as the position of c after loop closure, a ghost-free superposition still could not be guaranteed. However, c' is already close to the true position C, so the matching range can be reduced: searching only a small range near c' is enough to find the true position C and establish the relative position constraint between c and C. Compared with the first loop search, the search range is greatly reduced, and so is the search time.
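The reduced-range matching described above might look like the following sketch. It is an assumption-laden illustration: `match_score` stands in for a scan-matching routine the patent does not specify, and the radius and score threshold are placeholder values.

```python
import math

def search_near_prediction(predicted_pos, candidate_poses, match_score,
                           radius=0.5, min_score=0.6):
    """Search for the true loop position only inside a small neighbourhood of the
    predicted position c', instead of a wide full-map search. Returns the
    best-matching candidate pose, or None if nothing in range matches well enough."""
    best_pose, best_score = None, min_score
    for pose in candidate_poses:
        if math.hypot(pose[0] - predicted_pos[0], pose[1] - predicted_pos[1]) > radius:
            continue                       # outside the reduced search range
        score = match_score(pose)
        if score > best_score:             # keep the highest-scoring candidate
            best_pose, best_score = pose, score
    return best_pose
```

Returning None corresponds to the case in the text where no loop point is found for the target historical keyframe, which ends the reverse detection.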
By repeating the above process, the position constraints between the remaining historical keyframe pairs in fig. 4, a-A and b-B, can be established, completing the detection of the loop points corresponding to each historical keyframe on the repeated path.
Step S303, performing a loop-closure operation on each detected loop point to obtain the map with the ghosting eliminated.
After the position constraint relations of all historical key frames on the repeated path are established, loop returning operation can be carried out, and no ghost image exists on the repeated path after loop returning.
Fig. 6 shows a map of a loop closed over a repeated path after optimization by an embodiment of the application. As can be seen in the figure, the ghosting produced by the loop closure of the previous algorithm has disappeared.
In summary, in the embodiments of the present application, loop point detection is performed while the robot moves to build the map, a loop point being a position point on a repeated path that the robot has traversed; with the first detected loop point as the starting point of reverse detection, the loop points corresponding to each historical keyframe on the repeated path are detected in turn; and a loop-closure operation is performed on each detected loop point to obtain the map with the ghosting eliminated. After the first loop point is detected, the loop-closure operation is not performed on it directly as in the prior art; instead, that loop point serves as the starting point of a reverse detection that finds the loop points corresponding to all historical keyframes on the repeated path, and the loop-closure operation is finally performed on each detected loop point, so that a seamless connection of the entire repeated path is guaranteed to the greatest extent and the ghosting that may appear on the repeated path is effectively eliminated.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 7 is a block diagram of an embodiment of a ghosting image elimination apparatus according to an embodiment of the present application.
In this embodiment, an image-creating ghost eliminating apparatus may include:
a first detection module 701, configured to perform looping point detection in a process of a robot performing a mapping process, where a looping point is a position point on a repeated path that the robot passes through;
a second detecting module 702, configured to use the detected first looping point as a starting point of reverse detection, and respectively detect looping points corresponding to each historical key frame on the repeated path;
a loop operation module 703, configured to perform loop operation on each detected loop point respectively, so as to obtain a map with the ghost eliminated.
Further, the second detection module may include:
a reverse detection unit, configured to detect, with the detected nth loop point as a reference, the loop point corresponding to a target historical keyframe, and to take the detected loop point as the (n+1)th loop point, wherein the target historical keyframe is the historical keyframe immediately preceding the one corresponding to the nth loop point, and n is a positive integer.
Further, the reverse direction detection unit may include:
a position difference calculating subunit, configured to calculate a position difference between a first position point and a second position point, where the first position point is a position point determined by a history key frame corresponding to the nth loop point, and the second position point is a position point determined by the target history key frame;
an estimated position determining subunit, configured to determine, according to the nth looping point and the position difference, an estimated position of a looping point corresponding to the target historical keyframe;
the search range determining subunit is used for determining a search range according to the estimated position and a preset deviation threshold;
and the detection subunit is used for detecting the loop point corresponding to the target historical key frame in the search range.
Further, the estimated position determining subunit is specifically configured to calculate the estimated position of the loop point corresponding to the target historical keyframe according to the following expression:
CirclePos_{n+1}' = CirclePos_n + Delta_{n,n+1}
wherein CirclePos_n is the position of the nth loop point, Delta_{n,n+1} is the position difference, and CirclePos_{n+1}' is the estimated position.
Further, the search range determination subunit is specifically configured to determine, as the search range, an area centered on the estimated position and having the deviation threshold as a radius.
Further, the second detection module may include:
and the detection ending unit is used for ending the reverse detection process if the loop point corresponding to the target historical key frame is not detected.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses, modules and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Fig. 8 shows a schematic block diagram of a robot provided in an embodiment of the present application, and only a part related to the embodiment of the present application is shown for convenience of explanation.
As shown in fig. 8, the robot 8 of this embodiment includes: a processor 80, a memory 81 and a computer program 82 stored in said memory 81 and executable on said processor 80. The processor 80, when executing the computer program 82, implements the steps in the above-described embodiments of the image ghosting elimination method, such as the steps S301 to S303 shown in fig. 3. Alternatively, the processor 80, when executing the computer program 82, implements the functions of each module/unit in the above-mentioned device embodiments, for example, the functions of the modules 701 to 703 shown in fig. 7.
Illustratively, the computer program 82 may be partitioned into one or more modules/units that are stored in the memory 81 and executed by the processor 80 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 82 in the robot 8.
Those skilled in the art will appreciate that fig. 8 is merely an example of the robot 8 and does not constitute a limitation on the robot 8, which may include more or fewer components than shown, combine certain components, or have different components; for example, the robot 8 may also include input and output devices, network access devices, a bus, and the like.
The processor 80 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 81 may be an internal storage unit of the robot 8, such as a hard disk or memory of the robot 8. The memory 81 may also be an external storage device of the robot 8, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card equipped on the robot 8. Further, the memory 81 may include both an internal storage unit and an external storage device of the robot 8. The memory 81 is used to store the computer program and other programs and data required by the robot 8, and may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/robot and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/robot are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow in the methods of the embodiments described above may be implemented by a computer program, which may be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in a jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals, in accordance with legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A method for eliminating image ghosting is characterized by comprising the following steps:
in the process of the robot moving to build the map, detecting a loop point, wherein the loop point is a position point on a repeated path traversed by the robot;
respectively detecting loop points corresponding to all historical key frames on the repeated path by taking the detected first loop point as a starting point of reverse detection;
and performing a loop closing operation on each detected loop point respectively to obtain a map with ghosting eliminated.
2. The method for eliminating image ghosting according to claim 1, wherein the detecting loop points corresponding to the respective historical keyframes on the repeated path respectively with the detected first loop point as a starting point of the reverse detection comprises:
and taking the detected nth loop point as a reference, detecting a loop point corresponding to a target historical key frame, and taking the detected loop point corresponding to the target historical key frame as an n +1 th loop point, wherein the target historical key frame is a previous historical key frame of the historical key frame corresponding to the nth loop point, and n is a positive integer.
3. The method for eliminating image ghosting according to claim 2, wherein the detecting the loop point corresponding to the target historical key frame by taking the detected nth loop point as a reference comprises:
calculating a position difference between a first position point and a second position point, wherein the first position point is a position point determined by a history key frame corresponding to the nth loop point, and the second position point is a position point determined by the target history key frame;
determining the estimated position of the loop point corresponding to the target historical key frame according to the nth loop point and the position difference;
and determining a search range according to the estimated position and a preset deviation threshold, and detecting a loop point corresponding to the target historical key frame in the search range.
4. The method according to claim 3, wherein the determining the estimated position of the loop point corresponding to the target historical keyframe from the nth loop point and the position difference comprises:
calculating the estimated position of the loop point corresponding to the target historical key frame according to the following formula:
CirclePos_{n+1}' = CirclePos_n + Delta_{n,n+1}
wherein CirclePos_n is the position of the nth loop point, Delta_{n,n+1} is the position difference, and CirclePos_{n+1}' is the estimated position.
5. The method of claim 3, wherein the determining a search range based on the estimated position and a preset bias threshold comprises:
and determining an area with the estimated position as the center and the deviation threshold as the radius as the search range.
6. The image ghosting elimination method of any of claims 2 to 5, further comprising:
and if the loop point corresponding to the target historical key frame is not detected, ending the reverse detection process.
7. An image ghosting elimination apparatus, comprising:
a first detection module, configured to detect a loop point in the process of the robot moving to build the map, wherein the loop point is a position point on a repeated path traversed by the robot;
the second detection module is used for respectively detecting loop points corresponding to all historical key frames on the repeated path by taking the detected first loop point as a starting point of reverse detection;
and a loop closing operation module, configured to perform a loop closing operation on each detected loop point respectively to obtain a map with ghosting eliminated.
8. The image ghosting elimination apparatus of claim 7, wherein the second detection module comprises:
and a reverse detection unit, configured to detect a loop point corresponding to a target historical key frame by taking the detected nth loop point as a reference, and to take the detected loop point corresponding to the target historical key frame as the (n+1)th loop point, wherein the target historical key frame is the previous historical key frame of the historical key frame corresponding to the nth loop point, and n is a positive integer.
9. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method of image ghosting elimination as set forth in any of claims 1 to 6.
10. A robot comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, carries out the steps of the method of image ghosting elimination as claimed in any of claims 1 to 6.
CN201911384602.0A 2019-12-28 2019-12-28 Image-building ghost eliminating method and device, computer-readable storage medium and robot Pending CN111177295A (en)
