CN114088085A - Robot position determining method and device, electronic equipment and storage medium

Robot position determining method and device, electronic equipment and storage medium

Info

Publication number
CN114088085A
CN114088085A (application number CN202111392025.7A)
Authority
CN
China
Prior art keywords
robot
current position
sensor
function
information
Prior art date
Legal status
Granted
Application number
CN202111392025.7A
Other languages
Chinese (zh)
Other versions
CN114088085B (en)
Inventor
姚秀勇
Current Assignee
Anker Innovations Co Ltd
Original Assignee
Anker Innovations Co Ltd
Priority date
Filing date
Publication date
Application filed by Anker Innovations Co Ltd filed Critical Anker Innovations Co Ltd
Priority to CN202111392025.7A priority Critical patent/CN114088085B/en
Publication of CN114088085A publication Critical patent/CN114088085A/en
Application granted granted Critical
Publication of CN114088085B publication Critical patent/CN114088085B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/005 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C 21/20 - Instruments for performing navigational calculations
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02 - Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Length Measuring Devices With Unspecified Measuring Means (AREA)
  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The embodiment of the disclosure discloses a position determining method and device for a robot, electronic equipment and a storage medium. The robot includes an object sensor including at least one of an odometer sensor and an optical flow sensor, and the method comprises: acquiring current position information of the robot and reading information of the target sensor; constructing an objective function based on the current position information and the reading information, wherein the objective function comprises a switch variable, and the switch variable is used for controlling the magnitude of a function value of a relation function; and determining next position information corresponding to the acquired current position information based on the objective function. By controlling the magnitude of the function value of the relation function through the switch variable, the disclosed embodiments can improve the accuracy of the position determination of the robot when at least one of the odometer sensor and the optical flow sensor provides erroneous information.

Description

Robot position determining method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of positioning, and in particular, to a method and an apparatus for determining a position of a robot, an electronic device, and a storage medium.
Background
In the prior art, various sensors are generally provided in a robot, for example, an odometer sensor, an optical flow sensor, a loop detection sensor, and the like.
Through the arranged sensors, the positioning, path planning, position prediction and the like of the robot can be realized.
However, positioning, path planning and position prediction of the robot depend heavily on the reading information of these sensors; once a sensor provides wrong information, the accuracy of the position determination of the robot is seriously affected.
Disclosure of Invention
In view of the above, to solve at least some of the above technical problems, embodiments of the present disclosure provide a method and an apparatus for determining a position of a robot, an electronic device, and a storage medium.
In a first aspect, an embodiment of the present disclosure provides a method for determining a position of a robot, where the robot includes an object sensor, and the object sensor includes at least one of an odometer sensor and an optical flow sensor, and the method includes:
acquiring current position information of the robot and reading information of the target sensor;
constructing an objective function based on the current position information and the reading information, wherein the objective function comprises a switch variable for controlling the magnitude of a function value of a relation function representing the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the target sensor;
and determining next position information corresponding to the acquired current position information based on the objective function.
Optionally, in the method according to any embodiment of the present disclosure, the constructing an objective function based on the current position information and the reading information includes:
and constructing an objective function based on the current position information, the reading information and the uncertainty of the reading information.
Optionally, in the method of any embodiment of the present disclosure, the target sensor includes an odometer sensor and an optical flow sensor; and
the constructing an objective function based on the current position information and the reading information includes:
constructing an objective function based on the current position information, the reading information of the odometer sensor, the reading information of the optical flow sensor, the first switch variable and the second switch variable;
the first switching variable controls a magnitude of a function value of a first relational function representing a correspondence between a current position of the robot and a next position of the robot calculated based on reading information of the odometer sensor, and the second switching variable controls a magnitude of a function value of a second relational function representing a correspondence between a current position of the robot and a next position of the robot calculated based on reading information of the optical flow sensor.
Optionally, in the method according to any embodiment of the present disclosure, the objective function further includes a constraint term of the first switching variable and a constraint term of the second switching variable.
Optionally, in the method according to any of the embodiments of the present disclosure, the robot further includes a loop detection sensor; and
the constructing an objective function based on the current position information and the reading information includes:
constructing an objective function based on the current position information, the reading information and the reading information of the loop detection sensor;
the objective function further includes a third switching variable for controlling a magnitude of a function value of a third relation function representing a correspondence between the current position of the robot and a loop position of the robot calculated based on the reading information of the loop detection sensor.
Optionally, in the method according to any embodiment of the present disclosure, the objective function further includes a constraint term of the third switching variable.
Optionally, in a method according to any embodiment of the present disclosure, the determining, based on the objective function, the next location information corresponding to the obtained current location information includes:
and calculating the next position information corresponding to the current position information with the goal of minimizing the objective function.
In a second aspect, an embodiment of the present disclosure provides a position determining apparatus for a robot, the robot including an object sensor, the object sensor including at least one of an odometer sensor and an optical flow sensor, the apparatus including:
an acquisition unit configured to acquire current position information of the robot and reading information of the target sensor;
a construction unit configured to construct an objective function based on the current position information and the reading information, wherein the objective function includes a switching variable for controlling a magnitude of a function value of a relation function representing a correspondence between a current position of the robot and a next position of the robot calculated based on the reading information of the target sensor;
and the determining unit is configured to determine the next position information corresponding to the acquired current position information based on the objective function.
Optionally, in an apparatus according to any embodiment of the present disclosure, the building unit includes:
a first constructing subunit configured to construct an objective function based on the current position information, the reading information, and the uncertainty of the reading information.
Optionally, in the apparatus of any embodiment of the present disclosure, the target sensor includes an odometer sensor and an optical flow sensor; and
the above-mentioned building element includes:
a second construction subunit configured to construct an objective function based on the current position information, the reading information of the odometer sensor, the reading information of the optical flow sensor, the first switching variable, and the second switching variable;
the first switching variable controls a magnitude of a function value of a first relational function representing a correspondence between a current position of the robot and a next position of the robot calculated based on reading information of the odometer sensor, and the second switching variable controls a magnitude of a function value of a second relational function representing a correspondence between a current position of the robot and a next position of the robot calculated based on reading information of the optical flow sensor.
Optionally, in the apparatus according to any embodiment of the present disclosure, the objective function further includes a constraint term of the first switching variable and a constraint term of the second switching variable.
Optionally, in the apparatus according to any embodiment of the present disclosure, the robot further includes a loop detection sensor; and
the above-mentioned building element includes:
a third constructing subunit configured to construct an objective function based on the current position information, the reading information, and reading information of the loop detection sensor;
the objective function further includes a third switching variable, where the third switching variable is used to control a magnitude of a function value of a third relation function, and the third relation function represents a correspondence between a current position of the robot and a loop position of the robot calculated based on reading information of the loop detection sensor.
Optionally, in the apparatus according to any embodiment of the present disclosure, the objective function further includes a constraint term of the third switching variable.
Optionally, in an apparatus according to any embodiment of the present disclosure, the determining unit includes:
and a calculation subunit configured to calculate the next position information corresponding to the current position information with the goal of minimizing the objective function.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including:
a memory for storing a computer program;
a processor for executing the computer program stored in the memory, and when the computer program is executed, the method of any embodiment of the position determination method of the robot of the first aspect of the present disclosure is realized.
In a fourth aspect, the disclosed embodiments provide a computer-readable medium storing a computer program which, when executed by a processor, implements the method of any of the embodiments of the method for determining a position of a robot as described in the first aspect above.
In a fifth aspect, embodiments of the present disclosure provide a computer program comprising computer readable code which, when run on a device, causes a processor in the device to execute instructions for implementing the steps in the method according to any one of the embodiments of the method for determining a position of a robot as described above in the first aspect.
In the method for determining a position of a robot according to the above embodiment of the present disclosure, the robot includes an object sensor, and the object sensor includes at least one of an odometer sensor and an optical flow sensor. Accordingly, current position information of the robot and reading information of the target sensor are acquired, then, an objective function is constructed based on the current position information and the reading information, wherein the objective function includes a switching variable for controlling a magnitude of a function value of a relationship function representing a correspondence between the current position of the robot and a next position of the robot calculated based on the reading information of the target sensor, and finally, next position information corresponding to the acquired current position information is determined based on the objective function. The embodiment of the disclosure controls the magnitude of the function value of the relation function through the switch variable, so that the magnitude of the function value of the relation function can be limited when at least one of the odometer sensor and the optical flow sensor provides wrong reading information, thereby reducing the influence degree of the sensor providing wrong information on the position determination, and further improving the accuracy of the position determination of the robot.
The technical solution of the present disclosure is further described in detail by the accompanying drawings and embodiments.
Drawings
Other features, objects and advantages of the present disclosure will become more apparent upon reading of the detailed description of non-limiting embodiments made with reference to the following drawings:
fig. 1 is an exemplary system architecture diagram of a position determining method of a robot or a position determining apparatus of a robot according to an embodiment of the present disclosure;
fig. 2 is a flowchart of a position determining method of a robot according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of one application scenario for the embodiment of FIG. 2;
fig. 4 is a flowchart of another method for determining a position of a robot according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of a position determining apparatus of a robot according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of an electronic device provided in an embodiment of the present disclosure.
Detailed Description
Various exemplary embodiments of the present disclosure will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of parts and steps, numerical expressions, and values set forth in these embodiments do not limit the scope of the present disclosure unless specifically stated otherwise.
It will be understood by those within the art that the terms "first," "second," and the like in the embodiments of the present disclosure are used merely to distinguish one object, step, device, or module from another object, and do not denote any particular technical meaning or logical order therebetween.
It is also understood that in embodiments of the present disclosure, "a plurality" may refer to two or more and "at least one" may refer to one, two or more.
It is also to be understood that any reference to any component, data, or structure in the embodiments of the disclosure, may be generally understood as one or more, unless explicitly defined otherwise or stated otherwise.
In addition, the term "and/or" in the present disclosure is only one kind of association relationship describing an associated object, and means that three kinds of relationships may exist, for example, a and/or B may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" in the present disclosure generally indicates that the former and latter associated objects are in an "or" relationship.
It should also be understood that the description of the various embodiments of the present disclosure emphasizes the differences between the various embodiments, and the same or similar parts may be referred to each other, so that the descriptions thereof are omitted for brevity.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the disclosure, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
It should be noted that, in the present disclosure, the embodiments and the features of the embodiments may be combined with each other without conflict. The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 is a diagram of an exemplary system architecture of a position determining method of a robot or a position determining apparatus of a robot according to an embodiment of the present disclosure.
As shown in fig. 1, the system architecture 100 may include a robot 101. Optionally, the system architecture 100 may further include a network 103 and a server 102, the network 103 may provide a medium for a communication link between the robot 101 and the server 102. Network 103 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The robot 101 and the server 102 may interact with each other via the network 103 to receive or transmit data and the like. Here, at least one of the robot 101 and the server 102 may be an execution subject of each step in the position determination method of the robot described in the embodiment of the present disclosure. For example, the position determination method of the robot described in the embodiment of the present disclosure may be performed by the robot 101, by the server 102, or by the robot 101 and the server 102 cooperating with each other.
The execution main body of the method for determining the position of the robot provided by the embodiment of the present disclosure may be hardware or software, and is not limited specifically herein.
Further, the robot 101 may include at least one of an odometer sensor and an optical flow sensor, and optionally, the robot 101 may further include a loop detection sensor.
It should be understood that the number of robots, servers, and networks in fig. 1 is merely illustrative. There may be any number of robots, servers, and networks, as desired for the implementation. Furthermore, when the execution subject of the position determination method of the robot provided by the embodiment of the present disclosure does not need to interact with other electronic devices, the system architecture 100 described above may only include the execution subject of the position determination method of the robot, and not include other electronic devices and networks in addition thereto. For example, the system architecture 100 described above may include only robots.
Fig. 2 shows a flowchart 200 of a position determination method for a robot according to an embodiment of the present disclosure. The robot includes an object sensor including at least one of an odometer sensor and an optical flow sensor. The robot position determining method comprises the following steps:
step 201, obtaining the current position information of the robot and the reading information of the target sensor.
In this embodiment, an executing body of the position determining method of the robot (for example, the robot shown in fig. 1) may acquire current position information of the robot and reading information of the target sensor.
Wherein the current position information may characterize the position of the robot.
Further, in the case where the target sensor includes only the odometer sensor, the reading information of the target sensor may be the reading information of the odometer sensor; in the case where the target sensor includes only an optical flow sensor, the reading information of the target sensor may be the reading information of the optical flow sensor; in the case where the target sensor includes an odometer sensor and an optical-flow sensor, the reading information of the target sensor may include reading information of the odometer sensor and reading information of the optical-flow sensor. The reading information may be a reading of the target sensor or data calculated based on the reading of the target sensor, for example, the reading information may be an integral of the reading of the target sensor at a plurality of time points.
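The remark that the reading information may be an integral of the sensor readings over a plurality of time points can be illustrated with a short sketch. The following Python snippet is a minimal, hypothetical example of accumulating planar odometer increments into a single relative-motion estimate between two poses; the function and variable names are illustrative and not part of the disclosure.

```python
import math

def integrate_odometry(increments):
    """Accumulate per-step odometer increments (dx, dy, dtheta), each expressed
    in the robot frame at that step, into one relative motion (x, y, theta)
    between pose x_i and pose x_{i+1}. This is a simplified planar
    dead-reckoning sketch, not the patented method."""
    x = y = theta = 0.0
    for dx, dy, dtheta in increments:
        # Rotate each step into the frame of the starting pose, then accumulate.
        x += dx * math.cos(theta) - dy * math.sin(theta)
        y += dx * math.sin(theta) + dy * math.cos(theta)
        theta += dtheta
    return (x, y, theta)

# Example: three small forward steps with a slight left turn.
u_oi = integrate_odometry([(0.10, 0.0, 0.02), (0.10, 0.0, 0.02), (0.10, 0.0, 0.02)])
```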
Step 202, constructing an objective function based on the current position information and the reading information.
In this embodiment, the executing body may construct an objective function based on the current position information and the reading information.
The objective function includes a switching variable for controlling a magnitude of a function value of a relation function representing a correspondence between a current position of the robot and a next position of the robot calculated based on reading information of the objective sensor.
Here, the current position of the robot and the next position of the robot are relative terms. For example, if the position where the robot is located at the current time is taken as the current position of the robot, the next position of the robot may be the position where the robot is located after a preset time length from the current time; if the position where the robot was located a preset time length before the current time is taken as the current position of the robot, the next position of the robot may be the position where the robot is located at the current time; if the position where the robot will be located a preset time length after the current time is taken as the current position of the robot, the next position of the robot may be the position where the robot will be located two preset time lengths after the current time, and so on. That is, in the embodiments of the present disclosure, the current position of the robot is not limited to the position where the robot is located at the current time.
In some optional implementation manners of this embodiment, the executing main body may execute the step 202 in the following manner: and constructing an objective function based on the current position information, the reading information and the uncertainty of the reading information.
As an example, the objective function h(x*, s*) can be expressed as formula (1). [Formula (1): image not reproduced]
In formula (1), x* includes x_i and x_{i+1}, and s* includes the switch variable. x_i characterizes the current position of the robot. x_{i+1} characterizes the next position information corresponding to the current position information of the robot. i is used to identify the time of day. sig() characterizes a sigmoid function, which is used to change a 0/1 discrete variable into a continuous variable. f1() characterizes the relation function and may be a non-linear function. u_oi characterizes the integral of the readings of the target sensor at a plurality of times (the current time i and before). Σ_oi characterizes the uncertainty of the readings of the target sensor. r_ij represents prior data with an initial value of 1.
As yet another example, the objective function h(x*, s*) can also be expressed as formula (2). [Formula (2): image not reproduced]
In formula (2), x* includes x_i and x_{i+1}, and s* includes the switch variable. x_i characterizes the current position of the robot. x_{i+1} characterizes the next position information corresponding to the current position information of the robot. i and j are used to identify the time of day. sig() characterizes a sigmoid function, which is used to change a 0/1 discrete variable into a continuous variable. f1() characterizes the relation function and may be a non-linear function. u_oi characterizes the integral of the readings of the target sensor at a plurality of times (the current time i and before). Σ_oi characterizes the uncertainty of the readings of the target sensor, and formula (2) further involves the uncertainty of the switch variable. r_ij represents prior data with an initial value of 1.
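Because the images of formulas (1) and (2) are not reproduced above, the following LaTeX expression is only a plausible reconstruction of such a single-target-sensor objective. It assumes the common switchable-constraint form in which the sigmoid of the switch variable scales the Mahalanobis residual of the relation function and the prior data r_ij anchors the switch variable; the switch-variable symbol s_oi and the switch uncertainty Λ_oi are notation introduced here, and the exact expression in the original filing may differ.

```latex
h(x^{*}, s^{*}) \;=\; \sum_{i} \operatorname{sig}(s_{oi})\,
\bigl\| f_{1}(x_{i}, u_{oi}) - x_{i+1} \bigr\|^{2}_{\Sigma_{oi}}
\;+\; \sum_{i} \bigl\| r_{ij} - \operatorname{sig}(s_{oi}) \bigr\|^{2}_{\Lambda_{oi}},
\qquad \|e\|^{2}_{\Sigma} := e^{\top} \Sigma^{-1} e .
```

In this reading, an erroneous target-sensor reading can be absorbed by driving sig(s_oi) toward 0, which limits the contribution of the relation-function term exactly as described for the switch variable, at the cost of the prior term, which corresponds to the switch-variable uncertainty mentioned for formula (2).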
In some optional implementations of this embodiment, the target sensor includes an odometer sensor and an optical flow sensor. On this basis, the execution main body may execute the step 202 in the following manner: and constructing an objective function based on the current position information, the reading information of the odometer sensor, the reading information of the optical flow sensor, the first switch variable and the second switch variable.
The first switching variable controls the magnitude of a function value of a first relational function representing the correspondence between the current position of the robot and the next position of the robot calculated based on the read information of the odometer sensor, and the second switching variable controls the magnitude of a function value of a second relational function representing the correspondence between the current position of the robot and the next position of the robot calculated based on the read information of the optical flow sensor.
As an example, the objective function h(x*, s*) can be expressed as formula (3). [Formula (3): image not reproduced]
In formula (3), x* includes x_i and x_{i+1}, and s* includes the switch variables, namely the first switch variable and the second switch variable. x_i characterizes the current position of the robot. x_{i+1} characterizes the next position information corresponding to the current position information of the robot. i and j are used to identify the time of day. sig() characterizes a sigmoid function, which is used to change a 0/1 discrete variable into a continuous variable. The relation functions include f1() and f2(), each of which may be a non-linear function. f1() is the first relation function, which represents the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the odometer sensor. f2() is the second relation function, which represents the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the optical flow sensor. u_oi and u_fi respectively characterize the integral of the readings of the odometer sensor at a plurality of times (the current time i and before) and the integral of the readings of the optical flow sensor at a plurality of times. Σ_oi and Σ_fi respectively characterize the uncertainty of the readings of the odometer sensor and the uncertainty of the readings of the optical flow sensor, and formula (3) further involves the uncertainty of the first switch variable and the uncertainty of the second switch variable.
In some application scenarios in the above alternative implementation manner, the objective function further includes a constraint term of the first switch variable and a constraint term of the second switch variable.
As an example, the objective function h(x*, s*) can be expressed as formula (4). [Formula (4): image not reproduced]
In formula (4), x* includes x_i and x_{i+1}, and s* includes the first switch variable and the second switch variable. x_i characterizes the current position of the robot. x_{i+1} characterizes the next position information corresponding to the current position information of the robot. i and j are used to identify the time of day. sig() characterizes a sigmoid function, which is used to change a 0/1 discrete variable into a continuous variable. The relation functions include f1() and f2(), each of which may be a non-linear function. f1() is the first relation function, which represents the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the odometer sensor. f2() is the second relation function, which represents the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the optical flow sensor. u_oi and u_fi respectively characterize the integral of the readings of the odometer sensor at a plurality of times (the current time i and before) and the integral of the readings of the optical flow sensor at a plurality of times. Σ_oi and Σ_fi respectively characterize the uncertainty of the readings of the odometer sensor and the uncertainty of the readings of the optical flow sensor, and formula (4) further involves the uncertainty of the first switch variable and the uncertainty of the second switch variable. Formula (4) also includes the constraint term of the first switch variable and the constraint term of the second switch variable. r_ij represents prior data with an initial value of 1.
As yet another example, the objective function h(x*, s*) can also be expressed as formula (5). [Formula (5): image not reproduced]
In formula (5), x* includes x_i, x_{i+1} and x_j, and s* includes the first switch variable, the second switch variable and the third switch variable. x_i characterizes the current position of the robot. x_{i+1} characterizes the next position information corresponding to the current position information of the robot. x_j characterizes a position close to x_i, i.e. the loop position. i and j are used to identify the time of day. sig() characterizes a sigmoid function, which is used to change a 0/1 discrete variable into a continuous variable. The relation functions include f1(), f2() and f3(), each of which may be a non-linear function. f1() is the first relation function, which represents the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the odometer sensor. f2() is the second relation function, which represents the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the optical flow sensor. f3() is the third relation function, which represents the correspondence between the current position of the robot and the loop position of the robot calculated based on the reading information of the loop detection sensor. u_oi, u_fi and u_ij respectively characterize the integral of the readings of the odometer sensor at a plurality of times (the current time i and before), the integral of the readings of the optical flow sensor at a plurality of times, and the integral of the readings of the loop detection sensor at a plurality of times. Σ_oi, Σ_fi and Σ_cij respectively characterize the uncertainty of the readings of the odometer sensor, the uncertainty of the readings of the optical flow sensor and the uncertainty of the readings of the loop detection sensor, and formula (5) further involves the uncertainty of the first switch variable, the uncertainty of the second switch variable and the uncertainty of the third switch variable. r_ij represents prior data with an initial value of 1.
In some optional implementation manners of this embodiment, on the basis that the robot further includes a loop detection sensor, the executing body may execute the step 202 in the following manner: and constructing an objective function based on the current position information, the reading information and the reading information of the loop detection sensor.
The objective function further includes a third switching variable for controlling a magnitude of a function value of a third relation function representing a correspondence between the current position of the robot and a loop position of the robot calculated based on the reading information of the loop detection sensor.
As an example, the objective function h(x*, s*) can be expressed as formula (6). [Formula (6): image not reproduced]
In formula (6), x* includes x_i, x_{i+1} and x_j, and s* includes the first switch variable, the second switch variable and the third switch variable. x_i characterizes the current position of the robot. x_{i+1} characterizes the next position information corresponding to the current position information of the robot. x_j characterizes a position close to x_i, i.e. the loop position. i and j are used to identify the time of day. sig() characterizes a sigmoid function, which is used to change a 0/1 discrete variable into a continuous variable. The relation functions include f1(), f2() and f3(), each of which may be a non-linear function. f1() is the first relation function, which represents the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the odometer sensor. f2() is the second relation function, which represents the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the optical flow sensor. f3() is the third relation function, which represents the correspondence between the current position of the robot and the loop position of the robot calculated based on the reading information of the loop detection sensor. u_oi, u_fi and u_ij respectively characterize the integral of the readings of the odometer sensor at a plurality of times (the current time i and before), the integral of the readings of the optical flow sensor at a plurality of times, and the integral of the readings of the loop detection sensor at a plurality of times. Σ_oi, Σ_fi and Σ_cij respectively characterize the uncertainty of the readings of the odometer sensor, the uncertainty of the readings of the optical flow sensor and the uncertainty of the readings of the loop detection sensor.
In some application scenarios in the above alternative implementation manner, the objective function further includes a constraint term of the third switching variable.
As an example, the objective function h(x*, s*) can be expressed as formula (7). [Formula (7): image not reproduced]
In formula (7), x* includes x_i, x_{i+1} and x_j, and s* includes the first switch variable, the second switch variable and the third switch variable. x_i characterizes the current position of the robot. x_{i+1} characterizes the next position information corresponding to the current position information of the robot. x_j characterizes a position close to x_i, i.e. the loop position. i and j are used to identify the time of day. sig() characterizes a sigmoid function, which is used to change a 0/1 discrete variable into a continuous variable. The relation functions include f1(), f2() and f3(), each of which may be a non-linear function. f1() is the first relation function, which represents the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the odometer sensor. f2() is the second relation function, which represents the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the optical flow sensor. f3() is the third relation function, which represents the correspondence between the current position of the robot and the loop position of the robot calculated based on the reading information of the loop detection sensor. u_oi, u_fi and u_ij respectively characterize the integral of the readings of the odometer sensor at a plurality of times (the current time i and before), the integral of the readings of the optical flow sensor at a plurality of times, and the integral of the readings of the loop detection sensor at a plurality of times. Σ_oi, Σ_fi and Σ_cij respectively characterize the uncertainty of the readings of the odometer sensor, the uncertainty of the readings of the optical flow sensor and the uncertainty of the readings of the loop detection sensor, and formula (7) further involves the uncertainty of the third switch variable as well as the constraint term of the third switch variable. r_ij represents prior data with an initial value of 1.
As yet another example, the objective function h(x*, s*) can also be expressed as formula (8). [Formula (8): image not reproduced]
In formula (8), x* includes x_i, x_{i+1} and x_j, and s* includes the first switch variable, the second switch variable and the third switch variable. x_i characterizes the current position of the robot. x_{i+1} characterizes the next position information corresponding to the current position information of the robot. x_j characterizes a position close to x_i, i.e. the loop position. i and j are used to identify the time of day. sig() characterizes a sigmoid function, which is used to change a 0/1 discrete variable into a continuous variable. The relation functions include f1(), f2() and f3(), each of which may be a non-linear function. f1() is the first relation function, which represents the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the odometer sensor. f2() is the second relation function, which represents the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the optical flow sensor. f3() is the third relation function, which represents the correspondence between the current position of the robot and the loop position of the robot calculated based on the reading information of the loop detection sensor. u_oi, u_fi and u_ij respectively characterize the integral of the readings of the odometer sensor at a plurality of times (the current time i and before), the integral of the readings of the optical flow sensor at a plurality of times, and the integral of the readings of the loop detection sensor at a plurality of times. Σ_oi, Σ_fi and Σ_cij respectively characterize the uncertainty of the readings of the odometer sensor, the uncertainty of the readings of the optical flow sensor and the uncertainty of the readings of the loop detection sensor, and formula (8) further involves the uncertainty of the first switch variable, the uncertainty of the second switch variable and the uncertainty of the third switch variable. Formula (8) also includes the constraint term of the first switch variable, the constraint term of the second switch variable and the constraint term of the third switch variable. r_ij represents prior data with an initial value of 1.
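A plausible LaTeX reconstruction of this fullest objective, again an assumption based on the stated structure (sigmoid-scaled Mahalanobis residuals plus switch-variable constraint terms) rather than the original image, is given below; the symbols s^o_i, s^f_i, s^c_ij for the three switch variables and Λ_oi, Λ_fi, Λ_cij for their uncertainties are notation introduced here.

```latex
\begin{aligned}
h(x^{*}, s^{*}) ={}& \sum_{i}\Bigl[
\operatorname{sig}(s^{o}_{i})\,\bigl\| f_{1}(x_{i}, u_{oi}) - x_{i+1} \bigr\|^{2}_{\Sigma_{oi}}
+ \operatorname{sig}(s^{f}_{i})\,\bigl\| f_{2}(x_{i}, u_{fi}) - x_{i+1} \bigr\|^{2}_{\Sigma_{fi}} \Bigr] \\
&+ \sum_{i,j}\operatorname{sig}(s^{c}_{ij})\,\bigl\| f_{3}(x_{i}, u_{ij}) - x_{j} \bigr\|^{2}_{\Sigma_{cij}} \\
&+ \sum_{i}\Bigl[\bigl\| r_{ij} - \operatorname{sig}(s^{o}_{i}) \bigr\|^{2}_{\Lambda_{oi}}
+ \bigl\| r_{ij} - \operatorname{sig}(s^{f}_{i}) \bigr\|^{2}_{\Lambda_{fi}}\Bigr]
+ \sum_{i,j}\bigl\| r_{ij} - \operatorname{sig}(s^{c}_{ij}) \bigr\|^{2}_{\Lambda_{cij}} .
\end{aligned}
```

The last three sums would then correspond to the constraint terms of the first, second and third switch variables described above.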
Here, the switching variables (including the first switching variable, the second switching variable, and the third switching variable) may be implemented by using functions such as tanh, in addition to the Sigmoid function. Further, in the objective function, in addition to the difference between the relationship function and the next position information or the loop back position information, a quotient between the relationship function and the next position information or the loop back position information, or the like may be calculated. Embodiments of the present disclosure are not limited in this regard.
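The remark that the switch variable can be realized with a sigmoid or a tanh-type function can be made concrete with a small sketch; the snippet below is illustrative only and the function names are not from the disclosure.

```python
import math

def sig(s):
    """Sigmoid: maps a real-valued switch variable s to a weight in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-s))

def tanh_gate(s):
    """A tanh-based alternative, rescaled so the gate also lies in (0, 1)."""
    return 0.5 * (math.tanh(s) + 1.0)

# A large negative s drives the gate toward 0, suppressing the corresponding
# relation-function term; a large positive s leaves it almost fully active.
weights = [sig(-5.0), sig(0.0), sig(5.0)]
```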
Step 203, determining the next position information corresponding to the obtained current position information based on the objective function.
In this embodiment, the executing entity may determine, based on the objective function, next location information corresponding to the obtained current location information.
In some optional implementations of this embodiment, the executing body may execute step 203 as follows: calculating the next position information corresponding to the current position information with the goal of minimizing the objective function.
It can be understood that, in this alternative implementation, calculating the next position information corresponding to the current position information with the goal of minimizing the objective function can further improve the accuracy of determining the position of the robot.
Optionally, a least squares method may be used to calculate the next position information corresponding to the current position information from the objective function.
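A minimal numerical sketch of this optional implementation is given below. It jointly optimizes one next pose and one switch variable for a single odometer-style constraint using scipy.optimize.least_squares; the residual layout follows the hedged reconstruction given earlier and is an assumption, not the exact objective of the disclosure, and all names and values are illustrative.

```python
import numpy as np
from scipy.optimize import least_squares

def sig(s):
    return 1.0 / (1.0 + np.exp(-s))

x_i = np.array([0.0, 0.0])        # current position (known)
u_oi = np.array([1.0, 0.0])       # integrated odometer reading (assumed planar translation)
sigma_oi = np.diag([0.05, 0.05])  # uncertainty of the odometer reading (assumed)
lam_oi = np.array([[1.0]])        # uncertainty of the switch variable (assumed)
r_prior = 1.0                     # prior data r_ij, initial value 1

L_o = np.linalg.cholesky(np.linalg.inv(sigma_oi))  # whitening for the Mahalanobis norm
L_s = np.linalg.cholesky(np.linalg.inv(lam_oi))

def f1(x, u):
    # Simplified relation function: next position predicted by adding the integrated reading.
    return x + u

def residuals(theta):
    x_next = theta[:2]            # candidate next position x_{i+1}
    s = theta[2]                  # switch variable
    r_odo = L_o @ (f1(x_i, u_oi) - x_next) * sig(s)    # switch-scaled relation residual
    r_switch = L_s @ np.array([r_prior - sig(s)])       # constraint term of the switch variable
    return np.concatenate([r_odo, r_switch])

sol = least_squares(residuals, np.array([0.0, 0.0, 2.0]))
x_next_est, s_est = sol.x[:2], sol.x[2]
```

With a consistent reading, the optimizer keeps sig(s) close to 1 and places x_{i+1} at the odometer prediction; with a grossly wrong reading, the switch term allows sig(s) to drop and the prediction to be down-weighted.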
With continued reference to fig. 3, fig. 3 is a schematic diagram of an application scenario of the position determination method of the robot according to the present embodiment. In fig. 3, the robot 310 includes an object sensor including at least one of an odometer sensor and an optical flow sensor. The robot 310 first acquires its own current position information 301 and the reading information 302 of the above-described target sensor. Then, based on the current position information 301 and the reading information 302, an objective function 303 is constructed. The objective function 303 includes a switching variable for controlling a magnitude of a function value of a relation function representing a correspondence between a current position of the robot and a next position of the robot calculated based on reading information of the objective sensor. Finally, the robot 310 determines the next position information 304 corresponding to the acquired current position information 301 based on the objective function 303.
In the position determining method of the robot according to the above embodiment of the present disclosure, the robot includes an object sensor, and the object sensor includes at least one of an odometer sensor and an optical flow sensor. Accordingly, current position information of the robot and reading information of the target sensor are acquired, then, an objective function is constructed based on the current position information and the reading information, wherein the objective function includes a switching variable for controlling a magnitude of a function value of a relationship function representing a correspondence between the current position of the robot and a next position of the robot calculated based on the reading information of the target sensor, and finally, next position information corresponding to the acquired current position information is determined based on the objective function. The embodiment of the disclosure controls the magnitude of the function value of the relation function through the switch variable, so that the magnitude of the function value of the relation function can be limited when at least one of the odometer sensor and the optical flow sensor provides wrong reading information, thereby reducing the influence degree of the sensor providing wrong information on the position determination, and further improving the accuracy of the position determination of the robot.
With further reference to fig. 4, fig. 4 shows a flow of yet another embodiment of a method of position determination of a robot. The robot includes an object sensor including at least one of an odometer sensor and an optical flow sensor. The process of the robot position determining method comprises the following steps:
step 401, obtaining the current position information of the robot and the reading information of the target sensor.
Step 402, constructing an objective function based on the current position information, the reading information of the odometer sensor, the reading information of the optical flow sensor, the first switch variable and the second switch variable.
The first switching variable controls a magnitude of a function value of a first relational function representing a correspondence between a current position of the robot and a next position of the robot calculated based on reading information of the odometer sensor, and the second switching variable controls a magnitude of a function value of a second relational function representing a correspondence between a current position of the robot and a next position of the robot calculated based on reading information of the optical flow sensor.
Step 403, determining next position information corresponding to the obtained current position information based on the objective function.
Specifically, for the multi-loop problem, there may be multiple different paths from position a to position b. If this is represented as a graph in which nodes represent positions and edges represent paths between positions, then one node may be reached from another node via multiple paths.
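As a purely explanatory illustration of this multi-loop situation (node names made up here), a small adjacency-list graph in Python shows two different paths between the same pair of positions:

```python
# Nodes are positions, edges are traversable paths between positions.
graph = {
    "a": ["b", "c"],
    "b": ["d"],
    "c": ["d"],
    "d": [],
}
# Position "d" can be reached from "a" either via "b" or via "c",
# i.e. the pose graph contains more than one path (loop) between a and d.
paths_a_to_d = [["a", "b", "d"], ["a", "c", "d"]]
```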
Here, the specific implementation manner of steps 401 to 403 may refer to the related description of fig. 2, and is not described herein again.
In application scenarios such as position prediction and path planning of a robot, a pose graph optimization problem is usually solved by using an odometer sensor, an optical flow sensor and a loop detection sensor provided in the robot. However, if the reading information of the odometer sensor, the optical flow sensor or the loop detection sensor is wrong, the optimization of the whole pose graph is seriously affected.
Further, as an example, after acquiring the current position information of the robot and the reading information of the target sensor, the following formula (9) may be adopted to solve for x* and s*. [Formula (9): image not reproduced]
In formula (9), x* includes x_i, x_{i+1} and x_j, and s* includes the first switch variable, the second switch variable and the third switch variable. x_i characterizes the current position of the robot. x_{i+1} characterizes the next position information corresponding to the current position information of the robot. x_j characterizes a position close to x_i, i.e. the loop position. i and j are used to identify the time of day. sig() characterizes a sigmoid function, which is used to change a 0/1 discrete variable into a continuous variable. The relation functions include f1(), f2() and f3(), each of which may be a non-linear function. f1() is the first relation function, which represents the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the odometer sensor. f2() is the second relation function, which represents the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the optical flow sensor. f3() is the third relation function, which represents the correspondence between the current position of the robot and the loop position of the robot calculated based on the reading information of the loop detection sensor. u_oi, u_fi and u_ij respectively characterize the integral of the readings of the odometer sensor at a plurality of times (the current time i and before), the integral of the readings of the optical flow sensor at a plurality of times, and the integral of the readings of the loop detection sensor at a plurality of times. Σ_oi, Σ_fi and Σ_cij respectively characterize the uncertainty of the readings of the odometer sensor, the uncertainty of the readings of the optical flow sensor and the uncertainty of the readings of the loop detection sensor, and formula (9) further involves the uncertainty of the first switch variable, the uncertainty of the second switch variable and the uncertainty of the third switch variable.
The above formula (9) also satisfies the constraints of formulas (10) to (15):
x_{i+1} = f1(x_i, u_oi) + w_oi    formula (10)
x_{i+1} = f2(x_i, u_fi) + w_fi    formula (11)
x_j = f3(x_i, u_ij) + w_ij    formula (12)
[Formulas (13) to (15): images not reproduced]
In formulas (9) to (15), w_oi, w_fi and w_ij each obey a zero-mean Gaussian distribution with variance Σ_oi, Σ_fi and Σ_cij, respectively, and r_ij represents prior data with an initial value of 1.
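Formulas (10) to (12) together with the stated zero-mean Gaussian noise admit the usual probabilistic reading of such objectives; the following derivation sketch is standard background and not text from the original filing: maximizing the likelihood of independent Gaussian noise terms is equivalent to minimizing the sum of Mahalanobis-weighted residuals, which the switch variables can then down-weight term by term.

```latex
p(w_{k}) \propto \exp\!\Bigl(-\tfrac{1}{2}\, w_{k}^{\top}\Sigma_{k}^{-1} w_{k}\Bigr)
\quad\Longrightarrow\quad
\arg\max_{x^{*}} \prod_{k} p(w_{k})
\;=\;
\arg\min_{x^{*}} \sum_{k} \bigl\| f_{k}(x_{i}, u_{k}) - x_{(\cdot)} \bigr\|^{2}_{\Sigma_{k}},
```

where w_k is the noise of constraint (10), (11) or (12) and x_(.) denotes the corresponding next position or loop position; a switch variable driven toward 0 removes the term whose reading is judged erroneous from this sum.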
This embodiment provides a more general way of handling erroneous sensor readings: the three loops of the pose graph act simultaneously, so that if the readings of the odometer sensor and the optical flow sensor within a loop interval are erroneous, the map data after the loop closure is not affected and the visual correction still takes effect, which ensures the accuracy of map optimization. In addition, the robustness is high, and map disorder caused by wheel slippage of the robot or loop detection errors can be avoided.
With further reference to fig. 5, as an implementation of the methods shown in the above figures, the present disclosure provides an embodiment of a position determining apparatus of a robot, the apparatus embodiment corresponds to the above method embodiment, and the apparatus embodiment may further include the same or corresponding features as the above method embodiment and produce the same or corresponding effects as the above method embodiment, in addition to the features described below. The device can be applied to various electronic equipment.
As shown in fig. 5, the robot in the position determining apparatus 500 of the robot of the present embodiment includes an object sensor including at least one of an odometer sensor and an optical flow sensor. The above apparatus 500 includes: an acquisition unit 501, a construction unit 502 and a determination unit 503. The acquiring unit 501 is configured to acquire current position information of the robot and reading information of the target sensor; a construction unit 502 configured to construct an objective function based on the current position information and the reading information, wherein the objective function includes a switching variable for controlling a magnitude of a function value of a relation function representing a correspondence between the current position of the robot and a next position of the robot calculated based on the reading information of the target sensor; a determining unit 503 configured to determine next position information corresponding to the acquired current position information based on the objective function.
In this embodiment, the acquiring unit 501 of the robot position determining apparatus 500 may acquire the current position information of the robot and the reading information of the target sensor.
In this embodiment, the construction unit 502 may construct an objective function based on the current position information and the reading information, wherein the objective function includes a switching variable for controlling a magnitude of a function value of a relationship function representing a correspondence between the current position of the robot and a next position of the robot calculated based on the reading information of the target sensor.
In this embodiment, the determining unit 503 may determine the next position information corresponding to the acquired current position information based on the objective function.
In some optional implementations of this embodiment, the constructing unit 502 includes:
a first constructing subunit (not shown in the figure) configured to construct an objective function based on the current position information, the reading information, and an uncertainty of the reading information.
In some optional implementations of this embodiment, the target sensor includes an odometer sensor and an optical flow sensor; and
the above-described construction unit 502 includes:
a second construction subunit (not shown in the figure) configured to construct an objective function based on the current position information, the reading information of the odometer sensor, the reading information of the optical flow sensor, the first switching variable, and the second switching variable;
the first switching variable controls the magnitude of a function value of a first relational function representing the correspondence between the current position of the robot and the next position of the robot calculated based on the read information of the odometer sensor, and the second switching variable controls the magnitude of a function value of a second relational function representing the correspondence between the current position of the robot and the next position of the robot calculated based on the read information of the optical flow sensor.
In some optional implementations of this embodiment, the objective function further includes a constraint term of the first switching variable and a constraint term of the second switching variable.
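A minimal sketch of an objective combining both sensors, under the same simplified motion model as above, might look as follows. The quadratic form of the constraint terms and the penalty weight are assumptions made for illustration, not a statement of the embodiment's exact formulation.

```python
import numpy as np

def two_sensor_objective(x_curr, x_next, odom_reading, flow_reading,
                         s_odom, s_flow, penalty=1.0):
    # First relation function: next pose predicted from the odometer reading.
    r_odom = (x_curr + odom_reading) - x_next
    # Second relation function: next pose predicted from the optical-flow reading.
    r_flow = (x_curr + flow_reading) - x_next
    s_o = float(np.clip(s_odom, 0.0, 1.0))
    s_f = float(np.clip(s_flow, 0.0, 1.0))
    data = s_o * float(r_odom @ r_odom) + s_f * float(r_flow @ r_flow)
    # Constraint terms pull each switch back toward 1 (fully trusted), so a
    # switch is only turned down when discounting its reading pays off overall.
    constraint = penalty * ((1.0 - s_o) ** 2 + (1.0 - s_f) ** 2)
    return data + constraint
```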
In some optional implementations of this embodiment, the robot further includes a loop detection sensor; and
the above-mentioned construction unit 502 includes:
a third constructing subunit (not shown in the figure) configured to construct an objective function based on the current position information, the reading information, and the reading information of the loop detection sensor;
the objective function further includes a third switching variable for controlling a magnitude of a function value of a third relation function representing a correspondence between the current position of the robot and a loop position of the robot calculated based on the reading information of the loop detection sensor.
In some optional implementations of this embodiment, the objective function further includes a constraint term of the third switching variable.
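Analogously, a hypothetical loop-closure term with its own switching variable and constraint term could be sketched as below; the additive form of the loop_reading increment and the quadratic penalty are again illustrative assumptions rather than the embodiment's prescribed formulation.

```python
import numpy as np

def loop_closure_term(x_curr, x_loop, loop_reading, s_loop, penalty=1.0):
    # Third relation function: ties the current pose to a previously visited
    # (loop) pose through the loop-detection measurement.
    r = (x_curr + loop_reading) - x_loop
    s = float(np.clip(s_loop, 0.0, 1.0))
    # The third switching variable plus its constraint term lets the optimizer
    # discount a false loop closure instead of distorting the whole map.
    return s * float(r @ r) + penalty * (1.0 - s) ** 2
```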
In some optional implementations of this embodiment, the determining unit 503 includes:
a calculating subunit (not shown in the figure) configured to calculate the next position information corresponding to the current position information by taking the minimization of the objective function as the optimization target.
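Purely as an illustrative sketch of such a minimization (not the embodiment's prescribed solver), the next pose and the switch variables can be optimized jointly with a generic optimizer such as scipy.optimize.minimize. The readings and the simplified objective below are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical planar readings; poses are (x, y, theta) arrays.
x_curr = np.array([0.0, 0.0, 0.0])
odom = np.array([1.0, 0.00, 0.0])     # odometer increment (assumed)
flow = np.array([1.1, 0.05, 0.0])     # optical-flow increment (assumed)

def objective(params):
    # params = [next pose (3 values), odometer switch, optical-flow switch]
    x_next, s_o, s_f = params[:3], params[3], params[4]
    r_o = (x_curr + odom) - x_next
    r_f = (x_curr + flow) - x_next
    data = s_o * (r_o @ r_o) + s_f * (r_f @ r_f)
    prior = (1.0 - s_o) ** 2 + (1.0 - s_f) ** 2   # switch constraint terms
    return data + prior

# Jointly minimize over the next pose and both switch variables; the pose block
# of the minimizer is taken as the next position estimate.
res = minimize(objective,
               x0=np.concatenate([x_curr + odom, [1.0, 1.0]]),
               bounds=[(None, None)] * 3 + [(0.0, 1.0)] * 2)
x_next_estimate = res.x[:3]
```

In practice a pose-graph back end (for example a Gauss-Newton or Levenberg-Marquardt solver over many poses at once) would typically be used; this single-step example only illustrates the role of the switch variables in the minimization.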
In the apparatus 500 provided by the above-described embodiment of the present disclosure, the robot includes a target sensor including at least one of an odometer sensor and an optical flow sensor. The acquiring unit 501 acquires current position information of the robot and reading information of the target sensor; the construction unit 502 constructs an objective function based on the current position information and the reading information, wherein the objective function includes a switching variable for controlling the magnitude of a function value of a relation function representing the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the target sensor; and the determining unit 503 determines the next position information corresponding to the acquired current position information based on the objective function. In this way, because the switching variable controls the magnitude of the function value of the relation function, when at least one of the odometer sensor and the optical flow sensor provides erroneous information, the corresponding function value can be limited, the influence of the faulty sensor on position determination is reduced, and the accuracy of the robot's position determination is improved.
Fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure, where the electronic device 600 shown in fig. 6 includes: at least one processor 601, memory 602, and at least one network interface 604 and other user interfaces 603. The various components in the electronic device 600 are coupled together by a bus system 605. It is understood that the bus system 605 is used to enable communications among the components. The bus system 605 includes a power bus, a control bus, and a status signal bus in addition to a data bus. For clarity of illustration, however, the various buses are labeled as bus system 605 in fig. 6.
The user interface 603 may include, among other things, a display, a keyboard, or a pointing device (e.g., a mouse, trackball, touch pad, or touch screen, etc.).
It will be appreciated that the memory 602 in embodiments of the disclosure may be volatile memory or non-volatile memory, or may include both volatile and non-volatile memory. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be a Random Access Memory (RAM), which is used as an external cache. By way of illustration and not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and Direct Rambus RAM (DRRAM). The memory 602 described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
In some embodiments, memory 602 stores the following elements, executable units or data structures, or a subset thereof, or an expanded set thereof: an operating system 6021 and application programs 6022.
The operating system 6021 includes various system programs, such as a framework layer, a core library layer, a driver layer, and the like, for implementing various basic services and processing hardware-based tasks. The application programs 6022 include various applications, such as a media player and a browser, for implementing various application services. A program that implements the method of the embodiments of the present disclosure may be included in the application programs 6022.
In the embodiment of the present disclosure, by calling a program or an instruction stored in the memory 602, specifically, a program or an instruction stored in the application program 6022, the processor 601 is configured to execute the method steps provided by the method embodiments, for example, including: acquiring current position information of the robot and reading information of the target sensor; constructing an objective function based on the current position information and the reading information, wherein the objective function comprises a switch variable for controlling the magnitude of a function value of a relation function representing the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the target sensor; and determining next position information corresponding to the obtained current position information based on the objective function.
The method disclosed by the embodiments of the present disclosure may be applied to the processor 601 or implemented by the processor 601. The processor 601 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be completed by integrated logic circuits of hardware in the processor 601 or by instructions in the form of software. The processor 601 may be a general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present disclosure. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present disclosure may be directly embodied as being executed by a hardware decoding processor, or executed by a combination of hardware and software elements in a decoding processor. The software elements may be located in a RAM, a flash memory, a ROM, a PROM or an EPROM, registers, or another storage medium well known in the art. The storage medium is located in the memory 602, and the processor 601 reads the information in the memory 602 and completes the steps of the above method in combination with its hardware.
It is to be understood that the embodiments described herein may be implemented in hardware, software, firmware, middleware, microcode, or any combination thereof. For a hardware implementation, the Processing units may be implemented within one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), general purpose processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions described herein, or a combination thereof.
For a software implementation, the techniques described herein may be implemented by means of units performing the functions described herein. The software codes may be stored in a memory and executed by a processor. The memory may be implemented within the processor or external to the processor.
The electronic device provided in this embodiment may be the electronic device shown in fig. 6, and may execute all steps of the method for determining the position of the robot shown in fig. 2, so as to achieve the technical effect of the method for determining the position of the robot shown in fig. 2.
The embodiments of the present disclosure also provide a storage medium (a computer-readable storage medium). The storage medium stores one or more programs. The storage medium may include volatile memory, such as a random access memory; it may also include non-volatile memory, such as a read-only memory, a flash memory, a hard disk, or a solid-state disk; and it may also include a combination of the above kinds of memory.
When the one or more programs in the storage medium are executed by one or more processors, the method for determining the position of the robot performed on the electronic device side is implemented.
The processor is configured to execute the communication program stored in the memory to implement the following steps of the method for determining a position of a robot, performed on the electronic device side: acquiring current position information of the robot and reading information of the target sensor; constructing an objective function based on the current position information and the reading information, wherein the objective function comprises a switch variable for controlling the magnitude of a function value of a relation function representing the correspondence between the current position of the robot and the next position of the robot calculated based on the reading information of the target sensor; and determining next position information corresponding to the acquired current position information based on the objective function.
Those of skill would further appreciate that the various illustrative components and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or a combination of both. To clearly illustrate this interchangeability of hardware and software, the various illustrative components and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in a Random Access Memory (RAM), a flash memory, a Read Only Memory (ROM), an electrically programmable ROM, an electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The above specific embodiments further describe the objects, technical solutions and advantages of the present disclosure in detail. It should be understood that the above are merely specific embodiments of the present disclosure and are not intended to limit the scope of the present disclosure; any modifications, equivalent substitutions, improvements and the like made within the spirit and principle of the present disclosure shall be included in the scope of the present disclosure.

Claims (10)

1. A method of position determination for a robot, the robot including a target sensor, the target sensor including at least one of an odometer sensor and an optical flow sensor, the method comprising:
acquiring current position information of the robot and reading information of the target sensor;
constructing an objective function based on the current position information and the reading information, wherein the objective function comprises a switch variable used for controlling the magnitude of a function value of a relation function, and the relation function represents the corresponding relation between the current position of the robot and the next position of the robot calculated based on the reading information of the target sensor;
and determining next position information corresponding to the acquired current position information based on the objective function.
2. The method of claim 1, wherein constructing an objective function based on the current location information and the reading information comprises:
and constructing an objective function based on the current position information, the reading information and the uncertainty of the reading information.
3. The method of claim 1, wherein the target sensor includes an odometer sensor and an optical flow sensor; and
the constructing an objective function based on the current position information and the reading information comprises:
constructing an objective function based on the current position information, the reading information of the odometer sensor, the reading information of the optical flow sensor, the first switch variable and the second switch variable;
wherein the first switch variable is used to control a magnitude of a function value of a first relation function, the second switch variable is used to control a magnitude of a function value of a second relation function, the first relation function represents a correspondence between a current position of the robot and a next position of the robot calculated based on reading information of the odometer sensor, and the second relation function represents a correspondence between the current position of the robot and the next position of the robot calculated based on reading information of the optical flow sensor.
4. The method of claim 3, wherein the objective function further comprises a constraint term for the first switching variable and a constraint term for the second switching variable.
5. The method of claim 1, wherein the robot further comprises a loop detection sensor; and
constructing an objective function based on the current position information and the reading information, including:
constructing an objective function based on the current position information, the reading information and the reading information of the loop detection sensor;
wherein the objective function further includes a third switching variable for controlling a magnitude of a function value of a third relation function representing a correspondence between a current position of the robot and a loop position of the robot calculated based on the reading information of the loop detection sensor.
6. The method of claim 5, wherein the objective function further comprises a constraint term for the third switching variable.
7. The method according to any one of claims 1 to 6, wherein the determining the next location information corresponding to the obtained current location information based on the objective function comprises:
calculating the next position information corresponding to the current position information by taking the minimization of the objective function as the target.
8. A position determining apparatus of a robot, the robot including a target sensor including at least one of an odometer sensor and an optical flow sensor, the apparatus comprising:
an acquisition unit configured to acquire current position information of the robot and reading information of the target sensor;
a construction unit configured to construct an objective function based on the current position information and the reading information, wherein the objective function includes a switching variable for controlling a magnitude of a function value of a relationship function that characterizes a correspondence between a current position of the robot and a next position of the robot calculated based on the reading information of the target sensor;
a determination unit configured to determine next position information corresponding to the acquired current position information based on the objective function.
9. An electronic device, comprising:
a memory for storing a computer program;
a processor for executing a computer program stored in the memory, and when executed, implementing the method of any of the preceding claims 1-7.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the method of any one of the preceding claims 1 to 7.
CN202111392025.7A 2021-11-19 2021-11-19 Position determining method and device for robot, electronic equipment and storage medium Active CN114088085B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111392025.7A CN114088085B (en) 2021-11-19 2021-11-19 Position determining method and device for robot, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111392025.7A CN114088085B (en) 2021-11-19 2021-11-19 Position determining method and device for robot, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114088085A true CN114088085A (en) 2022-02-25
CN114088085B CN114088085B (en) 2023-06-23

Family

ID=80303069

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111392025.7A Active CN114088085B (en) 2021-11-19 2021-11-19 Position determining method and device for robot, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114088085B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104850615A (en) * 2015-05-14 2015-08-19 西安电子科技大学 G2o-based SLAM rear end optimization algorithm method
US20190114798A1 (en) * 2017-10-17 2019-04-18 AI Incorporated Methods for finding the perimeter of a place using observed coordinates
CN110167137A (en) * 2019-05-08 2019-08-23 安克创新科技股份有限公司 The determination method and device of target object
CN110986930A (en) * 2019-11-29 2020-04-10 北京三快在线科技有限公司 Equipment positioning method and device, electronic equipment and storage medium
CN111337018A (en) * 2020-05-21 2020-06-26 上海高仙自动化科技发展有限公司 Positioning method and device, intelligent robot and computer readable storage medium
CN112254741A (en) * 2020-09-09 2021-01-22 安克创新科技股份有限公司 Method for detecting abnormality of mileage sensor, self-moving robot, and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
M. Khan et al.: "On the Positioning of Sensors with Simultaneous Bearing and Range Measurement in Wireless Sensor Networks", IFAC-PapersOnLine, vol. 52, no. 24, pages 334-339 *
杨执钧; 刘刚; 黄蕾; 乔丹; 白雪; 钟韬: "Ranging for unstructured scene reconstruction based on visual-inertial sensing" (基于视觉惯性的非结构化场景重构测距), Application Research of Computers (计算机应用研究), no. 1, pages 2-3 *

Also Published As

Publication number Publication date
CN114088085B (en) 2023-06-23

Similar Documents

Publication Publication Date Title
Simon et al. Unified forms for Kalman and finite impulse response filtering and smoothing
CN112101530B (en) Neural network training method, device, equipment and storage medium
JP6853148B2 (en) Detection device, detection method and detection program
KR102237559B1 (en) Method for Generating Road Topology Information Based on Segment Modeling for High Definition Map
CN108332758A (en) A kind of corridor recognition method and device of mobile robot
JP2019003546A (en) Method for adjusting output level of neuron of multilayer neural network
JP2012518834A5 (en)
CN114462594A (en) Neural network training method and device, electronic equipment and storage medium
KR101742119B1 (en) Apparatus and Method for Hybrid System Modeling and Simulation assembling a Discrete Event System Model and Continuous Time System Model
CN115900742A (en) Reference trajectory planning method and device for vehicle, electronic equipment and storage medium
KR102045415B1 (en) Method FOR DETERMINING AN OPTIMAL StatisticAL MODEL AUTOMATICALLY and Apparatus tHEREOF
US20200201342A1 (en) Obstacle avoidance model generation method, obstacle avoidance model generation device, and obstacle avoidance model generation program
KR102190105B1 (en) Method for determining parameter sets of an artificial neural network
JP6701799B2 (en) Iterative test generation based on data source analysis
CN114088085A (en) Robot position determining method and device, electronic equipment and storage medium
CN109343013B (en) Spatial registration method and system based on restarting mechanism
JP6954346B2 (en) Parameter estimator, parameter estimator, and program
CN110705159A (en) Heat source model parameter solving method, device, equipment and storage medium
EP3545441A1 (en) Systems and methods for satisfiability modulo theories processes using uninterpreted function symbols
JP7252862B2 (en) Control device, control system and control method
Paul Designing efficient software for solving delay differential equations
CN112990256A (en) Multi-sensor noise calculation method, information fusion method, storage medium and device
US8136069B2 (en) Accurate approximation of resistance in a wire with irregular biasing and determination of interconnect capacitances in VLSI layouts in the presence of Catastrophic Optical Proximity Correction
KR102315622B1 (en) Method and apparatus for determining training data for updating algorithm
Shen-tu et al. Feedback structure based entropy approach for multiple-model estimation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant