CN112171659A - Robot and method and device for identifying limited area of robot

Robot and method and device for identifying limited area of robot

Info

Publication number
CN112171659A
Authority
CN
China
Prior art keywords
robot
positioning base station
distance
positioning
Prior art date
Legal status
Pending
Application number
CN202010827744.6A
Other languages
Chinese (zh)
Inventor
聂鹏
熊友军
Current Assignee
Ubtech Robotics Corp
Original Assignee
Ubtech Robotics Corp
Priority date
Filing date
Publication date
Application filed by Ubtech Robotics Corp filed Critical Ubtech Robotics Corp
Priority to CN202010827744.6A priority Critical patent/CN112171659A/en
Publication of CN112171659A publication Critical patent/CN112171659A/en
Pending legal-status Critical Current

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B25J9/1666 Avoiding collision or forbidden zones

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application relates to the technical field of robots and provides a robot and a method and device for identifying a restricted area of the robot. A restricted area, a first positioning base station and a second positioning base station are arranged in the scene where the robot is located, and a positioning signal detection module is arranged on the robot. The method includes: the robot detects positioning signals through the positioning signal detection module; a first distance between the robot and the first positioning base station and a second distance between the robot and the second positioning base station are determined according to the detected positioning signals; and the relative positional relationship between the robot and the restricted area is determined according to the first distance and the second distance in combination with the positions of the first positioning base station and the second positioning base station. Because the positional relationship between the robot and the positioning base stations can be calculated in real time, the positional relationship between the robot and the restricted area can be determined more accurately, which reduces the probability of the robot entering the restricted area and improves the safety and reliability of the robot's movement.

Description

Robot and method and device for identifying limited area of robot
Technical Field
The application belongs to the technical field of robots, and particularly relates to a robot and a method and a device for identifying a limited area of the robot.
Background
During autonomous navigation of the robot, there may be unfriendly or dangerous restricted areas in the scene in which the robot is located. For example, there may be obstacles such as glass and steps in the scene where the robot is located, which affect the movement of the robot. The robot needs to actively detect or avoid these restricted areas during its movement.
In order to enable the robot to move safely and reliably in the scene and complete the set navigation task, the position of the restricted area is usually marked in the scene image constructed by the robot; for example, a virtual wall is marked in the scene image, and the marked virtual wall prevents the robot from entering the restricted area. However, if the robot's positioning information is inaccurate, the robot may not be able to accurately identify the position of the virtual wall; in that case the virtual wall cannot reliably limit the robot's movement, which reduces the safety and reliability of the robot's motion.
Disclosure of Invention
The embodiments of the application provide a robot and a method and device for identifying a restricted area of the robot, which can solve the problem that a restricted area marked only in a scene image cannot reliably guarantee the safety and reliability of the robot's movement.
In a first aspect, an embodiment of the present application provides a method for identifying a restricted area of a robot, where a scene where the robot is located is provided with the restricted area, a first positioning base station and a second positioning base station, the robot is provided with a positioning signal detection module, and the method for identifying the restricted area of the robot includes:
the robot detects positioning signals of the first positioning base station and the second positioning base station through a positioning signal detection module;
determining a first distance between the robot and the first positioning base station and a second distance between the robot and the second positioning base station according to the detected positioning signals;
and determining the relative position relationship between the robot and the restricted area according to the first distance and the second distance and by combining the positions of the first positioning base station and the second positioning base station.
With reference to the first aspect, in a first possible implementation manner of the first aspect, the step of determining the relative positional relationship between the robot and the restricted area includes:
determining a first circle where the robot is located according to the position of the first positioning base station and the first distance, and determining a second circle where the robot is located according to the position of the second positioning base station and the second distance;
acquiring a first intersection point and a second intersection point of the intersection of the first circle and the second circle;
acquiring a robot track of a preset time period before the current time;
and determining the position of the robot as a first intersection point or a second intersection point according to the track of the robot.
With reference to the first aspect, in a second possible implementation manner of the first aspect, when the boundary of the restricted area is a straight boundary, and the first positioning base station and the second positioning base station are disposed at two ends of the straight boundary, the step of determining a relative positional relationship between the robot and the restricted area includes:
summing the first distance and the second distance to obtain a first sum value;
and when the difference value between the first sum value and a third distance is smaller than a preset first distance threshold value, generating a restricted area prompt, wherein the third distance is the distance between the first positioning base station and the second positioning base station.
With reference to the first aspect, in a third possible implementation manner of the first aspect, when the boundary of the restricted area is an arc boundary, and the first positioning base station and the second positioning base station are located on the arc boundary and located at the same diameter, the step of determining the relative position relationship between the robot and the restricted area includes:
respectively obtaining the square values of the first distance and the second distance, and then summing to obtain a second sum value;
acquiring a diameter square value corresponding to the arc;
and generating a limited area prompt according to the comparison result of the second sum and the diameter square value.
With reference to the third possible implementation manner of the first aspect, in a fourth possible implementation manner of the first aspect, the boundary of the restricted area is a concave arc, and the step of generating a restricted area prompt according to a comparison result between the second sum and the diameter square value includes:
when the robot is positioned in the concave arc and moves towards the concave arc, acquiring a difference value obtained by subtracting the second sum value from the diameter square value;
and when the difference value obtained by subtracting the second sum value from the diameter square value is smaller than a preset second distance threshold value, generating a limited area prompt.
With reference to the third possible implementation manner of the first aspect, in a fifth possible implementation manner of the first aspect, the boundary of the restricted area is a convex arc, and the step of generating a restricted area prompt according to a comparison result between the second sum and the diameter square value includes:
obtaining a difference value of subtracting the diameter square value from the second sum value;
and when the difference value obtained by subtracting the diameter square value from the second sum value is smaller than a preset third distance threshold value, generating a limited area prompt.
With reference to the first aspect, the first possible implementation manner of the first aspect, the second possible implementation manner of the first aspect, the third possible implementation manner of the first aspect, the fourth possible implementation manner of the first aspect, or the fifth possible implementation manner of the first aspect, in a sixth possible implementation manner of the first aspect, the first positioning base station or the second positioning base station is an ultra wide band base station, and the positioning signal detection module is an ultra wide band tag.
In a second aspect, an embodiment of the present application provides a restricted area identification device for a robot, where a restricted area, a first positioning base station, and a second positioning base station are provided in a scene where the robot is located, a positioning signal detection module is provided on the robot, and the restricted area identification device for the robot includes:
the positioning signal detection unit is used for detecting positioning signals of the first positioning base station and the second positioning base station by the robot through the positioning signal detection module;
the distance determining unit is used for determining a first distance between the robot and the first positioning base station and a second distance between the robot and the second positioning base station according to the detected positioning signals;
and the position relation determining unit is used for determining the relative position relation between the robot and the limited area according to the first distance and the second distance and by combining the positions of the first positioning base station and the second positioning base station.
In a third aspect, an embodiment of the present application provides a robot, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the restricted area identification method of the robot according to any one of the first aspect when executing the computer program.
In a fourth aspect, the present application provides a computer-readable storage medium, where a computer program is stored, and the computer program, when executed by a processor, implements the restricted area identification method for a robot according to any one of the first aspect.
In a fifth aspect, the present application provides a computer program product, which, when run on a robot, causes the robot to execute the restricted area identification method of the robot according to any one of the above first aspects.
Compared with the prior art, the embodiments of the application have the following advantages: a first positioning base station and a second positioning base station are arranged in the scene where the robot is located; positioning signals of the first positioning base station and the second positioning base station are detected by the positioning signal detection module arranged on the robot; a first distance and a second distance are respectively determined according to the positioning signals; and the relative positional relationship between the robot and the restricted area is determined according to the first distance and the second distance in combination with the preset positions of the first positioning base station and the second positioning base station. Because the positional relationship between the robot and the positioning base stations can be calculated in real time, the positional relationship between the robot and the restricted area can be determined more accurately, which reduces the probability of the robot entering the restricted area and improves the safety and reliability of the robot's movement.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed for describing the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application, and other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic view of an implementation scenario of robot navigation positioning provided in an embodiment of the present application;
fig. 2 is a schematic flow chart of an implementation of a restricted area identification method for a robot according to an embodiment of the present application;
fig. 3 is a schematic diagram of determining a position of a robot relative to a positioning base station according to an embodiment of the present application;
FIG. 4 is a schematic diagram of determining the positional relationship between a robot and a restricted area according to an embodiment of the present application;
FIG. 5 is a schematic diagram of determining the positional relationship between a robot and a restricted area according to another embodiment of the present application;
FIG. 6 is a schematic diagram of determining the positional relationship between a robot and a restricted area according to yet another embodiment of the present application;
fig. 7 is a schematic structural diagram of a robot provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this application, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this application refers to any and all possible combinations of one or more of the associated listed items and includes such combinations.
As used in this application, the term "if" may be interpreted contextually as "when" or "upon" or "in response to a determination" or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [described condition or event] is detected" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
In addition, in the description of the present application, the terms "first," "second," "third," and the like are used solely to distinguish one from another and are not to be construed as indicating or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
Fig. 1 is a schematic view of an implementation scenario of robot navigation positioning according to an embodiment of the present application. As shown in fig. 1, the figure is a schematic of the scene detected by the robot's sensing unit, which includes detection devices such as a radar detector and a depth sensor. In the scene diagram, gray areas represent obstacles and white areas represent passable areas. Owing to limitations of the robot's sensing unit, there may be undetected obstacles such as glass, or undetected hazards such as puddles and steps. In order to prevent the robot from entering these restricted areas and to improve the safety and reliability of the movement of the robot R, the scene schematic may be annotated by a worker. For example, the two dotted lines marked in fig. 1 (virtual wall 1 and virtual wall 2, denoted L1 and L2) combine with the obstacle regions on the left and right to form a restricted area A.
A worker can accurately place virtual wall 1 and virtual wall 2 in the scene schematic by combining on-site information. However, when the robot moves in the scene, errors such as drift may occur in the position of the robot R in the scene image, so the position of the robot relative to the virtual walls cannot be obtained accurately; dangerous accidents may then occur during movement, affecting the reliability of the robot's motion. Therefore, positioning base stations are added to the scene where the robot is located, including a first positioning base station and a second positioning base station, such as U1 and U2 in fig. 1. Based on the first positioning base station and the second positioning base station, the robot is controlled accurately and prevented from entering the restricted area.
Fig. 2 is a schematic implementation flow diagram of a restricted area identification method of a robot provided based on the implementation scenario shown in fig. 1, which is detailed as follows:
in step S201, the robot detects the positioning signals of the first positioning base station and the second positioning base station through the positioning signal detection module;
in this embodiment, the positioning signal detecting module is configured to receive positioning signals transmitted by the first positioning base station and the second positioning base station. The positioning signal can be a Bluetooth positioning signal, a Radio Frequency Identification (RFID) signal, a wireless fidelity (WIFI) signal or an Ultra Wide Band (UWB) signal. For example, in one implementation manner, the first positioning base station and the second positioning base station may be a first UWB base station and a second UWB base station, respectively, and the positioning signal detection module may be a UWB tag. A first positioning signal transmitted by a first UWB base station may be detected by a UWB tag, and a second positioning signal transmitted by a second UWB base station may be detected.
In step S202, according to the detected positioning signal, a first distance between the robot and the first positioning base station and a second distance between the robot and the second positioning base station are determined;
according to the detected positioning signal, the base station identifier included in the positioning signal can be obtained through analysis. And according to the analyzed base station identifier, the positioning base station for sending the positioning signal can be obtained. According to the receiving time of the positioning signal or the receiving strength of the positioning signal, the distance between the positioning base station (the first positioning base station or the second positioning base station) and the robot can be obtained. In combination with the base station identification included in the positioning signal (first positioning signal or second positioning signal), the correspondence between the identified distance and the positioning base station (first positioning base station or second positioning base station) can be determined.
For example, the positioning signal is a bluetooth signal, the positioning base station includes a first bluetooth base station and a second bluetooth base station, and a bluetooth receiver disposed on the robot receives a first bluetooth signal sent by the first bluetooth base station and receives a second bluetooth signal sent by the second bluetooth base station. And analyzing the identification information included in the Bluetooth signal (the first Bluetooth signal or the second Bluetooth signal) to obtain a base station (the first Bluetooth base station or the second Bluetooth base station) corresponding to the Bluetooth signal, and calculating the distance between the robot and the Bluetooth base station according to the received signal strength of the received Bluetooth signal (the first Bluetooth signal or the second Bluetooth signal). By combining the identification information included in the bluetooth signal, the calculated distance can be obtained as a first distance between the first bluetooth base station and the robot or a second distance between the second bluetooth base station and the robot.
For example, the positioning signal is an ultra-wideband UWB signal, and the positioning base station includes a first UWB base station and a second UWB base station. A UWB tag provided on the robot detects a first positioning signal transmitted by a first UWB base station, and receives a second positioning signal transmitted by a second UWB base station. And the robot analyzes the identification in the positioning signal (comprising the first positioning signal and the second positioning signal) to obtain a base station (a first UWB base station or a second UWB base station) corresponding to the positioning signal. The method comprises the steps of obtaining the transmission time of a positioning signal according to the time of receiving the positioning signal and the time of sending the positioning signal by combining a positioning base station, calculating the distance between the positioning base station (a first UWB base station or a second UWB base station) and a robot according to the transmission time, and determining the positioning base station corresponding to the calculated distance to be the first UWB base station or the second UWB base station according to identification information included in the positioning signal.
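For illustration only, the following Python sketch shows how the two ranging approaches described above can turn a detected positioning signal into a distance: time-of-flight ranging for UWB signals and received-signal-strength ranging for Bluetooth signals. The function names, the 1 m reference power and the path-loss exponent are assumptions introduced for the example, not values given in this application.

```python
# Illustrative sketch only: estimating the robot-to-base-station distance from a
# positioning signal. The calibration constants below are assumed example values.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second


def distance_from_time_of_flight(t_sent: float, t_received: float) -> float:
    """UWB-style ranging: distance = propagation time * speed of light.

    Assumes the send and receive timestamps share a common time base (or come
    from a two-way ranging exchange that cancels the clock offset).
    """
    return (t_received - t_sent) * SPEED_OF_LIGHT


def distance_from_rssi(rssi_dbm: float, power_at_1m_dbm: float = -59.0,
                       path_loss_exponent: float = 2.0) -> float:
    """Bluetooth-style ranging using a log-distance path-loss model."""
    return 10.0 ** ((power_at_1m_dbm - rssi_dbm) / (10.0 * path_loss_exponent))


# Example: a UWB signal received 20 ns after it was sent corresponds to about 6 m.
d1 = distance_from_time_of_flight(0.0, 20e-9)
# Example: a Bluetooth signal received at -75 dBm corresponds to roughly 6.3 m.
d2 = distance_from_rssi(-75.0)
```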
In step S203, the relative position relationship between the robot and the restricted area is determined according to the first distance and the second distance and by combining the positions of the first positioning base station and the second positioning base station.
After the first distance between the robot and the first positioning base station and the second distance between the robot and the second positioning base station have been calculated, the relative position between each positioning base station and the restricted area is known accurately, because the positions of the first positioning base station and the second positioning base station are preset. The relative position of the robot and the restricted area can therefore be determined once the relative position of the robot and the positioning base stations is obtained.
In one implementation of determining the position of the robot relative to the positioning base station, as shown in fig. 3, the first positioning base station U1 and the second positioning base station U2, whose positions are determined in advance, transmit positioning signals, the robot receives the positioning signals, and calculates a first distance d1 between the robot and the first positioning base station and a second distance d2 between the robot and the second positioning base station according to the positioning signals. And obtaining a first circle by taking the first positioning base station as a circle center and the first distance d1 as a radius, obtaining a second circle by taking the second base station U2 as a circle center and the second distance d2 as a radius, and intersecting the first circle and the second circle to obtain two intersection positions of a first intersection point R1 and a second intersection point R2.
In order to obtain an accurate robot position, the robot trajectory over a predetermined period before the current positioning time may be combined to determine whether the current position of the robot is the first intersection or the second intersection. For example, if the current time is T1, the robot trajectory from T0 to T1 may be selected, and the position of the robot in the scene image at the current time is accurately obtained according to the change of position along that trajectory.
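As a concrete illustration of this step, the Python sketch below computes the two intersection points of the first circle and the second circle and then selects the candidate closest to the most recent point of the robot trajectory. The helper names and the nearest-point selection rule are assumptions for the example; the application only requires that the trajectory be used to decide between the two intersections.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]


def circle_intersections(c1: Point, r1: float, c2: Point, r2: float) -> List[Point]:
    """Return the intersection points of two circles (0, 1 or 2 points)."""
    (x1, y1), (x2, y2) = c1, c2
    d = math.hypot(x2 - x1, y2 - y1)
    if d == 0.0 or d > r1 + r2 or d < abs(r1 - r2):
        return []  # concentric, too far apart, or one circle inside the other
    a = (r1 ** 2 - r2 ** 2 + d ** 2) / (2.0 * d)  # distance from c1 to the chord midpoint
    h = math.sqrt(max(r1 ** 2 - a ** 2, 0.0))     # half-length of the chord
    xm, ym = x1 + a * (x2 - x1) / d, y1 + a * (y2 - y1) / d
    ox, oy = h * (y2 - y1) / d, h * (x2 - x1) / d  # offset perpendicular to the centre line
    return [(xm + ox, ym - oy), (xm - ox, ym + oy)]


def pick_position(candidates: List[Point], recent_trajectory: List[Point]) -> Point:
    """Choose the intersection closest to the last known trajectory point."""
    last = recent_trajectory[-1]
    return min(candidates, key=lambda p: math.hypot(p[0] - last[0], p[1] - last[1]))


# Example with assumed values: U1 at (0, 0), U2 at (10, 0), measured d1 = 6 m, d2 = 7 m.
candidates = circle_intersections((0.0, 0.0), 6.0, (10.0, 0.0), 7.0)
robot_position = pick_position(candidates, [(3.0, 4.5), (3.2, 4.8)])  # -> roughly (4.35, 4.13)
```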
After the position of the robot in the scene image is obtained, the distance between the robot and the restricted area can be updated in real time, and when the distance between the robot and the restricted area is smaller than a preset distance, the moving direction of the robot is adjusted to prevent the robot from moving into the restricted area.
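The sketch below outlines how such real-time monitoring might be wired together, assuming ranging, positioning and steering routines like the ones sketched above are available; every callable and parameter name here is hypothetical.

```python
import time
from typing import Callable, Tuple

Point = Tuple[float, float]


def monitor_restricted_area(measure_distances: Callable[[], Tuple[float, float]],
                            estimate_position: Callable[[float, float], Point],
                            distance_to_restricted_area: Callable[[Point], float],
                            adjust_moving_direction: Callable[[], None],
                            safety_distance: float = 0.5,
                            period_s: float = 0.1) -> None:
    """Periodically re-localize the robot and steer away when it gets too close."""
    while True:
        d1, d2 = measure_distances()              # from the detected positioning signals
        position = estimate_position(d1, d2)      # e.g. circle intersection + trajectory check
        if distance_to_restricted_area(position) < safety_distance:
            adjust_moving_direction()             # turn away from the restricted area
        time.sleep(period_s)
```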
In one implementation manner, in order to quickly and accurately determine whether the current position of the robot meets the safety requirement, as shown in fig. 4, the boundary of the restricted area is a straight boundary, and the first positioning base station and the second positioning base station are respectively disposed at the two ends of the straight boundary. A first distance d1 is determined according to the positioning signal sent by the first positioning base station, a second distance d2 is determined according to the positioning signal sent by the second positioning base station, and the sum of d1 and d2 is calculated to obtain a first sum. The distance between the first positioning base station U1 and the second positioning base station U2, namely the third distance L3, is obtained from their preset positions. If the difference of the first sum minus the third distance L3 is smaller than the preset first distance threshold, a restricted-area proximity prompt is generated so that the robot can adjust its moving direction in time. In this case, only a simple numerical comparison is needed to determine the relationship between the robot's current position and the restricted area, so the judgment result can be obtained quickly and accurately while ensuring the safety of the robot's movement.
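A minimal sketch of this numerical comparison is given below, with hypothetical distances and threshold; it simply checks whether the sum of the two measured distances comes close to the spacing between the two base stations.

```python
def near_straight_boundary(d1: float, d2: float, base_spacing: float,
                           first_threshold: float) -> bool:
    """Straight-boundary test: d1 + d2 approaches the U1-U2 spacing as the robot
    approaches the segment between the two base stations (the virtual wall)."""
    return (d1 + d2) - base_spacing < first_threshold


# Example with assumed values: base stations 8 m apart, threshold 0.5 m.
if near_straight_boundary(d1=4.1, d2=4.2, base_spacing=8.0, first_threshold=0.5):
    print("restricted area prompt: adjust moving direction")
```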
In one implementation manner, the boundary of the restricted area is an arc, and the first positioning base station and the second positioning base station are located on the arc boundary and on the same diameter. As shown in fig. 5, the boundary L of the restricted area is a concave arc. The square values of the first distance D1 and the second distance D2 may be obtained and summed to obtain a second sum, and the diameter square value corresponding to the arc diameter D is calculated. When the robot is inside the circle corresponding to the concave arc, the difference obtained by subtracting the second sum from the diameter square value is compared with a preset second distance threshold; if this difference is smaller than the preset second distance threshold, a restricted area prompt is generated so that the robot can adjust its moving direction.
As shown in fig. 6, the boundary L of the restricted area is a convex arc, and the first positioning base station U1 and the second positioning base station U2 are located on the arc boundary and on the same diameter. The square values of the first distance D1 and the second distance D2 may be obtained and summed to obtain a second sum, and the diameter square value corresponding to the arc diameter D is calculated. During the robot's movement, the difference obtained by subtracting the diameter square value from the second sum is compared with a preset third distance threshold; if this difference is smaller than the preset third distance threshold, a restricted area prompt is generated so that the robot can adjust its moving direction.
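Both arc cases reduce to comparable one-line checks. The sketch below relies on the fact that, with U1 and U2 at the ends of a diameter, d1^2 + d2^2 equals the squared diameter exactly when the robot sits on the circle (Thales' theorem); it is smaller inside the circle and larger outside. The threshold values are assumed for the example.

```python
def near_concave_arc(d1: float, d2: float, diameter: float, second_threshold: float) -> bool:
    """Concave arc (robot inside the circle): D^2 - (d1^2 + d2^2) shrinks towards zero
    as the robot approaches the arc, so prompt when it drops below the threshold."""
    return diameter ** 2 - (d1 ** 2 + d2 ** 2) < second_threshold


def near_convex_arc(d1: float, d2: float, diameter: float, third_threshold: float) -> bool:
    """Convex arc (robot outside the circle): (d1^2 + d2^2) - D^2 shrinks towards zero
    as the robot approaches the arc, so prompt when it drops below the threshold."""
    return (d1 ** 2 + d2 ** 2) - diameter ** 2 < third_threshold


# Examples with assumed values: arc of diameter 10 m, thresholds in square metres.
if near_concave_arc(d1=6.0, d2=7.9, diameter=10.0, second_threshold=5.0):
    print("restricted area prompt (concave arc)")
if near_convex_arc(d1=7.2, d2=7.5, diameter=10.0, third_threshold=10.0):
    print("restricted area prompt (convex arc)")
```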
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 7 shows a block diagram of a restricted area recognition apparatus for a robot according to an embodiment of the present application, which corresponds to the restricted area recognition method for a robot according to the above-described embodiment.
Referring to fig. 7, a restricted area, a first positioning base station and a second positioning base station are provided in a scene where a robot is located, a positioning signal detection module is provided on the robot, and the restricted area recognition apparatus of the robot includes:
a positioning signal detection unit 701, configured to detect, by the robot, positioning signals of the first positioning base station and the second positioning base station through a positioning signal detection module;
a distance determining unit 702, configured to determine, according to the detected positioning signal, a first distance between the robot and the first positioning base station and a second distance between the robot and the second positioning base station;
a position relation determining unit 703, configured to determine, according to the first distance and the second distance, a relative position relation between the robot and the restricted area by combining the positions of the first positioning base station and the second positioning base station.
The restricted area recognition device of the robot shown in fig. 7 corresponds to the restricted area recognition method shown in fig. 2.
It should be noted that, for the information interaction, execution process, and other contents between the above-mentioned devices/units, the specific functions and technical effects thereof are based on the same concept as those of the embodiment of the method of the present application, and specific reference may be made to the part of the embodiment of the method, which is not described herein again.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
An embodiment of the present application further provides a robot, including: at least one processor, a memory, and a computer program stored in the memory and executable on the at least one processor, the processor implementing the steps of any of the various method embodiments described above when executing the computer program.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps in the above-mentioned method embodiments.
The embodiments of the present application further provide a computer program product, which, when run on a mobile terminal, enables the mobile terminal to implement the steps in the above method embodiments.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form. The computer readable medium may include at least: any entity or device capable of carrying the computer program code to a photographing device/robot, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, such as a USB flash disk, a removable hard disk, a magnetic disk, or an optical disk. In certain jurisdictions, computer-readable media may not include electrical carrier signals or telecommunications signals in accordance with legislation and patent practice.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/robot and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/robot are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A robot restricted area identification method is characterized in that a restricted area, a first positioning base station and a second positioning base station are arranged in a scene where a robot is located, a positioning signal detection module is arranged on the robot, and the robot restricted area identification method comprises the following steps:
the robot detects positioning signals of the first positioning base station and the second positioning base station through a positioning signal detection module;
determining a first distance between the robot and the first positioning base station and a second distance between the robot and the second positioning base station according to the detected positioning signals;
and determining the relative position relationship between the robot and the restricted area according to the first distance and the second distance and by combining the positions of the first positioning base station and the second positioning base station.
2. The restricted area recognition method of a robot according to claim 1, wherein the step of determining the relative positional relationship of the robot to the restricted area comprises:
determining a first circle where the robot is located according to the position of the first positioning base station and the first distance, and determining a second circle where the robot is located according to the position of the second positioning base station and the second distance;
acquiring a first intersection point and a second intersection point of the intersection of the first circle and the second circle;
acquiring a robot track of a preset time period before the current time;
and determining the position of the robot as a first intersection point or a second intersection point according to the track of the robot.
3. The method of claim 1, wherein when the boundary of the restricted area is a straight boundary and the first positioning base station and the second positioning base station are disposed at two ends of the straight boundary, the step of determining the relative positional relationship between the robot and the restricted area comprises:
summing the first distance and the second distance to obtain a first sum value;
and when the difference value between the first sum value and a third distance is smaller than a preset first distance threshold value, generating a restricted area prompt, wherein the third distance is the distance between the first positioning base station and the second positioning base station.
4. The method according to claim 1, wherein when the boundary of the restricted area is an arc boundary and the first positioning base station and the second positioning base station are located on the arc boundary and on the same diameter, the step of determining the relative positional relationship between the robot and the restricted area comprises:
respectively obtaining the square values of the first distance and the second distance, and then summing to obtain a second sum value;
acquiring a diameter square value corresponding to the arc;
and generating a limited area prompt according to the comparison result of the second sum and the diameter square value.
5. The method of identifying a restricted area of a robot according to claim 4, wherein the boundary of the restricted area is a concave arc, and the step of generating a restricted area prompt based on the comparison result between the second sum and the diameter square value includes:
when the robot is positioned in the concave arc and moves towards the concave arc, acquiring a difference value obtained by subtracting the second sum value from the diameter square value;
and when the difference value obtained by subtracting the second sum value from the diameter square value is smaller than a preset second distance threshold value, generating a limited area prompt.
6. The robot restricted area recognition method according to claim 4, wherein the boundary of the restricted area is a convex arc, and the step of generating a restricted area prompt based on the comparison result between the second sum and the diameter square value includes:
obtaining a difference value of subtracting the diameter square value from the second sum value;
and when the difference value obtained by subtracting the diameter square value from the second sum value is smaller than a preset third distance threshold value, generating a limited area prompt.
7. The robot restricted area identification method according to any one of claims 1 to 6, wherein the first positioning base station or the second positioning base station is an ultra-wideband base station, and the positioning signal detection module is an ultra-wideband tag.
8. A restricted area identification device for a robot, characterized in that a restricted area, a first positioning base station and a second positioning base station are provided in a scene where the robot is located, a positioning signal detection module is provided on the robot, and the restricted area identification device of the robot comprises:
the positioning signal detection unit is used for detecting positioning signals of the first positioning base station and the second positioning base station by the robot through the positioning signal detection module;
the distance determining unit is used for determining a first distance between the robot and the first positioning base station and a second distance between the robot and the second positioning base station according to the detected positioning signals;
and the position relation determining unit is used for determining the relative position relation between the robot and the limited area according to the first distance and the second distance and by combining the positions of the first positioning base station and the second positioning base station.
9. A robot comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the restricted area identification method of the robot according to any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the restricted area identification method of a robot according to any one of claims 1 to 7.
CN202010827744.6A 2020-08-17 2020-08-17 Robot and method and device for identifying limited area of robot Pending CN112171659A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010827744.6A CN112171659A (en) 2020-08-17 2020-08-17 Robot and method and device for identifying limited area of robot

Publications (1)

Publication Number Publication Date
CN112171659A (en) 2021-01-05

Family

ID=73919332

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010827744.6A Pending CN112171659A (en) 2020-08-17 2020-08-17 Robot and method and device for identifying limited area of robot

Country Status (1)

Country Link
CN (1) CN112171659A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150323934A1 (en) * 2014-05-09 2015-11-12 Kinpo Electronics, Inc. Indoor robot and method for indoor robot positioning
CN105891774A (en) * 2016-03-29 2016-08-24 北京九星智元科技有限公司 Dynamic tracking positioning system and method for robot dolly
CN109959894A (en) * 2017-12-26 2019-07-02 深圳市优必选科技有限公司 Localization method and terminal device
CN110045329A (en) * 2018-01-17 2019-07-23 赵瑞祥 A kind of two base station two-dimensional location methods
CN110450152A (en) * 2019-06-24 2019-11-15 广东宝乐机器人股份有限公司 Region identification method, robot, and storage medium
CN111263299A (en) * 2020-03-03 2020-06-09 中国科学技术大学 Positioning method, positioning device, electronic equipment and storage medium
US20200209345A1 (en) * 2018-12-26 2020-07-02 Industry-Academic Cooperation Foundation, Chosun University Method and apparatus for location estimation of terminal in wireless communication system
CN111381586A (en) * 2018-12-11 2020-07-07 深圳市优必选科技有限公司 Robot and movement control method and device thereof

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113433266A (en) * 2021-06-10 2021-09-24 山东欧齐珞信息科技有限公司 Method and system for monitoring gas components in whole tunnel
CN114211487A (en) * 2021-12-14 2022-03-22 珠海一微半导体股份有限公司 Robot forbidden zone setting method, robot control method, chip and robot
CN114310876A (en) * 2021-12-20 2022-04-12 达闼机器人有限公司 Robot positioning method, system, processing device and medium
CN115047874A (en) * 2022-06-02 2022-09-13 北京三快在线科技有限公司 Robot connection method, storage cabinet, robot, system and electronic equipment
CN115047874B (en) * 2022-06-02 2023-09-15 北京三快在线科技有限公司 Robot connection method, locker, robot, system and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210105