WO2019109635A1 - Method and chip for a robot to monitor a pet based on a grid map - Google Patents

Method and chip for a robot to monitor a pet based on a grid map

Info

Publication number
WO2019109635A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
unit
uwb positioning
determining
base station
Prior art date
Application number
PCT/CN2018/094744
Other languages
English (en)
French (fr)
Inventor
肖刚军
黄泰明
Original Assignee
珠海市一微半导体有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 珠海市一微半导体有限公司
Priority to US16/768,697 (US11470821B2)
Priority to KR1020207019551A (KR102320370B1)
Priority to JP2020531027A (JP7136898B2)
Priority to EP18886384.9A (EP3723423B1)
Publication of WO2019109635A1


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 Control of position or course in two dimensions
    • G05D 1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0276 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G05D 1/028 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using a RF signal
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 Services making use of location information
    • H04W 4/021 Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K 29/00 Other apparatus for animal husbandry
    • A01K 29/005 Monitoring or measuring activity, e.g. detecting heat or mating
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/0094 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots, involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 Control of position or course in two dimensions
    • G05D 1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D 1/0217 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with energy consumption, time reduction or distance reduction criteria
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 Control of position or course in two dimensions
    • G05D 1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D 1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 Control of position or course in two dimensions
    • G05D 1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0268 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D 1/0274 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/10 Terrestrial scenes
    • G PHYSICS
    • G08 SIGNALLING
    • G08C TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C 17/00 Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • G08C 17/02 Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 Services making use of location information
    • H04W 4/023 Services making use of location information using mutual or relative location information between multiple location based services [LBS] targets or of distance thresholds
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 Services making use of location information
    • H04W 4/029 Location-based management or tracking services
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 64/00 Locating users or terminals or network equipment for network management purposes, e.g. mobility management

Definitions

  • the invention relates to the field of robots, and in particular to a method and a chip for monitoring a pet by a robot based on a grid map.
  • the present invention provides a method and a chip for monitoring a pet based on a grid map, which can better determine the position of the robot to monitor the pet, thereby achieving a better monitoring effect.
  • the specific technical solutions of the present invention are as follows:
  • a method for monitoring pets by a robot based on a grid map comprising the following steps:
  • Step 1: based on the grid map constructed by the robot, determining the robot's current position point in the grid map and the corresponding grid unit;
  • Step 2: determining the mutual positional relationship between the pet and the robot based on wireless communication between the robot and the wireless signal device on the pet, and determining the pet's current position point and the corresponding grid unit according to that relationship;
  • Step 3: determining whether an obstacle unit exists among the grid units, between the grid unit where the robot is located and the grid unit where the pet is located, within a preset range covered by the shooting angle of the camera with which the robot monitors the pet; if not, keeping the camera oriented toward the pet and returning to Step 2; if so, proceeding to Step 4;
  • Step 4: determining a preset area centered on the grid unit where the pet is located; taking the passed units in the preset area one by one, from near to far in distance from the robot, as the pending monitoring unit, and determining whether there is an obstacle unit in the straight-line grid path between the pending monitoring unit and the grid unit where the pet is located;
  • if not, determining the pending monitoring unit to be the monitoring unit, and proceeding to Step 5;
  • if so, determining whether the next passed unit is the one farthest from the robot; if not, returning to Step 4; if so, directly determining that next passed unit to be the monitoring unit, and proceeding to Step 5;
  • Step 5: controlling the robot to walk from its current position point to the monitoring unit and monitor the pet;
  • wherein an obstacle unit is the grid unit corresponding to a detected obstacle, and a passed unit is a grid unit the robot has already traveled.
  • determining, based on the grid map constructed by the robot as described in Step 1, the robot's current position point in the grid map and the corresponding grid unit includes the following steps:
  • determining, as described in Step 2, the mutual positional relationship between the pet and the robot through wireless communication between the wireless signal devices on the robot and the pet, and determining the pet's current position point and the corresponding grid unit according to that relationship, includes the following steps:
  • the first distance from the UWB positioning tag to the first UWB positioning base station is R1
  • the second distance from the UWB positioning tag to the second UWB positioning base station is R2;
  • determining the coordinates of the first UWB positioning base station on the robot body as (X11, Y11) and the coordinates of the second UWB positioning base station as (X12, Y12) includes the following steps:
  • the distance between the first UWB positioning base station and the second UWB positioning base station is W
  • the distance from the center point of the robot body to the first UWB positioning base station is W/2
  • the distance from the center point of the robot body to the second UWB positioning base station is W/2;
  • determining the time T11 from when the first UWB positioning base station sends ranging data to the UWB positioning tag until it receives the acknowledgment signal from the UWB positioning tag;
  • determining the time T12 from when the UWB positioning tag receives the ranging data sent by the first UWB positioning base station until it sends an acknowledgment signal;
  • determining the time T13 from when the UWB positioning tag sends ranging data to the first UWB positioning base station until it receives the acknowledgment signal from the first UWB positioning base station;
  • determining the time T21 from when the second UWB positioning base station sends ranging data to the UWB positioning tag until it receives the acknowledgment signal from the UWB positioning tag;
  • determining the time T22 from when the UWB positioning tag receives the ranging data sent by the second UWB positioning base station until it sends an acknowledgment signal;
  • determining, as described in Step 3, whether an obstacle unit exists among the grid units within the preset range covered by the shooting angle of the camera with which the robot monitors the pet includes the following steps:
  • determining, as described in Step 4, a preset area centered on the grid unit where the pet is located, and taking the passed units in the preset area one by one, from near to far in distance from the robot, as the pending monitoring unit includes the following steps:
  • the preset length is any value within a range of 1 meter to 2 meters.
  • controlling the robot, as described in Step 5, to walk from its current position point to the monitoring unit and monitor the pet includes the following steps:
  • among the grid paths directly connected through passed units, the grid path with the shortest length is used as the navigation grid path;
  • the beneficial effects of the present invention are: the mutual positional relationship between the pet and the robot is determined through wireless communication between the wireless signal device on the pet and the robot, and it is then judged whether an obstacle unit exists between the grid units in the grid map where the robot and the pet are located, i.e. whether an obstacle blocks the view between the robot and the pet.
  • within the preset area around the pet, passed units are taken one by one as the pending monitoring unit, from near to far in distance from the robot, and it is then judged whether an obstacle unit lies in the straight-line grid path between the pending monitoring unit and the grid unit where the pet is located, i.e. whether the pet can be effectively monitored from the pending monitoring unit; if no obstacle blocks the view, the pending monitoring unit is determined to be the monitoring unit, and if one does, the next passed unit is analyzed. By analyzing the passed units in the preset area around the pet one by one, from near to far in distance from the robot, the robot can quickly find the position it can reach fastest from which the pet can be effectively monitored, improving the efficiency with which the robot monitors the pet.
  • in addition, if every passed unit in the preset area other than the one farthest from the robot is blocked from the pet by obstacles, the passed unit farthest from the robot is used as the monitoring unit whether or not an obstacle lies between it and the pet.
  • this is because the distribution of obstacles is characteristic: obstacles generally cluster in one or several areas, so if one obstacle unit is detected in an area, other obstacle units are likely to exist there as well.
  • if the robot detects an obstacle at its current position then, within a certain range, the farther an area is from the current position, the smaller the probability that obstacle units appear there; taking the passed unit in the preset area farthest from the robot as the monitoring unit therefore places the robot in a relatively open area.
  • when the pet's position changes, the monitoring position or monitoring angle can then be adjusted more conveniently without interference from adjacent obstacles, which improves monitoring efficiency.
  • in summary, by monitoring the pet in combination with the grid map, the present invention can control the robot to find a better monitoring position, avoiding the problem of the view being easily blocked by obstacles and thereby improving the effect of monitoring the pet.
  • FIG. 1 is a schematic flowchart of the method for monitoring a pet by a grid-map-based robot according to the present invention.
  • FIG. 2 is a schematic diagram showing how the coordinates of a position point are converted into the coordinates of a grid unit according to the present invention.
  • FIG. 3 is a schematic diagram showing the mutual position analysis of the two UWB positioning base stations and the UWB positioning tag according to the present invention.
  • FIG. 4 is a schematic diagram showing how the coordinates of the two UWB positioning base stations are calculated from the coordinates of the robot's center point according to the present invention.
  • FIG. 5 is a schematic diagram showing how the distance from the UWB positioning tag to the first UWB positioning base station is measured.
  • FIG. 6 is a schematic diagram showing the determination of the grid area photographed by the robot.
  • FIG. 7 is a schematic diagram showing the determination of the monitoring unit.
  • FIG. 8 is a schematic diagram showing the determination of the robot's navigation path from the current position point to the monitoring position point.
  • the robot of the invention is a kind of intelligent household appliance that, with a certain degree of artificial intelligence, can walk automatically in certain settings.
  • the mobile robot of the present invention comprises the following structure: an autonomously walking robot body with driving wheels, a human-computer interaction interface arranged on the body, and an obstacle detection unit arranged on the body; a camera is arranged on the upper face of the middle of the body.
  • the camera can also be placed on the upper face of the front of the body or in another position; in that case, the relevant values simply need to be adjusted, relative to a camera mounted at the middle, when the related parameters are calculated.
  • an inertial sensor, comprising an accelerometer and a gyroscope, is disposed inside the body.
  • the driving wheel is provided with an odometer (generally a code disc) for detecting its travel distance, and the robot is also provided with a control module capable of processing the parameters of the relevant sensors and outputting control signals to the actuating components.
  • as shown in FIG. 1, the method for monitoring a pet by the grid-map-based robot includes the following steps. Step 1: based on the grid map constructed by the robot, determine the robot's current position point in the grid map and the corresponding grid unit. Step 2: based on wireless communication between the robot and the wireless signal device on the pet, determine the mutual positional relationship between the pet and the robot, and from it determine the pet's current position point and the corresponding grid unit. Step 3: determine whether an obstacle unit exists among the grid units, between the grid unit where the robot is located and the grid unit where the pet is located, within the preset range covered by the shooting angle of the camera with which the robot monitors the pet; if not, keep the camera oriented toward the pet and return to Step 2; if so, proceed to Step 4. Step 4: determine a preset area centered on the grid unit where the pet is located; take the passed units in the preset area one by one, from near to far in distance from the robot, as the pending monitoring unit, and judge whether there is an obstacle unit in the straight-line grid path between the pending monitoring unit and the grid unit where the pet is located; if not, determine the pending monitoring unit to be the monitoring unit and proceed to Step 5; if so, judge whether the next passed unit is the one farthest from the robot; if not, return to Step 4; if so, directly determine that unit to be the monitoring unit and proceed to Step 5. Step 5: control the robot to walk from its current position point to the monitoring unit and monitor the pet.
  • the grid map is a map, built from grid units as its basic element, that the robot constructs from the data detected by its various sensors while walking.
  • a grid unit is a virtual cell with a set length and width, and may be square or rectangular.
  • preferably, the grid unit of the present invention is a square cell with a side length of 0.2 meters.
  • the wireless signal device may be a zigbee communication module, an ultrasonic module, a radio-frequency communication module, a UWB (Ultra-Wideband) module, a wifi module or the like, selected according to the product's requirements.
  • the preset range may likewise be set according to the requirements of the product design; preferably, it is set to one third of the full range covered by the shooting angle of the camera.
  • the preset area may also be set according to the requirements of the product design.
  • preferably, the preset area may be a circular, square or regular polygonal area, its size generally set within a range of 2 to 6 square meters.
  • while walking, the robot marks the grid units it has traveled as passed units, marks the grid unit corresponding to a detected obstacle as an obstacle unit, marks the grid unit corresponding to a detected cliff as a cliff unit, and so on, and updates the grid map with the marked information.
  • the method of the present invention determines the mutual positional relationship between the pet and the robot through wireless communication between the wireless signal device on the pet and the robot, and then judges whether an obstacle unit exists between the grid units in the grid map where the robot and the pet are located, so as to judge whether an obstacle blocks the view between the robot and the pet. If not, the robot's current position and shooting direction can effectively capture the pet, and neither needs to change; if the pet runs, the robot rotates its body to keep the camera facing the pet, and during this process the robot need not walk to any other position unless an obstacle comes to block the view.
  • if an obstacle unit does exist, the robot may capture an obstacle rather than the pet when shooting from its current position, so it must reselect the monitoring position by judging the state of the grid units around the pet. Within the preset area around the pet, passed units are taken one by one as the pending monitoring unit, from near to far in distance from the robot, and it is then judged whether an obstacle unit lies in the straight-line grid path between the pending monitoring unit and the grid unit where the pet is located, i.e. whether the pet can be effectively monitored from that position.
  • if no obstacle blocks the view, the pending monitoring unit is determined to be the monitoring unit; if one does, the next passed unit is analyzed.
  • by analyzing the passed units in the preset area around the pet one by one, from near to far in distance from the robot, the robot can quickly find the position it can reach fastest from which the pet can be effectively monitored, improving the efficiency with which the robot monitors the pet.
  • in addition, if every passed unit in the preset area other than the one farthest from the robot is blocked from the pet by obstacles, the passed unit farthest from the robot is used as the monitoring unit regardless of whether an obstacle lies between it and the pet.
  • if the robot detects an obstacle at its current position then, within a certain range, the farther an area is from the current position, the smaller the probability that obstacle units appear there; taking the farthest passed unit in the preset area as the monitoring unit therefore places the robot in a relatively open area.
  • when the pet's position changes, the monitoring position or angle can then be adjusted more conveniently without interference from adjacent obstacles, which improves monitoring efficiency.
  • in summary, by monitoring the pet in combination with the grid map, the method of the present invention can control the robot to find a better monitoring position, avoiding the problem of the view being easily blocked by obstacles and thereby improving the effect of monitoring the pet.
  • preferably, determining, based on the grid map constructed by the robot as described in Step 1, the robot's current position point in the grid map and the corresponding grid unit includes the following steps: constructing, from the data the robot detects while walking, a grid map in an XY coordinate system with (X0, Y0) as its origin; determining the side length of a grid unit in the grid map to be L; and determining, from the robot's own positioning data, the coordinates of the robot's current position point to be (X1, Y1), so that the grid coordinates of the corresponding grid unit are (S11, S12), where S11 = (X1 - X0)/L and S12 = (Y1 - Y0)/L, both truncated to integers.
  • while walking, the robot records the path it has traveled from the data detected by its own odometer and gyroscope, and determines its position and heading (i.e. its positioning data) in real time.
  • the grid map is composed of grid units as its basic element, and each grid unit contains many position points; the robot walks in terms of position points, moving from the current position point to an adjacent next position point. Therefore, to determine the coordinates of the grid unit the robot currently occupies, the coordinates of the current position point must be converted into grid-unit coordinates, as shown in FIG. 2, where each small square represents a grid unit of side length L.
  • within the same coordinate system, this method accurately computes the grid coordinates of the grid unit corresponding to the current position point from the positional relationship between the current position point and the coordinate origin together with the grid unit's side length, providing reliable data for subsequent processing and improving the accuracy of the data analysis.
  • preferably, determining, as described in Step 2, the mutual positional relationship between the pet and the robot through wireless communication between the wireless signal devices on the robot and the pet, and from it the pet's current position point and corresponding grid unit, includes the following steps: determining the distance between the first UWB positioning base station and the second UWB positioning base station on the robot body to be W; determining the coordinates of the first UWB positioning base station to be (X11, Y11) and those of the second to be (X12, Y12); determining, from the wireless communication between the two base stations and the UWB positioning tag on the pet, the first distance from the UWB positioning tag to the first UWB positioning base station to be R1 and the second distance from the tag to the second base station to be R2; determining the first angle α1, formed at the vertex of the second UWB positioning base station by the lines pointing to the first UWB positioning base station and to the UWB positioning tag, as α1 = arccos((W² + R2² - R1²)/(2·W·R2)); determining the second angle α2, formed at the vertex of the first UWB positioning base station by the lines pointing to the second UWB positioning base station and to the UWB positioning tag, as α2 = arccos((W² + R1² - R2²)/(2·W·R1)); determining the coordinates of the tag's current position point to be (Xc, Yc), where Xc = X12 + R2·cos(180° - α1 - arccos((X12 - X11)/W)) and Yc = Y11 + R1·cos(180° - α2 - arcsin((X12 - X11)/W)); and determining the grid coordinates of the grid unit corresponding to the tag's current position point to be (S21, S22), where S21 = (Xc - X0)/L and S22 = (Yc - Y0)/L, both truncated to integers.
  • UWB (Ultra-Wideband) is a carrierless ultra-wideband communication technology; the UWB positioning tag and the UWB positioning base stations are communication devices using UWB technology.
  • as shown in FIG. 3, A is the first UWB positioning base station, B is the second UWB positioning base station, and C is the UWB positioning tag.
  • the value of W should be smaller than the diameter of the robot body.
  • the first angle (∠ABC) can be obtained from the three side lengths of the triangle
  • since the robot can determine its own coordinate position (i.e. the coordinates of its center point) from the detection data of sensors such as the odometer and gyroscope, and the positions of the two UWB positioning base stations on the robot body are fixed relative to the center point, their coordinate values can also be determined, i.e. the first UWB positioning base station is at (X11, Y11) and the second at (X12, Y12); the specific calculation is described in a later embodiment.
  • after the coordinates (Xc, Yc) of the UWB positioning tag's current position point are determined, the grid coordinates (S21, S22) of the corresponding grid unit are calculated in the manner described in the foregoing embodiment: S21 = (Xc - X0)/L and S22 = (Yc - Y0)/L, with S21 and S22 both truncated to integers.
  • the method described in this embodiment applies when the height of the UWB positioning tag worn by the pet is consistent with that of the robot's UWB positioning base stations (i.e. the three communication devices lie in the same horizontal plane) or differs only slightly.
  • when the positions of the robot and the pet change, substituting the newly detected parameters quickly yields the pet's position point and corresponding grid coordinates; data processing is fast and the output is accurate.
  • if the height difference between the UWB positioning tag and the UWB positioning base stations is large, a third UWB positioning base station must be provided on the robot body, and the three-dimensional coordinates of the tag are determined by introducing a height parameter, from which the corresponding grid coordinates follow; the specific implementation has the same principle as this embodiment and is not repeated here.
  • the method of pet positioning by UWB communication technology has a larger positioning range, higher precision and better stability than other existing positioning methods.
  • determining the coordinates (X11, Y11) of the first UWB positioning base station and (X12, Y12) of the second UWB positioning base station on the robot body includes the following steps: determining the coordinates of the center point of the robot body to be the coordinates (X1, Y1) of the robot's current position point; determining that the center point lies at the midpoint of the line connecting the two base stations; determining that, with the inter-base-station distance W, the distance from the center point to each base station is W/2; determining the robot's current heading, detected by its gyroscope, to be α; and computing X11 = X1 - (W·cosα)/2, Y11 = Y1 + (W·sinα)/2, X12 = X1 + (W·cosα)/2 and Y12 = Y1 - (W·sinα)/2.
  • by constraining the two base stations to positions on the robot body symmetric about the center point, the method of this embodiment for determining their coordinates simplifies the coordinate algorithm, improving the system's data processing speed and yielding the two base stations' coordinate values quickly and accurately.
  • likewise, if three base stations are provided, placing the third on the perpendicular bisector of AB simplifies the algorithm and improves the system's data processing speed; the specific implementation follows the same principle as this embodiment and is not repeated here.
  • determining the first distance R1 from the UWB positioning tag to the first UWB positioning base station and the second distance R2 from the tag to the second base station includes the following steps: determining the propagation speed of radio waves to be c; determining the time T11 from when the first base station sends ranging data to the tag until it receives the tag's acknowledgment; the time T12 from when the tag receives that ranging data until it sends the acknowledgment; the time T13 from when the tag sends ranging data to the first base station until it receives the first base station's acknowledgment; and the time T14 from when the first base station receives that ranging data until it sends the acknowledgment; then R1 = c·(T11 - T12 + T13 - T14)/4.
  • similarly, determining the time T21 from when the second base station sends ranging data to the tag until it receives the tag's acknowledgment; the time T22 from when the tag receives that ranging data until it sends the acknowledgment; the time T23 from when the tag sends ranging data to the second base station until it receives the second base station's acknowledgment; and the time T24 from when the second base station receives that ranging data until it sends the acknowledgment; then R2 = c·(T21 - T22 + T23 - T24)/4.
  • as shown in FIG. 5, the first UWB positioning base station A sends ranging data to the UWB positioning tag C at time t1; C receives the ranging data at time t2 and sends an acknowledgment signal at time t3; and A receives the acknowledgment at time t4. Likewise, C sends ranging data to A at time t5; A receives it at time t6 and sends an acknowledgment at time t7; and C receives that acknowledgment at time t8.
  • the second distance R2 is obtained in the same way; the specific implementation is similar to this embodiment and is not repeated here.
  • by averaging the signal transmission times, the method of this embodiment for measuring the distance between a base station and the positioning tag obtains a more accurate transmission time and hence a more precise distance measurement, providing a more reliable reference for the subsequent determination of the pet's position and ensuring a better monitoring effect.
  • preferably, determining, as described in Step 3, whether an obstacle unit exists among the grid units within the preset range covered by the shooting angle of the camera with which the robot monitors the pet includes the following steps: determining the direction in which the camera faces the pet to be the shooting direction; determining, from the shooting direction, the shooting area covered by the camera's shooting angle in the grid map; determining the grid units corresponding to the coverage area, in the grid map, of the angular range formed by a first angle edge and a second angle edge extending outward from the camera as vertex; and analyzing whether an obstacle unit exists among the grid units corresponding to that coverage area, wherein the coverage area is smaller than and lies within the shooting area.
  • as shown in FIG. 6, each small square in the figure represents a grid unit; a square marked X is an obstacle unit, and a square with no mark or with another letter is a passed unit.
  • point G is the robot's position, i.e. the position of the camera, and point C is the pet's position.
  • GZ is the shooting direction; the angle formed by the two lines GB1 and GB2 is the shooting angle, and GZ is its bisector.
  • GU1 is the first angle edge and GU2 is the second angle edge; whether any square within ∠U1GU2 is marked X is analyzed. In the figure there is no obstacle unit within ∠U1GU2, so the robot can capture the pet normally; if an X square lay in that range, the camera would be blocked by an obstacle or the pet would be too close to one, and the robot would need to shoot from another angle.
  • the method of this embodiment uses the grid map to judge whether an obstacle unit lies between two position points, and thus whether an obstacle blocks the view between the robot and the pet; it makes full use of data the robot already has, and the judgment process is simple, practical and effective.
  • preferably, determining, as described in Step 4, a preset area centered on the grid unit where the pet is located and taking the passed units in that area one by one, from near to far in distance from the robot, as the pending monitoring unit includes the following steps: determining a circular area centered on the center of the grid unit where the pet is located, with a preset length as its radius; taking the passed unit in the circular area closest to the robot as the pending monitoring unit; if an obstacle unit lies in the straight-line grid path between the pending monitoring unit and the grid unit where the pet is located, and the second-closest passed unit in the circular area is not the farthest from the robot, taking that second-closest passed unit as the pending monitoring unit; if an obstacle unit again lies in the straight-line grid path and the third-closest passed unit is not the farthest from the robot, taking the third-closest passed unit as the pending monitoring unit; and so on.
  • as shown in FIG. 7, each small square represents a grid unit; a square marked X is an obstacle unit, and a square with no mark or with another letter is a passed unit.
  • point G is the robot's position, i.e. the position of the camera, and point C is the pet's position.
  • GZ is the shooting direction, GU1 is the first angle edge and GU2 is the second angle edge. Since there are obstacle units (squares marked X) within ∠U1GU2, the robot's shot may be blocked by obstacles, so the robot must adjust its shooting position.
  • first, a circle is drawn centered on the center of the grid unit containing point C, with the preset length as its radius; the range enclosed by the circle is the preset area.
  • the preset length may be set according to the specific design requirements; preferably it is any value in the range of 1 meter to 2 meters, and in this embodiment it is set to 1.5 meters. Note that the circular area shown in FIG. 7 is only schematic: the circle's radius or diameter cannot be measured against the grid-unit length in the figure, and if the circle encloses only part of a grid unit, that grid unit still falls within the circular area.
  • in the figure, grid unit S1 is the passed unit in the circular area closest to the robot, so it is taken first as the pending monitoring unit; because the straight-line grid path between S1 and C (i.e. the path formed by the grid units crossed by the line connecting S1 and C) contains an obstacle unit marked X, S1 cannot be determined to be the monitoring unit.
  • next, the passed unit S2, second closest to the robot, is analyzed; since the straight-line grid path between S2 and C contains no obstacle unit marked X, i.e. no obstacle blocks the robot from photographing the pet, S2 is determined to be the monitoring unit and the robot is navigated to the passed unit S2 to monitor the pet.
  • if the straight-line grid path between S2 and C also contained an obstacle unit, the passed unit S3, third closest to the robot, would be analyzed next; the method is the same as above and is not repeated here.
  • preferably, controlling the robot, as described in Step 5, to walk from its current position point to the monitoring unit and monitor the pet includes the following steps: searching the grid map from the robot's current position point toward the direction of the monitoring unit; determining, among the grid paths between the robot's current position point and the center point of the monitoring unit that are directly connected through passed units, the one with the shortest length as the navigation grid path; taking the center points of the grid units in the navigation grid path as navigation position points and connecting them to form the navigation path; controlling the robot to walk along the navigation path from its current position point to the monitoring position point; and adjusting the robot's orientation so that its camera's shooting direction is aligned with the direction of the pet.
  • as shown in FIG. 8, the robot must travel from point G to monitoring unit S2 and first needs to search for a walking path; in the figure, a square marked X is an obstacle unit, and a square with no mark or with another letter is a passed unit.
  • first, the grid map is searched from the robot's current position point G toward the direction of the monitoring unit; searching toward the monitoring unit is not limited to searching along the straight line toward it, but takes that direction as the overall search trend, expanding grid unit by grid unit outward from G and then contracting grid unit by grid unit from the surroundings toward the monitoring unit.
  • two grid paths are found: the first connects to the monitoring unit from its lower left, and the second from its upper right, the two paths being separated by obstacle units.
  • since the first grid path is shorter than the second, the first grid path is used as the navigation grid path.
  • the center points of the grid units in the first grid path are taken as navigation position points and connected to form the navigation path, i.e. the dashed line marked L1 (the dashed line marked L2 is the route of the second grid path).
  • the robot is then controlled to walk from point G along route L1 to the center point of monitoring unit S2 (i.e. the monitoring position point); finally, the robot body is rotated in place so that the camera's shooting direction faces point C (i.e. the direction of the pet).
  • by searching the grid map toward the monitoring unit, the method of this embodiment can quickly determine which grid paths reach the monitoring unit; analyzing their lengths and taking the shortest as the navigation path shortens the time for the robot to reach the monitoring unit.
  • with the grid-unit center points as navigation position points, the navigation path formed by connecting them is the best navigation path to the monitoring position point; walking along it, the robot not only reaches its destination sooner but also runs less risk of hitting obstacles on the way, improving the efficiency with which it reaches the monitoring position.
  • preferably, the side length of a grid unit as drawn in the figures of this embodiment equals the diameter of the robot body.
  • the chip of the present invention stores a program used to control a robot to execute the grid-map-based method for monitoring a pet described above.
  • with the chip installed on the robot, the robot can determine the mutual positional relationship between the pet and itself through wireless communication with the wireless signal device on the pet, and then judge whether an obstacle unit exists between the grid units in the grid map where the robot and the pet are located, so as to judge whether an obstacle blocks the view between them. If not, the robot's current position and shooting direction can effectively capture the pet and neither needs to change; if the pet runs, the robot rotates its body to keep the camera facing the pet, and need not walk elsewhere unless an obstacle comes to block the view.
  • if an obstacle unit does exist, the robot may capture an obstacle rather than the pet from its current position, so it reselects the monitoring position by judging the state of the grid units around the pet: within the preset area around the pet, passed units are taken one by one as the pending monitoring unit, from near to far in distance from the robot, and it is judged whether an obstacle unit lies in the straight-line grid path between the pending monitoring unit and the grid unit where the pet is located, i.e. whether the pet can be effectively monitored from that position; if no obstacle blocks the view, the pending monitoring unit is determined to be the monitoring unit, and if one does, the next passed unit is analyzed.
  • by analyzing the passed units in the preset area around the pet one by one, from near to far in distance from the robot, the robot can quickly find the position it can reach fastest from which the pet can be effectively monitored, improving monitoring efficiency.
  • in addition, if every passed unit in the preset area other than the one farthest from the robot is blocked from the pet by obstacles, the farthest passed unit is used as the monitoring unit regardless of whether an obstacle lies between it and the pet.
  • this is because the distribution of obstacles is characteristic: obstacles generally cluster in one or several areas, so if one obstacle unit is detected in an area, other obstacle units are likely to exist there as well.
  • if the robot detects an obstacle at its current position then, within a certain range, the farther an area is from the current position, the smaller the probability that obstacle units appear there; taking the farthest passed unit in the preset area as the monitoring unit therefore places the robot in a relatively open area.
  • when the pet's position changes, the monitoring position or angle can then be adjusted more conveniently without interference from adjacent obstacles, which improves monitoring efficiency.
  • in summary, by monitoring the pet in combination with the grid map, the chip of the present invention can control the robot to find a better monitoring position, avoiding the problem of the view being easily blocked by obstacles and thereby improving the effect of monitoring the pet.

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Remote Sensing (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Environmental Sciences (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Animal Husbandry (AREA)
  • Biophysics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Electromagnetism (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

A method and chip for a robot to monitor a pet based on a grid map. The mutual positional relationship between the pet and the robot is determined through wireless communication between a wireless signal device on the pet and the robot, and it is then judged whether an obstacle unit exists between the grid units in the grid map where the robot and the pet are located. If not, the robot's current position and shooting direction can effectively capture the pet, and neither needs to change. If so, the robot may capture an obstacle rather than the pet when shooting from its current position, so it must reselect the monitoring position point by judging the state of the grid units around the pet. By monitoring the pet in this way, in combination with the grid map, the robot can be controlled to find a better monitoring position, avoiding the problem of the view being easily blocked by obstacles and improving the effect of monitoring the pet.

Description

Method and chip for a robot to monitor a pet based on a grid map

Technical Field
The invention relates to the field of robots, and in particular to a method and a chip for a robot to monitor a pet based on a grid map.
Background Art
Current pet robots can determine a pet's position by communicating with a positioning device worn by the pet, and can thus track the pet and monitor its state through a camera. Existing robot monitoring technology, however, cannot determine the monitoring position well: if, for example, an obstacle lies between the robot and the pet, the monitoring effect may suffer.
Summary of the Invention
To solve the above problem, the present invention provides a method and a chip for a robot to monitor a pet based on a grid map, which can better determine the position from which the robot monitors the pet and thereby achieve a better monitoring effect. The specific technical solutions of the present invention are as follows:
A method for a robot to monitor a pet based on a grid map, comprising the following steps:
Step 1: based on the grid map constructed by the robot, determining the robot's current position point in the grid map and the corresponding grid unit;
Step 2: based on wireless communication between the robot and a wireless signal device on the pet, determining the mutual positional relationship between the pet and the robot, and determining the pet's current position point and the corresponding grid unit according to that relationship;
Step 3: determining whether an obstacle unit exists among the grid units, between the grid unit where the robot is located and the grid unit where the pet is located, within a preset range covered by the shooting angle of the camera with which the robot monitors the pet;
if not, keeping the robot's camera oriented toward the pet's shooting direction and returning to Step 2;
if so, proceeding to Step 4;
Step 4: determining a preset area centered on the grid unit where the pet is located; taking the passed units in the preset area one by one, in order of their distance from the robot from near to far, as the pending monitoring unit, and determining whether there is an obstacle unit in the straight-line grid path between the pending monitoring unit and the grid unit where the pet is located;
if not, determining the pending monitoring unit to be the monitoring unit and proceeding to Step 5;
if so, determining whether the next passed unit is the one farthest from the robot;
if not, returning to Step 4;
if so, directly determining the next passed unit to be the monitoring unit and proceeding to Step 5;
Step 5: controlling the robot to walk from its current position point to the monitoring unit and monitor the pet;
wherein the obstacle unit is the grid unit corresponding to a detected obstacle, and a passed unit is a grid unit the robot has already traveled.
Further, determining, based on the grid map constructed by the robot as described in Step 1, the robot's current position point in the grid map and the corresponding grid unit comprises the following steps:
constructing, from the data the robot detects while walking, a grid map in an XY coordinate system with (X0, Y0) as its origin;
determining the side length of a grid unit in the grid map to be L;
determining, based on the robot's own positioning data, the coordinates of the robot's current position point to be (X1, Y1); the grid coordinates of the corresponding grid unit are then (S11, S12), where S11 = (X1 - X0)/L and S12 = (Y1 - Y0)/L, with S11 and S12 both truncated to integers.
Further, determining, as described in Step 2, the mutual positional relationship between the pet and the robot based on wireless communication between the robot and the wireless signal device on the pet, and determining the pet's current position point and corresponding grid unit according to that relationship, comprises the following steps:
determining the distance between a first UWB positioning base station and a second UWB positioning base station on the robot body to be W;
determining the coordinates of the first UWB positioning base station to be (X11, Y11) and the coordinates of the second UWB positioning base station to be (X12, Y12);
determining, based on wireless communication between the first and second UWB positioning base stations and a UWB positioning tag on the pet, the first distance from the UWB positioning tag to the first UWB positioning base station to be R1 and the second distance from the tag to the second base station to be R2;
determining the first angle α1, formed at the vertex of the second UWB positioning base station by the lines pointing respectively to the first UWB positioning base station and to the UWB positioning tag, as α1 = arccos((W² + R2² - R1²)/(2·W·R2));
determining the second angle α2, formed at the vertex of the first UWB positioning base station by the lines pointing respectively to the second UWB positioning base station and to the UWB positioning tag, as α2 = arccos((W² + R1² - R2²)/(2·W·R1));
determining the coordinates of the UWB positioning tag's current position point to be (Xc, Yc), where Xc = X12 + R2·cos(180° - α1 - arccos((X12 - X11)/W)) and Yc = Y11 + R1·cos(180° - α2 - arcsin((X12 - X11)/W));
determining the grid coordinates of the grid unit corresponding to the UWB positioning tag's current position point to be (S21, S22), where S21 = (Xc - X0)/L and S22 = (Yc - Y0)/L, with S21 and S22 both truncated to integers.
Further, determining the coordinates (X11, Y11) of the first UWB positioning base station and (X12, Y12) of the second UWB positioning base station on the robot body comprises the following steps:
determining the coordinates of the center point of the robot body to be the coordinates of the robot's current position point, namely (X1, Y1);
determining that the center point of the robot body lies at the midpoint of the line connecting the first and second UWB positioning base stations;
determining that the distance between the first and second UWB positioning base stations is W, so that the distance from the center point of the robot body to each of the first and second UWB positioning base stations is W/2;
determining the robot's current heading, as detected by the robot's gyroscope, to be α;
determining the coordinates of the first UWB positioning base station on the robot body to be (X11, Y11), where X11 = X1 - (W·cosα)/2 and Y11 = Y1 + (W·sinα)/2;
determining the coordinates of the second UWB positioning base station on the robot body to be (X12, Y12), where X12 = X1 + (W·cosα)/2 and Y12 = Y1 - (W·sinα)/2.
Further, determining the first distance R1 from the UWB positioning tag to the first UWB positioning base station and the second distance R2 from the UWB positioning tag to the second UWB positioning base station comprises the following steps:
determining the propagation speed of radio waves to be c;
determining the time from when the first UWB positioning base station sends ranging data to the UWB positioning tag until it receives the tag's acknowledgment signal to be T11;
determining the time from when the UWB positioning tag receives the ranging data sent by the first UWB positioning base station until it sends an acknowledgment signal to be T12;
determining the time from when the UWB positioning tag sends ranging data to the first UWB positioning base station until it receives the first base station's acknowledgment signal to be T13;
determining the time from when the first UWB positioning base station receives the ranging data sent by the UWB positioning tag until it sends an acknowledgment signal to be T14;
determining the first distance from the UWB positioning tag to the first UWB positioning base station to be R1, where R1 = c·(T11 - T12 + T13 - T14)/4;
determining the time from when the second UWB positioning base station sends ranging data to the UWB positioning tag until it receives the tag's acknowledgment signal to be T21;
determining the time from when the UWB positioning tag receives the ranging data sent by the second UWB positioning base station until it sends an acknowledgment signal to be T22;
determining the time from when the UWB positioning tag sends ranging data to the second UWB positioning base station until it receives the second base station's acknowledgment signal to be T23;
determining the time from when the second UWB positioning base station receives the ranging data sent by the UWB positioning tag until it sends an acknowledgment signal to be T24;
determining the second distance from the UWB positioning tag to the second UWB positioning base station to be R2, where R2 = c·(T21 - T22 + T23 - T24)/4.
Further, determining, as described in Step 3, whether an obstacle unit exists among the grid units within the preset range covered by the shooting angle of the camera with which the robot monitors the pet comprises the following steps:
determining the direction in which the camera with which the robot monitors the pet faces the pet to be the shooting direction;
determining, based on the shooting direction, the shooting area covered by the camera's shooting angle in the grid map;
determining the grid units corresponding to the coverage area, in the grid map, of the angular range formed by a first angle edge and a second angle edge extending outward from the camera as vertex, wherein the coverage area is smaller than and lies within the shooting area;
analyzing whether an obstacle unit exists among the grid units corresponding to the coverage area.
Further, determining, as described in Step 4, a preset area centered on the grid unit where the pet is located, and taking the passed units in the preset area one by one, from near to far in distance from the robot, as the pending monitoring unit comprises the following steps:
determining a circular area centered on the center of the grid unit where the pet is located, with a preset length as its radius;
taking the passed unit in the circular area closest to the robot as the pending monitoring unit;
if an obstacle unit lies in the straight-line grid path between the pending monitoring unit and the grid unit where the pet is located, and the passed unit in the circular area second closest to the robot is not the farthest from the robot, taking that second-closest passed unit as the pending monitoring unit;
if an obstacle unit lies in the straight-line grid path between the pending monitoring unit and the grid unit where the pet is located, and the passed unit in the circular area third closest to the robot is not the farthest from the robot, taking that third-closest passed unit as the pending monitoring unit;
and so on.
Further, the preset length is any value in the range of 1 meter to 2 meters.
Further, controlling the robot, as described in Step 5, to walk from its current position point to the monitoring unit and monitor the pet comprises the following steps:
searching the grid map from the robot's current position point toward the direction of the monitoring unit;
determining, among the grid paths between the robot's current position point and the center point of the monitoring unit that are directly connected through passed units, the grid path with the shortest length as the navigation grid path;
taking the center points of the grid units in the navigation grid path as navigation position points, and connecting the navigation position points to form the navigation path;
controlling the robot to walk along the navigation path from its current position point to the monitoring position point;
adjusting the robot's orientation so that the shooting direction of the robot's camera is aligned with the direction of the pet.
A chip for storing a program, the program being used to control a robot to execute the above grid-map-based method for a robot to monitor a pet.
The beneficial effects of the present invention are as follows. The mutual positional relationship between the pet and the robot is determined through wireless communication between the wireless signal device on the pet and the robot, and it is then judged whether an obstacle unit exists between the grid units in the grid map where the robot and the pet are located, so as to judge whether an obstacle blocks the view between them. If not, the robot's current position and shooting direction can effectively capture the pet and neither needs to change. If so, the robot may capture an obstacle rather than the pet when shooting from its current position, so it must reselect the monitoring position point by judging the state of the grid units around the pet. Within the preset area around the pet, passed units are taken one by one as the pending monitoring unit, from near to far in distance from the robot, and it is then judged whether an obstacle unit lies in the straight-line grid path between the pending monitoring unit and the grid unit where the pet is located, i.e. whether the pet can be effectively monitored from that position; if no obstacle blocks the view, that pending monitoring unit is determined to be the monitoring unit, and if one does, the next passed unit is analyzed. Analyzing the passed units in the preset area around the pet one by one, from near to far in distance from the robot, finds the position point the robot can reach fastest from which the pet can be effectively monitored, improving the efficiency with which the robot monitors the pet. In addition, if every passed unit in the preset area other than the one farthest from the robot is blocked from the pet by obstacles, the farthest passed unit is used as the monitoring unit regardless of whether an obstacle lies between it and the pet, because the distribution of obstacles is characteristic: obstacles generally cluster in one or several areas, so if one obstacle unit is detected in an area, other obstacle units will also exist there, and if the robot detects an obstacle at its current position then, within a certain range, the farther an area is from the current position, the smaller the probability that obstacle units appear there. Taking the passed unit in the preset area farthest from the robot as the monitoring unit therefore places the robot in a relatively open area, where, when the pet's position changes, the monitoring position or angle can be adjusted more conveniently without interference from adjacent obstacles, improving monitoring efficiency.
In summary, by monitoring the pet in combination with the grid map, the present invention can control the robot to find a better monitoring position, avoiding the problem of the view being easily blocked by obstacles and improving the effect of monitoring the pet.
Brief Description of the Drawings
FIG. 1 is a schematic flowchart of the method for monitoring a pet by a grid-map-based robot according to the present invention.
FIG. 2 is a schematic diagram showing how the coordinates of a position point are converted into the coordinates of a grid unit according to the present invention.
FIG. 3 is a schematic diagram showing the mutual position analysis of the two UWB positioning base stations and the UWB positioning tag according to the present invention.
FIG. 4 is a schematic diagram showing how the coordinates of the two UWB positioning base stations are calculated from the coordinates of the robot's center point according to the present invention.
FIG. 5 is a schematic diagram showing how the distance from the UWB positioning tag to the first UWB positioning base station is measured.
FIG. 6 is a schematic diagram showing the determination of the grid area photographed by the robot.
FIG. 7 is a schematic diagram showing the determination of the monitoring unit.
FIG. 8 is a schematic diagram showing the determination of the robot's navigation path from the current position point to the monitoring position point.
Detailed Description of Embodiments
The specific embodiments of the present invention are further described below with reference to the accompanying drawings.
The robot of the invention is a kind of intelligent household appliance that, with a certain degree of artificial intelligence, can walk automatically in certain settings. Various sensors are provided on the robot body to detect travel distance, travel angle, body state, obstacles and so on; on encountering a wall or other obstacle the robot turns by itself and, according to its settings, follows different planned routes, and it also builds a grid map from the various data detected while walking. The mobile robot of the present invention comprises the following structure: an autonomously walking robot body with driving wheels, a human-computer interaction interface arranged on the body, and an obstacle detection unit arranged on the body. A camera is arranged on the upper face of the middle of the body; the camera can of course also be arranged on the upper face of the front of the body or in another position, in which case the relevant values simply need to be adjusted, relative to a camera mounted at the middle, when the related parameters are calculated. An inertial sensor, including an accelerometer and a gyroscope, is arranged inside the body; the driving wheels are provided with an odometer (generally a code disc) for detecting their travel distance; and a control module is provided that can process the parameters of the relevant sensors and output control signals to the actuating components.
As shown in FIG. 1, the grid-map-based method for the robot to monitor a pet includes the following steps. Step 1: based on the grid map constructed by the robot, determine the robot's current position point in the grid map and the corresponding grid unit. Step 2: based on wireless communication between the robot and the wireless signal device on the pet, determine the mutual positional relationship between the pet and the robot, and from it determine the pet's current position point and the corresponding grid unit. Step 3: determine whether an obstacle unit exists among the grid units, between the grid unit where the robot is located and the grid unit where the pet is located, within the preset range covered by the shooting angle of the camera with which the robot monitors the pet; if not, keep the robot's camera oriented toward the pet and return to Step 2; if so, proceed to Step 4. Step 4: determine a preset area centered on the grid unit where the pet is located; take the passed units in the preset area one by one, from near to far in distance from the robot, as the pending monitoring unit, and judge whether there is an obstacle unit in the straight-line grid path between the pending monitoring unit and the grid unit where the pet is located; if not, determine the pending monitoring unit to be the monitoring unit and proceed to Step 5; if so, judge whether the next passed unit is the one farthest from the robot; if not, return to Step 4; if so, directly determine the next passed unit to be the monitoring unit and proceed to Step 5. Step 5: control the robot to walk from its current position point to the monitoring unit and monitor the pet. Here the grid map is a map, with grid units as its basic element, that the robot constructs while walking from the data detected by its various sensors. A grid unit is a virtual cell of set length and width, and may be square or rectangular; preferably, the grid unit of the present invention is a square cell with a side length of 0.2 meters. The wireless signal device may be a zigbee communication module, an ultrasonic module, a radio-frequency communication module, a UWB (ultra-wideband) module, a wifi module or the like, selected according to the product's requirements. The preset range may likewise be set according to the requirements of the product design; preferably, it is set to one third of the full range covered by the shooting angle of the camera. The preset area may also be set according to the requirements of the product design; preferably, it may be a circular, square or regular polygonal area, its size generally within 2 to 6 square meters.
While walking, the robot marks the grid units it has traveled as passed units, marks the grid unit corresponding to a detected obstacle as an obstacle unit, marks the grid unit corresponding to a detected cliff as a cliff unit, and so on, and updates the grid map with the marked information. The method of the present invention determines the mutual positional relationship between the pet and the robot through wireless communication between the wireless signal device on the pet and the robot, and then judges whether an obstacle unit exists between the grid units in the grid map where the robot and the pet are located, so as to judge whether an obstacle blocks the view between the robot and the pet. If not, the robot's current position and shooting direction can effectively capture the pet and neither needs to change; if the pet runs, the robot rotates its body to keep the camera facing the pet, and during this process the robot need not walk to any other position unless an obstacle comes to block the view. If so, the robot may capture an obstacle rather than the pet when shooting from its current position, so it must reselect the monitoring position point by judging the state of the grid units around the pet. Within the preset area around the pet, passed units are taken one by one as the pending monitoring unit, from near to far in distance from the robot, and it is then judged whether an obstacle unit lies in the straight-line grid path between the pending monitoring unit and the grid unit where the pet is located, i.e. whether the pet can be effectively monitored from the pending monitoring unit's position; if no obstacle blocks the view, that pending monitoring unit is determined to be the monitoring unit, and if one does, the next passed unit is analyzed. By analyzing the passed units in the preset area around the pet one by one, from near to far in distance from the robot, the robot can find the position point it can reach fastest from which the pet can be effectively monitored, improving the efficiency with which the robot monitors the pet. In addition, if every passed unit in the preset area other than the one farthest from the robot is blocked from the pet by obstacles, the farthest passed unit is used as the monitoring unit regardless of whether an obstacle lies between it and the pet, because the distribution of obstacles is characteristic: obstacles generally cluster in one or several areas, so if one obstacle unit is detected in an area, other obstacle units will also exist there, and if the robot detects an obstacle at its current position then, within a certain range, the farther an area is from the current position, the smaller the probability that obstacle units appear there. Taking the passed unit in the preset area farthest from the robot as the monitoring unit therefore places the robot in a relatively open area, where, when the pet's position changes, the monitoring position or angle can be adjusted more conveniently without interference from adjacent obstacles, improving monitoring efficiency. In summary, by monitoring the pet in combination with the grid map, the method of the present invention can control the robot to find a better monitoring position, avoiding the problem of the view being easily blocked by obstacles and improving the effect of monitoring the pet.
Preferably, determining, as described in Step 1, the robot's current position point in the grid map and the corresponding grid unit based on the grid map constructed by the robot includes the following steps: constructing, from the data the robot detects while walking, a grid map in an XY coordinate system with (X0, Y0) as origin; determining the side length of a grid unit in the grid map to be L; and determining, from the robot's own positioning data, the coordinates of the robot's current position point to be (X1, Y1), so that the grid coordinates of the corresponding grid unit are (S11, S12), where S11 = (X1 - X0)/L and S12 = (Y1 - Y0)/L, both truncated to integers. While walking, the robot records the path it has traveled from the data detected by its own odometer and gyroscope and determines its position and heading (i.e. its positioning data) in real time. The grid map is composed of grid units as its basic element, and each grid unit contains many position points; the robot walks in terms of position points, moving from the current position point to an adjacent next position point. Therefore, to determine the coordinates of the grid unit the robot currently occupies, the coordinates of the current position point must be converted into grid-unit coordinates. As shown in FIG. 2, each small square represents a grid unit with side length L = 0.2 meters; the coordinate origin P is at (X0 = 0, Y0 = 0), and the grid unit at P's upper right has grid coordinates (0, 0). When the robot is at position point D, its detected coordinates are (0.5, 0.3), so the grid coordinates of the grid unit occupied by the robot are computed as (S11 = (0.5 - 0)/0.2, S12 = (0.3 - 0)/0.2), i.e. (S11 = 2.5, S12 = 1.5), which truncate to (S11 = 2, S12 = 1); hence, when the robot is at position point D, the corresponding grid unit has grid coordinates (2, 1). Within the same coordinate system, the method of this embodiment accurately computes the grid coordinates of the grid unit corresponding to the current position point from the positional relationship between the current position point and the coordinate origin together with the grid unit's side length, providing reliable data for subsequent processing and improving the accuracy of the data analysis.
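For readers who want to experiment with this conversion, the following is a minimal Python sketch of it; the values X0, Y0 and L mirror the example above, and the function name to_grid is illustrative, not from the patent.

```python
# A minimal sketch of the position-to-grid conversion from this embodiment.
# Assumed values: origin (X0, Y0) = (0, 0) and cell side L = 0.2 m, as in FIG. 2.
X0, Y0 = 0.0, 0.0   # grid map origin
L = 0.2             # grid unit side length in meters

def to_grid(x: float, y: float) -> tuple[int, int]:
    """Convert a position point (x, y) to the grid coordinates (S1, S2)
    of the cell that contains it, truncating to integers as the text does."""
    s1 = int((x - X0) / L)
    s2 = int((y - Y0) / L)
    return s1, s2

# Worked example from the text: point D = (0.5, 0.3) falls in cell (2, 1).
assert to_grid(0.5, 0.3) == (2, 1)
```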
Preferably, determining, as described in Step 2, the mutual positional relationship between the pet and the robot through wireless communication between the wireless signal devices on the robot and the pet, and from it the pet's current position point and corresponding grid unit, includes the following steps: determining the distance between the first UWB positioning base station and the second UWB positioning base station on the robot body to be W; determining the coordinates of the first UWB positioning base station to be (X11, Y11) and those of the second to be (X12, Y12); determining, from the wireless communication between the two base stations and the UWB positioning tag on the pet, the first distance from the tag to the first base station to be R1 and the second distance from the tag to the second base station to be R2; determining the first angle α1 = arccos((W² + R2² - R1²)/(2·W·R2)); determining the second angle α2 = arccos((W² + R1² - R2²)/(2·W·R1)); determining the coordinates of the tag's current position point to be (Xc, Yc), where Xc = X12 + R2·cos(180° - α1 - arccos((X12 - X11)/W)) and Yc = Y11 + R1·cos(180° - α2 - arcsin((X12 - X11)/W)); and determining the grid coordinates of the grid unit corresponding to the tag's current position point to be (S21, S22), where S21 = (Xc - X0)/L and S22 = (Yc - Y0)/L, both truncated to integers. UWB (Ultra-Wideband) is a carrierless ultra-wideband communication technology; the UWB positioning tag and UWB positioning base stations are communication devices using UWB technology. As shown in FIG. 3, A is the first UWB positioning base station, B is the second UWB positioning base station, and C is the UWB positioning tag. The two base stations are both mounted on the robot body, and the tag is worn by the pet. Since the distance between the two base stations is fixed when the robot is designed and produced, it is known, i.e. AB = W, and the relevant data are recorded in the system; the value of W can be set according to the specific product design and should be smaller than the diameter of the robot body. Further, the measured distance between the first base station and the tag is AC = R1, and between the second base station and the tag BC = R2; from the three side lengths of the triangle, the first angle (∠ABC) is obtained as α1 = arccos((W² + R2² - R1²)/(2·W·R2)), and likewise the second angle (∠CAB) as α2 = arccos((W² + R1² - R2²)/(2·W·R1)). Since the robot can determine its own coordinate position (i.e. the coordinates of its center point) from the detection data of sensors such as the odometer and gyroscope, the coordinates of the two base stations, whose positions on the robot body are fixed relative to the center point, can also be determined, i.e. the first base station is at (X11, Y11) and the second at (X12, Y12); the specific calculation is described in a later embodiment. As the figure shows, to determine the X coordinate of point C one needs the length c11 or c21, where c11 = R1·sin∠a2 and c21 = R2·cos∠b2, with ∠a2 = 180° - α2 - ∠a1, ∠b2 = 180° - α1 - ∠b1, ∠a1 = arcsin((X12 - X11)/W) and ∠b1 = arccos((X12 - X11)/W); since α1 and α2 are already known, c11 = R1·sin(180° - α2 - arcsin((X12 - X11)/W)) and c21 = R2·cos(180° - α1 - arcsin((Y11 - Y12)/W)), so the X coordinate of pet point C is Xc = X12 + c21 = X12 + R2·cos(180° - α1 - arccos((X12 - X11)/W)), or Xc = X11 + c11 = X11 + R1·sin(180° - α2 - arcsin((X12 - X11)/W)). Similarly, to determine the Y coordinate of point C one needs the length c12 or c22, where c12 = R1·cos∠a2 and c22 = R2·sin∠b2, with the same angles as above, giving c12 = R1·cos(180° - α2 - arcsin((X12 - X11)/W)) and c22 = R2·sin(180° - α1 - arccos((X12 - X11)/W)); the Y coordinate of pet point C is then Yc = Y11 + c12 = Y11 + R1·cos(180° - α2 - arcsin((X12 - X11)/W)), or Yc = Y12 + c22 = Y12 + R2·sin(180° - α1 - arccos((X12 - X11)/W)). Once the coordinates (Xc, Yc) of the tag's current position point are determined, the grid coordinates (S21, S22) of the corresponding grid unit are calculated as in the preceding embodiment: S21 = (Xc - X0)/L and S22 = (Yc - Y0)/L, both truncated to integers. The method of this embodiment applies when the height of the UWB positioning tag worn by the pet is consistent with that of the robot's UWB positioning base stations (i.e. the three communication devices lie in the same horizontal plane) or differs only slightly. When the positions of the robot and the pet change, substituting the newly detected parameters quickly yields the pet's position point and corresponding grid coordinates; data processing is fast and the output is accurate. If the height difference between the tag and the base stations is large, a third UWB positioning base station must be provided on the robot body, and the three-dimensional coordinates of the tag are determined by introducing a height parameter, from which the corresponding grid coordinates follow; the specific implementation has the same principle as this embodiment and is not repeated here. Compared with other existing positioning methods, locating the pet through UWB communication offers a larger positioning range, higher precision and better stability.
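To make the triangulation concrete, here is a minimal Python sketch of the (Xc, Yc) computation, directly transcribing the embodiment's formulas. It assumes the planar, same-height arrangement of FIG. 3 and FIG. 4 (tag forward of the baseline AB); other configurations would need the quadrant signs handled explicitly. The function name and argument order are illustrative, not from the patent.

```python
import math

def uwb_tag_position(x11, y11, x12, y12, r1, r2):
    """Planar triangulation of the tag C from base stations A=(x11,y11),
    B=(x12,y12) and measured ranges R1=AC, R2=BC, per the embodiment's
    formulas (angles from the law of cosines, then projection)."""
    w = math.hypot(x12 - x11, y12 - y11)                # baseline AB = W
    a1 = math.acos((w*w + r2*r2 - r1*r1) / (2*w*r2))    # first angle, at B
    a2 = math.acos((w*w + r1*r1 - r2*r2) / (2*w*r1))    # second angle, at A
    b1 = math.acos((x12 - x11) / w)                     # arccos((X12-X11)/W)
    s1 = math.asin((x12 - x11) / w)                     # arcsin((X12-X11)/W)
    xc = x12 + r2 * math.cos(math.pi - a1 - b1)
    yc = y11 + r1 * math.cos(math.pi - a2 - s1)
    return xc, yc

# Quick check in the FIG. 4 pose: A=(0,0), B=(2,0), tag at (0.5, 1.0).
r1, r2 = math.hypot(0.5, 1.0), math.hypot(1.5, 1.0)
print(uwb_tag_position(0.0, 0.0, 2.0, 0.0, r1, r2))     # ~(0.5, 1.0)
```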
Preferably, determining the coordinates (X11, Y11) of the first UWB positioning base station and (X12, Y12) of the second UWB positioning base station on the robot body includes the following steps: determining the coordinates of the center point of the robot body to be the coordinates (X1, Y1) of the robot's current position point; determining that the center point of the robot body lies at the midpoint of the line connecting the two base stations; determining that, the distance between the two base stations being W, the distance from the center point to each base station is W/2; determining the robot's current heading, detected by the robot's gyroscope, to be α; determining the coordinates of the first base station to be (X11, Y11), where X11 = X1 - (W·cosα)/2 and Y11 = Y1 + (W·sinα)/2; and determining the coordinates of the second base station to be (X12, Y12), where X12 = X1 + (W·cosα)/2 and Y12 = Y1 - (W·sinα)/2. As shown in FIG. 4, the first UWB positioning base station A and the second UWB positioning base station B are arranged at the two ends of the robot body, the line AB passing exactly through the robot's center point G, with AG = BG = W/2. Since the coordinates of G are (X1, Y1) and the robot's current heading angle is α (in the figure, the arrowed line through G indicates the current heading and intersects line AB at a right angle), it follows that ∠a = ∠b = α. To obtain the X coordinate X11 of the first base station, the distance between X11 and X1 is found first: X1 - X11 = AG·cos a = (W·cosα)/2, so X11 = X1 - (W·cosα)/2. To obtain the Y coordinate Y11, the distance between Y11 and Y1 is found: Y11 - Y1 = AG·sin a = (W·sinα)/2, so Y11 = Y1 + (W·sinα)/2. Similarly, for the second base station, X12 - X1 = BG·cos b = (W·cosα)/2, so X12 = X1 + (W·cosα)/2, and Y1 - Y12 = GB·sin b = (W·sinα)/2, so Y12 = Y1 - (W·sinα)/2. By constraining the base stations to positions on the robot body symmetric about the center point, the method of this embodiment for determining the coordinates of the two base stations simplifies the coordinate algorithm, improving the system's data processing speed and yielding the two base stations' coordinate values quickly and accurately as a reference for subsequent data processing. Likewise, if three base stations are provided, placing the third on the perpendicular bisector of AB simplifies the algorithm and improves processing speed; the specific implementation follows the same principle as this embodiment and is not repeated here.
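The base-station coordinates can be transcribed just as directly. A small sketch under the same assumptions (heading in radians; the name base_station_coords is illustrative):

```python
import math

def base_station_coords(x1, y1, alpha, w):
    """Positions of the two UWB base stations per this embodiment: they sit
    symmetrically about the robot center (x1, y1), a baseline W apart and
    perpendicular to the heading alpha (radians), as in FIG. 4."""
    x11 = x1 - (w * math.cos(alpha)) / 2      # first base station A
    y11 = y1 + (w * math.sin(alpha)) / 2
    x12 = x1 + (w * math.cos(alpha)) / 2      # second base station B
    y12 = y1 - (w * math.sin(alpha)) / 2
    return (x11, y11), (x12, y12)

# Example: center at the origin, heading 90 degrees, baseline W = 0.2 m.
print(base_station_coords(0.0, 0.0, math.radians(90), 0.2))
# approximately ((0.0, 0.1), (0.0, -0.1)), up to floating-point rounding
```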
Preferably, determining the first distance R1 from the UWB positioning tag to the first UWB positioning base station and the second distance R2 from the tag to the second base station includes the following steps: determining the propagation speed of radio waves to be c; determining the time T11 from when the first base station sends ranging data to the tag until it receives the tag's acknowledgment signal; the time T12 from when the tag receives that ranging data until it sends the acknowledgment; the time T13 from when the tag sends ranging data to the first base station until it receives the first base station's acknowledgment; and the time T14 from when the first base station receives that ranging data until it sends the acknowledgment; then R1 = c·(T11 - T12 + T13 - T14)/4; and determining likewise the times T21, T22, T23 and T24 for the second base station, so that R2 = c·(T21 - T22 + T23 - T24)/4. As shown in FIG. 5, the first UWB positioning base station A sends ranging data to the UWB positioning tag C at time t1; C receives it at time t2 and sends an acknowledgment at time t3; A receives that acknowledgment at time t4. The time A needs from sending the ranging data to receiving the acknowledgment is thus T1 = t4 - t1, and the time C needs from receiving the ranging data to sending the acknowledgment is T2 = t3 - t2, so for one round trip between A and C the signal transmission time is T1 - T2 = t4 - t1 - t3 + t2. Likewise, C sends ranging data to A at time t5; A receives it at time t6 and sends an acknowledgment at time t7; C receives that acknowledgment at time t8. The time C needs from sending to receiving is T3 = t8 - t5, and the time A needs from receiving to acknowledging is T4 = t7 - t6, so for this round trip the signal transmission time is T3 - T4 = t8 - t5 - t7 + t6. To ensure accuracy, one quarter of (T1 - T2 + T3 - T4) is taken as the one-way transmission time between C and A. Since the data signal travels at the speed c of radio waves, distance = speed × time gives the first distance from the tag to the first base station as R1 = c·(T11 - T12 + T13 - T14)/4. In the same way, the second distance from the tag to the second base station is R2 = c·(T21 - T22 + T23 - T24)/4; the specific implementation is similar to this embodiment and is not repeated here. By averaging the signal transmission times, the distance-measurement method of this embodiment obtains a more accurate transmission time and hence a more precise distance result, providing a more reliable reference for the subsequent determination of the pet's position and ensuring a better monitoring effect.
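The symmetric two-way ranging above reduces to one line of arithmetic on the eight timestamps. A hedged sketch (timestamps in seconds; the function name is illustrative):

```python
SPEED_OF_LIGHT = 299_792_458.0  # radio propagation speed c, m/s

def twr_distance(t1, t2, t3, t4, t5, t6, t7, t8):
    """Sketch of the embodiment's symmetric two-way ranging between a base
    station and the tag, using the timestamps of FIG. 5: the base sends at
    t1, the tag receives at t2 and acks at t3, the base receives at t4;
    the tag sends at t5, the base receives at t6 and acks at t7, the tag
    receives at t8. The one-way flight time is the average
    ((t4-t1) - (t3-t2) + (t8-t5) - (t7-t6)) / 4."""
    flight = ((t4 - t1) - (t3 - t2) + (t8 - t5) - (t7 - t6)) / 4.0
    return SPEED_OF_LIGHT * flight

# Example: a true flight time of 10 ns and a 1 microsecond reply delay on
# each side recover a distance of about 3 m.
tof, reply = 10e-9, 1e-6
d = twr_distance(0.0, tof, tof + reply, 2*tof + reply,
                 5e-6, 5e-6 + tof, 5e-6 + tof + reply, 5e-6 + 2*tof + reply)
print(round(d, 3))  # ~2.998
```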
Preferably, in step three, determining whether an obstacle cell exists among the grid cells, between the grid cell occupied by the robot and the grid cell occupied by the pet, within the preset range covered by the shooting angle of the camera with which the robot monitors the pet, comprises the following steps: determining the direction in which the robot's pet-monitoring camera faces the pet as the shooting direction; based on the shooting direction, determining the shooting region covered in the grid map by the camera's shooting angle; determining the grid cells corresponding to the coverage region, in the grid map, of the angular range formed by a first angle side and a second angle side extending outward from the camera as the vertex; and analyzing whether an obstacle cell exists among the grid cells corresponding to that coverage region, wherein the coverage region is smaller than and lies within the shooting region. As shown in Figure 6, each small square in the figure represents one grid cell; a square marked X is an obstacle cell, and an unmarked square or a square marked with another letter is a traversed cell. Point G is the robot's position point, i.e. the camera's position, and point C is the pet's position point. GZ is the shooting direction; the angle formed by the lines GB1 and GB2 is the shooting angle, with GZ as its bisector. GU1 is the first angle side and GU2 the second; whether an obstacle cell exists among the grid cells within the angle formed by GU1 and GU2 is analyzed, i.e. it is determined whether any square within ∠U1GU2 is marked X: if so, an obstacle cell exists; if not, none does. In the figure there is no obstacle cell within ∠U1GU2, so the robot can photograph the pet normally. If there were an X square within ∠U1GU2, the robot's camera would be blocked by an obstacle, or the photographed pet would be so close to an obstacle that the shot would be degraded, and the pet would have to be photographed from another angle. By combining the grid map to determine whether obstacle cells lie between two position points, the method of this embodiment determines whether an obstacle blocks the line between robot and pet; this approach makes full use of the robot's existing data, and the determination process is simple, practical, and markedly effective.
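One way to realize the check described above is to sweep the grid cells inside the wedge ∠U1GU2 and test them for obstacle marks. The sketch below is an illustrative implementation rather than the disclosure's own procedure; it assumes the wedge reaches only as far as the pet and treats each cell by its integer grid coordinates:

```python
import math

def obstacle_in_view(grid, robot_cell, pet_cell, half_angle):
    """Return True if any obstacle cell ('X') lies in the wedge with its
    vertex at the robot/camera cell, its axis toward the pet cell, a
    half-width of half_angle (radians), and a radius equal to the
    robot-pet distance. grid is a list of strings; grid[y][x] is the mark."""
    gx, gy = robot_cell
    cx, cy = pet_cell
    axis = math.atan2(cy - gy, cx - gx)
    reach = math.hypot(cx - gx, cy - gy)
    for y, row in enumerate(grid):
        for x, mark in enumerate(row):
            if mark != 'X' or (x, y) == (gx, gy):
                continue
            if math.hypot(x - gx, y - gy) > reach:
                continue  # beyond the pet, cannot block the shot
            bearing = math.atan2(y - gy, x - gx)
            diff = abs((bearing - axis + math.pi) % (2 * math.pi) - math.pi)
            if diff <= half_angle:
                return True
    return False
```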
Preferably, in step four, determining a preset area centered on the grid cell occupied by the pet and, according to the near-to-far distance relationship between the traversed cells in the preset area and the robot, taking the traversed cells one by one as the candidate monitoring cell, comprises the following steps: determining a circular area whose center is the center of the grid cell occupied by the pet and whose radius is a preset length; determining the traversed cell in the circular area closest to the robot as the candidate monitoring cell; if the straight grid path between the candidate monitoring cell and the pet's grid cell contains an obstacle cell, and the traversed cell in the circular area second closest to the robot is not the farthest from the robot, determining that second-closest traversed cell as the candidate monitoring cell; if the straight grid path between the candidate monitoring cell and the pet's grid cell contains an obstacle cell, and the traversed cell in the circular area third closest to the robot is not the farthest from the robot, determining that third-closest traversed cell as the candidate monitoring cell; and so on. As shown in Figure 7, each small square represents one grid cell; a square marked X is an obstacle cell, and an unmarked square or a square marked with another letter is a traversed cell. Point G is the robot's position point, i.e. the camera's position, and point C is the pet's position point. GZ is the shooting direction, GU1 the first angle side, and GU2 the second angle side. Because there are obstacle cells (squares marked X) within ∠U1GU2, the robot's shot may be blocked by an obstacle, so the robot needs to adjust its shooting position. First, a circle is drawn with the center of the grid cell containing point C as its center and the preset length as its radius; the range enclosed by this circle is the preset area. The preset length may be set according to the specific design requirements; preferably it is any value in the range of 1 meter to 2 meters, and in this embodiment it is set to 1.5 meters. Note that the circular area shown in Figure 7 is only a schematic, and the circle's radius or diameter cannot be measured against the side length of the grid cells in the figure; moreover, if the circular area encloses only part of a grid cell, that grid cell still falls within the circular area. In the figure, grid cell S1 is the traversed cell in the circular area closest to the robot, so it is taken first as the candidate monitoring cell; since the straight grid path between S1 and C (i.e. the path formed by the grid cells crossed by the straight line connecting S1 and C) contains an obstacle cell marked X, S1 cannot be confirmed as the monitoring cell. Next, the traversed cell S2 second closest to the robot is analyzed; since S2 is not the traversed cell in the circular area farthest from the robot, S2 is taken as the candidate monitoring cell. The straight grid path between S2 and C contains no obstacle cell marked X, i.e. there is no obstacle blocking the robot from photographing the pet, so S2 is confirmed as the monitoring cell, and the robot is navigated to the traversed cell S2 to monitor the pet. If the straight grid path between S2 and C also contained an obstacle cell, the traversed cell S3 third closest to the robot would be analyzed next, in the same way as above, which is not repeated here. When the straight grid paths between the pet and all traversed cells in the circular area other than S10 contain obstacle cells, the pet is in a position surrounded by obstacles (for example on a sofa, a tea table, or a bed); in that case the obstacle cells between the traversed cells and the pet in the preset area need no longer be considered, and what matters is a position point relatively far from the pet and away from the robot's current position, so the traversed cell S10 farthest from the robot can be taken directly as the monitoring cell and the robot navigated there to monitor the pet. Analyzing the traversed cells in the preset area around the pet one by one, from near to far relative to the robot, finds the position point that the robot can reach fastest while still monitoring the pet effectively, thereby raising the efficiency of the robot's pet monitoring. A sketch of this selection procedure follows below.
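The near-to-far scan, with its line-of-sight test along the straight grid path and the fall-back to the farthest traversed cell, can be sketched as follows; the Bresenham-style line walk is one plausible reading of "the grid cells crossed by the straight line", and the names line_cells and pick_monitoring_cell are illustrative:

```python
import math

def line_cells(a, b):
    """Grid cells crossed by the straight segment from cell a to cell b
    (a simple Bresenham walk; supercover tracing would be stricter)."""
    (x0, y0), (x1, y1) = a, b
    cells, dx, dy = [], abs(x1 - x0), abs(y1 - y0)
    sx, sy = (1 if x1 > x0 else -1), (1 if y1 > y0 else -1)
    err, x, y = dx - dy, x0, y0
    while True:
        cells.append((x, y))
        if (x, y) == (x1, y1):
            return cells
        e2 = 2 * err
        if e2 > -dy: err -= dy; x += sx
        if e2 < dx:  err += dx; y += sy

def pick_monitoring_cell(traversed, obstacles, robot_cell, pet_cell, radius):
    """Scan traversed cells within `radius` of the pet, nearest to the
    robot first; take the first one whose straight grid path to the pet
    is obstacle-free, else fall back to the farthest candidate (chosen
    regardless of obstacles, as in the embodiment)."""
    candidates = [c for c in traversed if math.dist(c, pet_cell) <= radius]
    candidates.sort(key=lambda c: math.dist(c, robot_cell))
    for cell in candidates[:-1]:
        if not any(p in obstacles for p in line_cells(cell, pet_cell)):
            return cell
    return candidates[-1] if candidates else None
```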
Preferably, in step five, controlling the robot to walk from its current position point to the monitoring cell and monitor the pet comprises the following steps: starting from the robot's current position point, searching the grid map toward the direction of the monitoring cell; among the grid paths between the robot's current position point and the center point of the monitoring cell that are directly connected through traversed cells, determining the grid path with the shortest path length as the navigation grid path; determining the center points of the grid cells in the navigation grid path as navigation position points, and connecting the navigation position points to form the navigation path; controlling the robot to walk from its current position point along the navigation path to the monitoring position point; and adjusting the robot's orientation so that the shooting direction of the robot's camera is aimed at the direction in which the pet is located. As shown in Figure 8, for the robot to walk from point G to the monitoring cell S2, a walking path must first be searched; in the figure, a square marked X is an obstacle cell, and an unmarked square or a square marked with another letter is a traversed cell. First, starting from the robot's current position point G, the grid map is searched toward the direction of the monitoring cell; searching toward the monitoring cell is not limited to searching along the straight line toward it, but takes that direction as the general search tendency, expanding cell by cell outward from G and then contracting cell by cell from the surroundings toward the monitoring cell. Two grid paths are then found: the first connects to the monitoring cell from its lower left, the second from its upper right, the two being separated by obstacle cells. Since the first grid path is shorter than the second, the first is taken as the navigation grid path. The center points of the grid cells in the first grid path are taken as navigation position points, and connecting them forms the navigation path, i.e. the dashed line marked L1 (the dashed line marked L2 is the route of the second grid path). Next, the robot is controlled to walk from point G along route L1 to the center point of the monitoring cell S2 (i.e. the monitoring position point). Finally, the robot body is rotated in place so that the camera's shooting direction faces the direction of point C (i.e. the direction of the pet). By searching the grid map toward the monitoring cell, the method of this embodiment quickly determines which grid paths reach the monitoring cell; taking the shortest of them as the navigation path shortens the time for the robot to reach the monitoring cell; and taking the grid cell center points as navigation position points makes the path formed by connecting them the best navigation path to the monitoring position point, so that by following it the robot not only reaches its destination sooner but also lowers the risk of hitting obstacles along the way, raising the efficiency with which it reaches the monitoring position point. Preferably, the side length of a grid cell shown in the figures of this embodiment equals the diameter of the robot body.
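As an illustrative stand-in for the search described above, a breadth-first search over traversed cells yields a shortest grid path among those directly connected through traversed cells, whose cell centers then serve as the navigation position points. This is a sketch under that assumption, not the disclosure's exact search procedure, and the names are illustrative:

```python
from collections import deque

def shortest_grid_path(traversed, start, goal):
    """Breadth-first search over traversed cells (4-connected); returns
    the cell sequence from start to goal, or None if disconnected."""
    frontier, prev = deque([start]), {start: None}
    while frontier:
        cur = frontier.popleft()
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = prev[cur]
            return path[::-1]
        x, y = cur
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nxt in traversed and nxt not in prev:
                prev[nxt] = cur
                frontier.append(nxt)
    return None

def waypoints(path, x0, y0, l):
    """Navigation position points: the center of each cell on the path,
    for a grid with origin (x0, y0) and cell side length l."""
    return [(x0 + (sx + 0.5) * l, y0 + (sy + 0.5) * l) for sx, sy in path]
```

Breadth-first search on a uniform grid returns a minimum-hop path, which matches the embodiment's preference for the shortest of the candidate grid paths.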
The chip according to the present invention is used to store a program, and the program is used to control a robot to execute the above grid-map-based method for monitoring a pet. With the chip installed in the robot, the robot can communicate wirelessly with the wireless signal device on the pet to determine the relative position relationship between pet and robot, and then determine whether obstacle cells exist between the grid cells that the robot and the pet occupy in the grid map, thereby determining whether an obstacle blocks the line between them. If not, the robot's current position and shooting direction can capture the pet effectively and need not change; if the pet runs about, the robot rotates its body to keep the camera facing the pet, and during this process it need not walk to another position unless an obstacle comes to block the view. If so, shooting from the current position may capture the obstacle rather than the pet, so the robot must reselect a monitoring position point by examining the states of the grid cells around the pet. Within the preset area around the pet, the traversed cells are taken one by one as the candidate monitoring cell in order of their distance from the robot, from near to far, and it is then determined whether the straight grid path between the candidate monitoring cell and the pet's grid cell contains an obstacle cell, i.e. whether the pet can be monitored effectively from the candidate's position; if there is no obstruction, that candidate is confirmed as the monitoring cell, and if there is, the next traversed cell is analyzed. Analyzing the traversed cells in the preset area around the pet one by one, from near to far relative to the robot, finds the position point that the robot can reach fastest while still monitoring the pet effectively, thereby raising the efficiency of the robot's pet monitoring. Moreover, if every traversed cell in the preset area other than the one farthest from the robot is blocked from the pet by obstacles, the farthest traversed cell is taken as the monitoring cell regardless of whether obstacles lie between it and the pet, because obstacles are generally distributed with a pattern: they tend to cluster in one or a few regions, so once one obstacle cell is detected in a region, other obstacle cells will exist there too, and if the robot detects an obstacle at its current position then, within a certain range, the farther a region is from that position, the lower the probability of obstacle cells appearing there. Taking the traversed cell in the preset area farthest from the robot as the monitoring cell therefore places the robot in a relatively open region, where, when the pet's position changes, the monitoring position or angle can be adjusted more conveniently without renewed interference from adjacent obstacles, improving monitoring efficiency. In summary, by monitoring the pet in combination with the grid map in this way, the chip according to the present invention can control the robot to find a good monitoring position, avoiding the problem of the view being easily blocked by obstacles and degrading the monitoring, and improving the pet-monitoring effect.
The above embodiments are provided for full disclosure only and do not limit the present invention; any substitution of equivalent technical features that is based on the inventive concept of the present invention and involves no creative effort shall be regarded as falling within the scope disclosed by this application.

Claims (10)

  1. A method for monitoring a pet by a robot based on a grid map, characterized by comprising the following steps:
    Step one: based on the grid map constructed by the robot, determining the robot's current position point in the grid map and the corresponding grid cell;
    Step two: based on the wireless communication between the robot and a wireless signal device on the pet, determining the relative position relationship between the pet and the robot, and determining the pet's current position point and the corresponding grid cell according to that relationship;
    Step three: determining whether an obstacle cell exists among the grid cells, between the grid cell occupied by the robot and the grid cell occupied by the pet, within the preset range covered by the shooting angle of the camera with which the robot monitors the pet;
    if not, keeping the robot's camera in the shooting direction facing the pet, and returning to step two;
    if so, proceeding to step four;
    Step four: determining a preset area centered on the grid cell occupied by the pet, taking the traversed cells one by one as the candidate monitoring cell according to the near-to-far distance relationship between the traversed cells in the preset area and the robot, and determining whether the straight grid path between the candidate monitoring cell and the pet's grid cell contains an obstacle cell;
    if not, determining the candidate monitoring cell as the monitoring cell, and proceeding to step five;
    if so, determining whether the next traversed cell is the farthest from the robot;
    if not, returning to step four;
    if so, directly determining the next traversed cell as the monitoring cell, and proceeding to step five;
    Step five: controlling the robot to walk from its current position point to the monitoring cell and monitor the pet;
    wherein an obstacle cell is a grid cell corresponding to a position where the robot has detected an obstacle, and a traversed cell is a grid cell the robot has already walked through.
  2. The method according to claim 1, characterized in that, in step one, determining the robot's current position point in the grid map and the corresponding grid cell based on the grid map constructed by the robot comprises the following steps:
    constructing, from the data detected by the robot while walking, a grid map based on an XY coordinate system with (X0, Y0) as the origin;
    determining the side length of a grid cell in the grid map as L;
    based on the robot's own positioning data, determining the coordinates of the robot's current position point as (X1, Y1), so that the grid coordinates of the corresponding grid cell are (S11, S12), where S11 = (X1 − X0)/L and S12 = (Y1 − Y0)/L, both taken as integers.
  3. The method according to claim 2, characterized in that, in step two, determining the relative position relationship between the pet and the robot based on the wireless communication between the robot and the wireless signal device on the pet, and determining the pet's current position point and the corresponding grid cell according to that relationship, comprises the following steps:
    determining the distance between the first UWB positioning base station and the second UWB positioning base station on the robot body as W;
    determining the coordinates of the first UWB positioning base station as (X11, Y11) and the coordinates of the second UWB positioning base station as (X12, Y12);
    based on the wireless communication between the first and second UWB positioning base stations and the UWB positioning tag on the pet, determining the first distance from the UWB positioning tag to the first UWB positioning base station as R1 and the second distance from the UWB positioning tag to the second UWB positioning base station as R2;
    determining the angle formed by the lines pointing from the second UWB positioning base station, as the vertex, toward the first UWB positioning base station and the UWB positioning tag respectively as the first included angle α1, where α1 = arccos((W² + R2² − R1²)/(2*W*R2));
    determining the angle formed by the lines pointing from the first UWB positioning base station, as the vertex, toward the second UWB positioning base station and the UWB positioning tag respectively as the second included angle α2, where α2 = arccos((W² + R1² − R2²)/(2*W*R1));
    determining the coordinates of the current position point of the UWB positioning tag as (Xc, Yc), where Xc = X12 + R2*cos(180° − α1 − arccos((X12 − X11)/W)) and Yc = Y11 + R1*cos(180° − α2 − arcsin((X12 − X11)/W));
    determining the grid coordinates of the grid cell corresponding to the current position point of the UWB positioning tag as (S21, S22), where S21 = (Xc − X0)/L and S22 = (Yc − Y0)/L, both taken as integers.
  4. The method according to claim 3, characterized in that determining the coordinates of the first UWB positioning base station on the robot body as (X11, Y11) and of the second UWB positioning base station as (X12, Y12) comprises the following steps:
    determining the coordinates of the center point of the robot body as the coordinates of the robot's current position point, namely (X1, Y1);
    determining that the center point of the robot body lies at the midpoint of the line connecting the first UWB positioning base station and the second UWB positioning base station;
    determining the distance between the first and second UWB positioning base stations as W, so that the distance from the center point of the robot body to the first UWB positioning base station is W/2, and the distance from the center point of the robot body to the second UWB positioning base station is W/2;
    determining the robot's current heading, as detected by the robot's gyroscope, as α;
    determining the coordinates of the first UWB positioning base station on the robot body as (X11, Y11), where X11 = X1 − (W*cosα)/2 and Y11 = Y1 + (W*sinα)/2;
    determining the coordinates of the second UWB positioning base station on the robot body as (X12, Y12), where X12 = X1 + (W*cosα)/2 and Y12 = Y1 − (W*sinα)/2.
  5. The method according to claim 3, characterized in that determining the first distance from the UWB positioning tag to the first UWB positioning base station as R1 and the second distance from the UWB positioning tag to the second UWB positioning base station as R2 comprises the following steps:
    determining the propagation speed of radio waves as c;
    determining the time from the first UWB positioning base station sending ranging data to the UWB positioning tag until receiving the UWB positioning tag's acknowledgment signal as T11;
    determining the time from the UWB positioning tag receiving the ranging data sent by the first UWB positioning base station until sending the acknowledgment signal as T12;
    determining the time from the UWB positioning tag sending ranging data to the first UWB positioning base station until receiving the first UWB positioning base station's acknowledgment signal as T13;
    determining the time from the first UWB positioning base station receiving the ranging data sent by the UWB positioning tag until sending the acknowledgment signal as T14;
    determining the first distance from the UWB positioning tag to the first UWB positioning base station as R1, where R1 = c*(T11 − T12 + T13 − T14)/4;
    determining the time from the second UWB positioning base station sending ranging data to the UWB positioning tag until receiving the UWB positioning tag's acknowledgment signal as T21;
    determining the time from the UWB positioning tag receiving the ranging data sent by the second UWB positioning base station until sending the acknowledgment signal as T22;
    determining the time from the UWB positioning tag sending ranging data to the second UWB positioning base station until receiving the second UWB positioning base station's acknowledgment signal as T23;
    determining the time from the second UWB positioning base station receiving the ranging data sent by the UWB positioning tag until sending the acknowledgment signal as T24;
    determining the second distance from the UWB positioning tag to the second UWB positioning base station as R2, where R2 = c*(T21 − T22 + T23 − T24)/4.
  6. The method according to claim 1, characterized in that, in step three, determining whether an obstacle cell exists among the grid cells, between the grid cell occupied by the robot and the grid cell occupied by the pet, within the preset range covered by the shooting angle of the camera with which the robot monitors the pet, comprises the following steps:
    determining the direction in which the robot's pet-monitoring camera faces the pet as the shooting direction;
    based on the shooting direction, determining the shooting region covered in the grid map by the camera's shooting angle;
    determining the grid cells corresponding to the coverage region, in the grid map, of the angular range formed by a first angle side and a second angle side extending outward from the camera as the vertex, wherein the coverage region is smaller than and lies within the shooting region;
    analyzing whether an obstacle cell exists among the grid cells corresponding to the coverage region.
  7. The method according to claim 1, characterized in that, in step four, determining a preset area centered on the grid cell occupied by the pet and, according to the near-to-far distance relationship between the traversed cells in the preset area and the robot, taking the traversed cells one by one as the candidate monitoring cell, comprises the following steps:
    determining a circular area whose center is the center of the grid cell occupied by the pet and whose radius is a preset length;
    determining the traversed cell in the circular area closest to the robot as the candidate monitoring cell;
    if the straight grid path between the candidate monitoring cell and the pet's grid cell contains an obstacle cell, and the traversed cell in the circular area second closest to the robot is not the farthest from the robot, determining the traversed cell in the circular area second closest to the robot as the candidate monitoring cell;
    if the straight grid path between the candidate monitoring cell and the pet's grid cell contains an obstacle cell, and the traversed cell in the circular area third closest to the robot is not the farthest from the robot, determining the traversed cell in the circular area third closest to the robot as the candidate monitoring cell;
    and so on.
  8. The method according to claim 7, characterized in that the preset length is any value in the range of 1 meter to 2 meters.
  9. The method according to claim 1, characterized in that, in step five, controlling the robot to walk from its current position point to the monitoring cell and monitor the pet comprises the following steps:
    starting from the robot's current position point, searching the grid map toward the direction of the monitoring cell;
    among the grid paths between the robot's current position point and the center point of the monitoring cell that are directly connected through traversed cells, determining the grid path with the shortest path length as the navigation grid path;
    determining the center points of the grid cells in the navigation grid path as navigation position points, and connecting the navigation position points to form the navigation path;
    controlling the robot to walk from its current position point along the navigation path to the monitoring position point;
    adjusting the robot's orientation so that the shooting direction of the robot's camera is aimed at the direction in which the pet is located.
  10. A chip for storing a program, characterized in that the program is used to control a robot to execute the method for monitoring a pet by a robot based on a grid map according to any one of claims 1 to 9.
PCT/CN2018/094744 2017-12-07 2018-07-06 Method for monitoring pet by robot based on grid map and chip WO2019109635A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US16/768,697 US11470821B2 (en) 2017-12-07 2018-07-06 Method for monitoring pet by robot based on grid map and chip
KR1020207019551A KR102320370B1 (ko) 2017-12-07 2018-07-06 격자 지도에 기반한 로봇의 애완동물 감시 방법 및 칩
JP2020531027A JP7136898B2 (ja) 2017-12-07 2018-07-06 格子地図に基づくロボットのペット監視方法及びチップ
EP18886384.9A EP3723423B1 (en) 2017-12-07 2018-07-06 Method and chip for monitoring pet on the basis of robot employing grid map

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201711281586.3A 2017-12-07 2017-12-07 Method for monitoring pet by robot based on grid map and chip
CN201711281586.3 2017-12-07

Publications (1)

Publication Number Publication Date
WO2019109635A1 true WO2019109635A1 (zh) 2019-06-13

Family

ID=62057124

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/094744 WO2019109635A1 (zh) 2017-12-07 2018-07-06 基于栅格地图的机器人监视宠物的方法及芯片

Country Status (6)

Country Link
US (1) US11470821B2 (zh)
EP (1) EP3723423B1 (zh)
JP (1) JP7136898B2 (zh)
KR (1) KR102320370B1 (zh)
CN (1) CN108012326B (zh)
WO (1) WO2019109635A1 (zh)

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108012326B (zh) 2017-12-07 2019-06-11 Zhuhai Amicro Semiconductor Co., Ltd. Method for monitoring pet by robot based on grid map and chip
CN108366343B (zh) * 2018-03-20 2019-08-09 Zhuhai Amicro Semiconductor Co., Ltd. Method for intelligent pet monitoring by a robot
US11126198B2 (en) * 2018-12-30 2021-09-21 Ubtech Robotics Corp Robot movement control method, apparatus and robot using the same
US11981032B2 (en) * 2019-03-15 2024-05-14 Omron Corporation Robot control device, method and program for a recovery after an obstruction
CN109901590B (zh) * 2019-03-30 2020-06-05 Zhuhai Amicro Semiconductor Co., Ltd. Recharging control method for a desktop robot
CN110488876A (zh) * 2019-08-20 2019-11-22 Siweifangde (Shenzhen) Intelligent Technology Co., Ltd. Pet feeding method, apparatus, storage medium and computer device
CN112415524A (zh) * 2019-08-23 2021-02-26 Shenzhen Ubtech Technology Co., Ltd. Robot and positioning and navigation method and apparatus thereof
CN111121754A (zh) 2019-12-31 2020-05-08 Shenzhen Ubtech Technology Co., Ltd. Mobile robot positioning and navigation method and apparatus, mobile robot and storage medium
CN113532436B (zh) * 2021-07-12 2024-05-10 Zhongtongfu Consulting and Design Institute Co., Ltd. Indoor relative position positioning method
JP7467400B2 (ja) 2021-09-22 2024-04-15 KDDI Corp Mobile store management system, mobile store management device, mobile store management method and computer program
CN114037807B (zh) * 2021-11-24 2023-03-28 Shenzhen Yunshu Technology Development Co., Ltd. Low-memory chained grid map construction method, apparatus and computer device
CN114265412B (zh) * 2021-12-29 2023-10-24 Shenzhen Skyworth Digital Technology Co., Ltd. Vehicle control method, apparatus, device and computer-readable storage medium
KR102478305B1 2022-02-04 2022-12-19 DiDi Cares Co., Ltd. Method and management server for determining pet insurance payout based on information collected by a wearable device
KR102597771B1 2022-04-14 2023-11-06 DiDi Cares Co., Ltd. Method and server for brokering medicine purchases for pets
CN115019167B (zh) * 2022-05-26 2023-11-07 China Telecom Corp., Ltd. Mobile-terminal-based fusion positioning method, system, device and storage medium
KR102453797B1 2022-06-16 2022-10-14 DiDi Cares Co., Ltd. Method and server for pet health management through efficient health checkups
KR102500380B1 2022-06-28 2023-02-16 DiDi Cares Co., Ltd. Loan screening method and server exclusively for dog owners
CN115437299A (zh) 2022-10-10 2022-12-06 Beijing Lingtian Intelligent Equipment Group Co., Ltd. Travel control method and system for an accompanying transport robot
CN115685223B (zh) * 2022-12-15 2023-03-21 Shenzhen Zhihui Technology Co., Ltd. Position recognition method and apparatus, electronic device and readable storage medium
CN116068956B (zh) * 2023-03-07 2023-07-14 Shenzhen Hualong Xunda Information Technology Co., Ltd. PLC-based production line process monitoring system and method
KR102574972B1 2023-05-04 2023-09-06 DiDi Cares Co., Ltd. Customized dog food recommendation method and server therefor
KR102574975B1 2023-05-12 2023-09-06 DiDi Cares Co., Ltd. AI-based method for identifying dogs subject to health checkups, and server therefor

Family Cites Families (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07193742A (ja) * 1993-12-27 1995-07-28 Hitachi Ltd Video camera
NL1012137C2 (nl) * 1999-05-25 2000-11-28 Lely Res Holding Unmanned vehicle that can be used in a stable or a meadow
JP3913186B2 (ja) 2003-03-28 2007-05-09 Toshiba Corp Mobile imaging device
US20090021367A1 (en) * 2007-07-19 2009-01-22 Davies Daniel F Apparatus, system, and method for tracking animals
US20100179689A1 (en) * 2009-01-09 2010-07-15 National Taiwan University Of Science And Technology Method of teaching robotic system
US8493407B2 (en) * 2009-09-03 2013-07-23 Nokia Corporation Method and apparatus for customizing map presentations based on user interests
US11249495B2 (en) * 2012-09-19 2022-02-15 Botsitter, Llc Method and system for remote monitoring, care and maintenance of animals
KR102165437B1 (ko) * 2014-05-02 2020-10-14 Hanwha Defense Co., Ltd. Path planning apparatus for a mobile robot
KR101583376B1 (ko) * 2014-06-05 2016-01-08 Kim Myung Hwan Animal mobile phone service system and method
US9475195B2 (en) * 2014-09-12 2016-10-25 Toyota Jidosha Kabushiki Kaisha Anticipatory robot navigation
WO2016065625A1 (en) 2014-10-31 2016-05-06 SZ DJI Technology Co., Ltd. Systems and methods for walking pets
KR20170053351A (ko) 2015-11-06 2017-05-16 Samsung Electronics Co., Ltd. Cleaning robot and control method thereof
US10409292B2 (en) * 2015-12-10 2019-09-10 Panasonic Intellectual Property Corporation Of America Movement control method, autonomous mobile robot, and recording medium storing program
JP2017114270A (ja) 2015-12-24 2017-06-29 Nakayo Inc. Unmanned aerial vehicle with a specific-beacon tracking function, and tracking beacon transmitting unit
CN105706951B (zh) 2016-04-18 2019-03-08 Ningbo Lixinke Information Technology Co., Ltd. Smart pet collar and implementation method thereof
CN106125730B (zh) * 2016-07-10 2019-04-30 Beijing University of Technology Robot navigation map construction method based on rat-brain hippocampal spatial cells
US10354515B2 (en) * 2016-07-21 2019-07-16 Vivint, Inc. Methods and system for providing an alarm trigger bypass
KR102559745B1 (ko) * 2016-10-13 2023-07-26 LG Electronics Inc. Airport robot and airport robot system including the same
CN106577345B (zh) * 2016-10-27 2023-08-29 Chongqing Zhangzhong Huayuan Technology Co., Ltd. Smart pet interaction system
CN107368079B (zh) * 2017-08-31 2019-09-06 Zhuhai Amicro Semiconductor Co., Ltd. Method and chip for planning a robot cleaning path
US11016491B1 (en) * 2018-01-26 2021-05-25 X Development Llc Trajectory planning for mobile robots
US20190286145A1 (en) * 2018-03-14 2019-09-19 Omron Adept Technologies, Inc. Method and Apparatus for Dynamic Obstacle Avoidance by Mobile Robots
CN108366343B (zh) * 2018-03-20 2019-08-09 Zhuhai Amicro Semiconductor Co., Ltd. Method for intelligent pet monitoring by a robot
KR102466940B1 (ko) 2018-04-05 2022-11-14 Electronics and Telecommunications Research Institute Apparatus and method for generating a topological map for robot navigation
EP3627250B1 (en) * 2018-09-21 2023-12-06 Tata Consultancy Services Limited Method and system for free space detection in a cluttered environment
CN109540142B (zh) * 2018-11-27 2021-04-06 CloudMinds Technology (Beijing) Co., Ltd. Robot positioning and navigation method, apparatus and computing device
CN109946715B (zh) * 2019-04-09 2021-06-25 Yunjing Intelligent Technology (Dongguan) Co., Ltd. Detection method and apparatus, mobile robot and storage medium
KR102302575B1 (ko) * 2019-07-16 2021-09-14 LG Electronics Inc. Mobile robot and control method thereof
CN110703747B (zh) * 2019-10-09 2021-08-03 Wuhan University Autonomous robot exploration method based on a simplified generalized Voronoi diagram
KR20210056694A (ko) * 2019-11-11 2021-05-20 LG Electronics Inc. Method for avoiding collisions, and robot and server implementing the same
CN111006666B (zh) * 2019-11-21 2021-10-29 Shenzhen Ubtech Technology Co., Ltd. Robot path planning method and apparatus, storage medium, and robot
CN111024100B (zh) * 2019-12-20 2021-10-29 Shenzhen Ubtech Technology Co., Ltd. Navigation map updating method and apparatus, readable storage medium, and robot
US11822340B2 (en) * 2020-03-06 2023-11-21 Edda Technology, Inc. Method and system for obstacle avoidance in robot path planning using depth sensors
US11454974B2 (en) * 2020-06-29 2022-09-27 Baidu Usa Llc Method, apparatus, device, and storage medium for controlling guide robot
US20220206505A1 (en) * 2020-12-30 2022-06-30 Southeast University Geometric folding full coverage path for robot and method for generating same

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040093219A1 (en) * 2002-11-13 2004-05-13 Ho-Chul Shin Home robot using home server, and home network system having the same
CN101278654A (zh) * 2007-09-26 2008-10-08 Shenzhen Institutes of Advanced Technology Pet care robot system
CN106172059A (zh) * 2016-08-31 2016-12-07 Changsha Changtai Robot Co., Ltd. Pet feeding robot
CN106584472A (zh) * 2016-11-30 2017-04-26 Beijing Beihu Robot Technology Co., Ltd. Method and apparatus for controlling an autonomous mobile device
CN106982741A (zh) * 2017-04-06 2017-07-28 Nanjing Sanbao Hongzheng Vision Technology Co., Ltd. Pet monitoring robot and system
CN108012326A (zh) * 2017-12-07 2018-05-08 Zhuhai Amicro Semiconductor Co., Ltd. Method for monitoring pet by robot based on grid map and chip

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112212863A (zh) * 2019-07-09 2021-01-12 Suzhou Cleva Precision Machinery & Technology Co., Ltd. Method and system for creating a grid map
CN112947435A (zh) * 2021-02-04 2021-06-11 Shenyang Academy of Instrumentation Science Co., Ltd. Navigation control method for a wall-climbing robot
CN114302326A (zh) * 2021-12-24 2022-04-08 Zhuhai Unitech Power Technology Co., Ltd. Method for determining a positioning area, positioning method, apparatus, and positioning device
CN114302326B (zh) * 2021-12-24 2023-05-23 Zhuhai Unitech Power Technology Co., Ltd. Method for determining a positioning area, positioning method, apparatus, and positioning device

Also Published As

Publication number Publication date
EP3723423A4 (en) 2021-08-25
EP3723423B1 (en) 2023-09-06
KR20200096606A (ko) 2020-08-12
EP3723423A1 (en) 2020-10-14
JP2021505150A (ja) 2021-02-18
KR102320370B1 (ko) 2021-11-02
US20210169049A1 (en) 2021-06-10
JP7136898B2 (ja) 2022-09-13
CN108012326B (zh) 2019-06-11
US11470821B2 (en) 2022-10-18
EP3723423C0 (en) 2023-09-06
CN108012326A (zh) 2018-05-08

Similar Documents

Publication Publication Date Title
WO2019109635A1 (zh) Method for monitoring pet by robot based on grid map and chip
WO2019179001A1 (zh) Method for intelligent pet monitoring by a robot
CN105115497B (zh) Reliable precise navigation and positioning system and method for an indoor mobile robot
US20200037498A1 (en) Moving robot, method for controlling moving robot, and moving robot system
US20070271011A1 (en) Indoor map building apparatus, method, and medium for mobile robot
US20220161430A1 (en) Recharging Control Method of Desktop Robot
WO2019037668A1 (zh) Self-moving robot, walking method thereof, and method for displaying obstacle distribution
CN112214015A (zh) Self-moving robot, recharging method and system thereof, and computer storage medium
US11465275B2 (en) Mobile robot and method of controlling the same and mobile robot system
KR20200015880A (ko) Station apparatus and mobile robot system
CN103472434B (zh) Robot sound localization method
WO2018228254A1 (zh) Mobile electronic device and method in the mobile electronic device
CN107356902B (zh) Automatic collection method for WiFi positioning fingerprint data
Cho et al. Localization of a high-speed mobile robot using global features (en)
JP6699034B2 (ja) Autonomous mobile robot
CN110231627A (zh) Service robot travel path calculation method based on visible light positioning
CN104848852B (zh) Positioning system and method with an annular sensor array
CN204649206U (zh) Positioning system with an annular sensor array
KR101339899B1 Smartphone-based method for self-localization of a robot
Chen et al. Multi-Mobile Robot Localization and Navigation based on Visible Light Positioning
KR102100478B1 A plurality of autonomous mobile robots
Saim et al. A localization approach in a distributed multiagent environment
TW201444515A Cleaning robot and positioning method of the cleaning robot
KR20220144477A Indoor automatic movement system and serving robot using the same
CN108459591A Autonomous mobile device

Legal Events

Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18886384; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2020531027; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
ENP Entry into the national phase (Ref document number: 20207019551; Country of ref document: KR; Kind code of ref document: A)
ENP Entry into the national phase (Ref document number: 2018886384; Country of ref document: EP; Effective date: 20200707)