WO2019109635A1 - Method and chip for a robot to monitor a pet based on a grid map - Google Patents
Method and chip for a robot to monitor a pet based on a grid map
- Publication number
- WO2019109635A1 · PCT/CN2018/094744 · CN2018094744W
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- robot
- unit
- uwb positioning
- determining
- base station
- Prior art date
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0276—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
- G05D1/028—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using a RF signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/021—Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01K—ANIMAL HUSBANDRY; AVICULTURE; APICULTURE; PISCICULTURE; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
- A01K29/00—Other apparatus for animal husbandry
- A01K29/005—Monitoring or measuring activity, e.g. detecting heat or mating
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0094—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0217—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with energy consumption, time reduction or distance reduction criteria
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
- G05D1/0274—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C17/00—Arrangements for transmitting signals characterised by the use of a wireless electrical link
- G08C17/02—Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/023—Services making use of location information using mutual or relative location information between multiple location based services [LBS] targets or of distance thresholds
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/029—Location-based management or tracking services
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W64/00—Locating users or terminals or network equipment for network management purposes, e.g. mobility management
Definitions
- the invention relates to the field of robots, and in particular to a method and a chip for monitoring a pet by a robot based on a grid map.
- the present invention provides a method and a chip for monitoring a pet based on a grid map, which can better determine the position of the robot to monitor the pet, thereby achieving a better monitoring effect.
- the specific technical solutions of the present invention are as follows:
- a method for monitoring a pet by a robot based on a grid map, comprising the following steps:
- Step 1: based on the grid map constructed by the robot, determining the current position point of the robot in the grid map and the corresponding grid unit;
- Step 2: determining the mutual positional relationship between the pet and the robot based on wireless communication between the robot and the wireless signal device on the pet's body, and determining the current position point of the pet and the corresponding grid unit according to that relationship;
- Step 3: determining whether there is an obstacle unit among the grid units, between the grid unit where the robot is located and the grid unit where the pet is located, within the preset range covered by the shooting angle of the camera with which the robot monitors the pet; if not, keeping the camera pointed at the pet and returning to Step 2; if so, proceeding to Step 4;
- Step 4: determining a preset area centered on the grid unit where the pet is located and, ordering the passed units in that area by their distance to the robot from near to far, taking them one by one as the monitoring unit to be determined; for each, determining whether there is an obstacle unit in the straight-line grid path between the monitoring unit to be determined and the grid unit where the pet is located; if not, determining that unit as the monitoring unit and proceeding to Step 5; if so, taking the next passed unit as the monitoring unit to be determined and repeating the check;
- Step 5: controlling the robot to walk from its current position point to the monitoring unit to monitor the pet;
- wherein an obstacle unit is a grid unit marked when the robot detects an obstacle there, and a passed unit is a grid unit that the robot has already traversed.
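The straight-line grid check used in Steps 3 and 4 can be sketched as follows: walk the grid cells lying on the segment between two units (here via Bresenham's line algorithm, one standard choice the patent does not name) and report whether any of them is marked as an obstacle unit. Cell tuples and the `obstacles` set are illustrative, not from the patent.

```python
def line_cells(a, b):
    """Yield the grid cells crossed by the straight line from cell a to cell b."""
    (x0, y0), (x1, y1) = a, b
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy
    while True:
        yield (x0, y0)
        if (x0, y0) == (x1, y1):
            return
        e2 = 2 * err
        if e2 >= dy:  # step along x
            err += dy
            x0 += sx
        if e2 <= dx:  # step along y
            err += dx
            y0 += sy

def path_is_clear(robot_cell, pet_cell, obstacles):
    """True when no obstacle unit lies on the straight-line grid path."""
    return not any(c in obstacles for c in line_cells(robot_cell, pet_cell))

print(path_is_clear((0, 0), (4, 4), {(2, 2)}))  # diagonal crosses (2, 2) -> False
print(path_is_clear((0, 0), (4, 0), {(2, 2)}))  # clear row -> True
```

The same helper serves both the robot-to-pet check of Step 3 and the candidate-to-pet check of Step 4.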
- the step of determining, based on the grid map constructed by the robot in Step 1, the current position point of the robot in the grid map and the corresponding grid unit includes the following steps:
- the step of determining, in Step 2, the mutual positional relationship between the pet and the robot based on wireless communication between the robot and the wireless signal device on the pet's body, and determining the current position point of the pet and the corresponding grid unit according to that relationship, includes the following steps:
- the first distance from the UWB positioning tag to the first UWB positioning base station is R1;
- the second distance from the UWB positioning tag to the second UWB positioning base station is R2;
- the step of determining that the coordinates of the first UWB positioning base station on the robot body are (X11, Y11) and the coordinates of the second UWB positioning base station are (X12, Y12) includes the following steps:
- the distance between the first UWB positioning base station and the second UWB positioning base station is W;
- the distance from the center point of the robot body to the first UWB positioning base station is W/2;
- the distance from the center point of the robot body to the second UWB positioning base station is W/2;
- determining that the time from the first UWB positioning base station sending ranging data to the UWB positioning tag until it receives the acknowledgment signal of the UWB positioning tag is T11;
- determining that the time from the UWB positioning tag receiving the ranging data sent by the first UWB positioning base station until it sends the acknowledgment signal is T12;
- determining that the time from the UWB positioning tag sending ranging data to the first UWB positioning base station until it receives the acknowledgment signal of the first UWB positioning base station is T13;
- determining that the time from the second UWB positioning base station sending ranging data to the UWB positioning tag until it receives the acknowledgment signal of the UWB positioning tag is T21;
- determining that the time from the UWB positioning tag receiving the ranging data sent by the second UWB positioning base station until it sends the acknowledgment signal is T22;
- the step of determining whether the grid units within the preset range covered by the shooting angle of the camera with which the robot monitors the pet contain an obstacle unit includes the following steps:
- the step of determining, in Step 4, a preset area centered on the grid unit where the pet is located, and taking the passed units in the preset area one by one as the monitoring unit to be determined, ordered by their distance to the robot from near to far, includes the following steps:
- the preset length is any value within a range of 1 meter to 2 meters.
- the step of controlling the robot, in Step 5, to walk from the current position point to the monitoring unit to monitor the pet includes the following steps:
- among the grid paths connecting only passed units, the grid path having the shortest path length is used as the navigation grid path;
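Selecting the shortest navigation grid path over passed units can be sketched with a breadth-first search over 4-connected cells, which returns a path of minimal length in cell count. BFS is one standard way to obtain such a path; the patent only states that the shortest connected path is chosen. Names are illustrative.

```python
from collections import deque

def shortest_grid_path(start, goal, passable):
    """Return a shortest list of cells from start to goal over passable cells, or None."""
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:   # walk predecessors back to start
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nxt in passable and nxt not in prev:
                prev[nxt] = cell
                queue.append(nxt)
    return None  # monitoring unit unreachable through passed units

cells = {(0, 0), (1, 0), (2, 0), (2, 1), (2, 2)}
print(shortest_grid_path((0, 0), (2, 2), cells))
# -> [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2)]
```

Restricting the search to passed units keeps the robot on terrain it has already verified as traversable.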
- the beneficial effects of the present invention are: the mutual positional relationship between the pet and the robot is determined through wireless communication between the robot and the wireless signal device on the pet, and it is then determined whether there is an obstacle unit between the grid units corresponding to the robot and the pet in the grid map.
- in the preset area around the pet, the passed units are taken one by one, from near to far by distance to the robot, as the monitoring unit to be determined, and it is then checked whether there is an obstacle unit in the straight-line grid path between that unit and the grid unit where the pet is located, that is, whether the pet can be effectively monitored from that position. If there is no obstacle occlusion, the unit to be determined becomes the monitoring unit; otherwise the next passed unit is analyzed. By analyzing the passed units in the preset area around the pet from near to far, the robot can quickly find a position from which it can effectively monitor the pet, thereby improving the efficiency of monitoring.
- alternatively, the passed unit that is farthest from the robot is used.
- the distribution of obstacles has a characteristic: obstacles are generally concentrated in one or several areas, so once an obstacle unit is detected in an area, other obstacle units are likely to appear in the same area.
- when the robot detects an obstacle at the current position, the probability of an obstacle unit appearing decreases with distance from the current position within a certain range, so by taking the passed unit farthest from the robot in the preset area as the monitoring unit, the robot can be placed in a relatively open area.
- from there, the monitoring position or monitoring angle can be adjusted more conveniently, and the robot is less likely to be disturbed by adjacent obstacles, which improves monitoring efficiency.
- by monitoring the pet in combination with the grid map, the present invention can control the robot to find a better monitoring position, avoiding the problem that the view is easily blocked by obstacles and the monitoring effect is impaired, thereby improving the effect of monitoring the pet.
- FIG. 1 is a schematic flow chart of a method for monitoring a pet by a grid map based robot according to the present invention.
- FIG. 2 is a schematic diagram showing the analysis of the coordinates of the position point converted into the coordinates of the grid unit according to the present invention.
- FIG. 3 is a schematic diagram of mutual location analysis of two UWB positioning base stations and UWB positioning tags according to the present invention.
- FIG. 4 is a schematic diagram of analyzing coordinates of two UWB positioning base stations according to coordinates of a robot center point according to the present invention.
- FIG. 5 is a schematic diagram of analyzing the distance of a UWB positioning tag to a first UWB positioning base station.
- FIG. 6 is a schematic diagram showing the analysis of the grid area photographed by the robot.
- FIG. 7 is a schematic diagram of the analysis of determining the monitoring unit.
- FIG. 8 is an analysis diagram for determining a navigation path of the robot from the current position point to the monitoring position point.
- the robot of the invention is a kind of intelligent household appliance that can walk automatically in certain settings by virtue of a degree of artificial intelligence.
- the mobile robot of the present invention comprises the following structure: a robot body capable of autonomous walking on driving wheels, with a human-computer interaction interface arranged on the body and an obstacle detecting unit arranged on the body, and a camera on the upper end of the middle part of the body.
- the camera can also be placed at the upper end of the front part of the body or at other positions; when it is set at the front of the body or elsewhere, the relevant parameter values need to be adjusted accordingly in the calculations.
- An inertial sensor is disposed inside the body, and the inertial sensor includes an accelerometer and a gyroscope.
- the driving wheel is provided with an odometer (generally a code wheel) for detecting the walking distance of the driving wheel, and the body is also provided with a control module capable of processing the parameters of the relevant sensors and outputting control signals to the execution components.
- the method for monitoring a pet by the grid-map-based robot includes the following steps. Step 1: based on the grid map constructed by the robot, determine the current position point of the robot in the grid map and the corresponding grid unit. Step 2: based on the wireless communication between the robot and the wireless signal device on the pet's body, determine the mutual positional relationship between the pet and the robot, and determine the current position point of the pet and the corresponding grid unit according to that relationship. Step 3: determine whether, between the grid unit where the robot is located and the grid unit where the pet is located, the grid units within the preset range covered by the shooting angle of the camera with which the robot monitors the pet contain an obstacle unit; if not, keep the camera of the robot pointed in the shooting direction toward the pet and return to Step 2; if so, proceed to Step 4. Step 4: determine a preset area centered on the grid unit where the pet is located and, ordering the passed units in the preset area by their distance to the robot from near to far, take them one by one as the monitoring unit to be determined; if the straight-line grid path from the unit to be determined to the pet's grid unit contains no obstacle unit, take it as the monitoring unit and proceed to Step 5, otherwise analyze the next passed unit. Step 5: control the robot to walk from the current position point to the monitoring unit to monitor the pet.
- the grid map is a map in which a grid unit is a basic unit constructed by the robot according to data detected by various sensors during the walking process.
- the grid unit is a virtual cell having a set length and width, and may be set as a square or a rectangle.
- the grid unit of the present invention is a square lattice having a side length of 0.2 meters.
- the wireless signal device may adopt a zigbee communication module, an ultrasonic module, a radio frequency communication module, a UWB (Ultra Wide Band) module or a wifi module, etc., and correspondingly select according to different needs of the product.
- the preset range may also be correspondingly set according to different requirements of the product design.
- the preset range is set to be one-third of the full range covered by the shooting angle of the camera.
- the preset area may also be correspondingly set according to different requirements of the product design.
- the preset area may be set as a circular area, a square area or a regular polygonal area, and the area size is generally set within a range of 2 to 6 square meters.
- the robot marks the grid units it has traversed as passed units, marks the grid units where obstacles are detected as obstacle units, marks the grid units where a cliff is detected as cliff units, and so on, and updates the grid map based on the marked information.
- the method of the present invention determines the mutual positional relationship between the pet and the robot through wireless communication between the robot and the wireless signal device on the pet, and then determines whether there is an obstacle unit between the grid units corresponding to the robot and the pet in the grid map, i.e., whether there is an obstacle between the robot and the pet. If not, it indicates that the robot can effectively capture the pet from its current position and shooting direction, and neither needs to change.
- during this process the robot rotates its body to keep the camera always facing the pet, and does not need to walk to another location unless the view is blocked by obstacles. If there is an obstacle, the robot may capture the obstacle at the current position rather than the pet, so it needs to reselect the monitoring position by judging the state of the grid units around the pet: in the preset area around the pet, the passed units are taken one by one, from near to far by distance to the robot, as the monitoring unit to be determined, and it is then checked whether there is an obstacle unit in the straight-line grid path between that unit and the grid unit where the pet is located.
- if there is no obstacle, the unit to be determined becomes the monitoring unit; otherwise the next passed unit is analyzed.
- by analyzing the passed units from near to far in the preset area around the pet, the robot can quickly find a position from which it can effectively monitor the pet, thereby improving the efficiency of monitoring.
- alternatively, the passed unit that is farthest from the robot is used.
- when the robot detects an obstacle at the current position, the probability of an obstacle unit appearing decreases with distance from the current position within a certain range, so by taking the passed unit farthest from the robot in the preset area as the monitoring unit, the robot can be placed in a relatively open area.
- from there, the monitoring position or monitoring angle can be adjusted more conveniently, and the robot is less likely to be disturbed by adjacent obstacles, which improves monitoring efficiency.
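The monitoring-unit selection described above can be sketched as a near-to-far scan of the passed units inside the preset area, picking the first one with an unobstructed straight line to the pet and falling back to the farthest passed unit when every candidate is blocked. `has_clear_line` stands in for the straight-line obstacle check (e.g. a Bresenham walk over the grid); all names are illustrative.

```python
def choose_monitoring_unit(robot, pet, passed_units, has_clear_line):
    """Pick a monitoring grid unit for the robot from the passed units."""
    def dist2(a, b):  # squared distance is enough for ordering
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

    candidates = sorted(passed_units, key=lambda u: dist2(u, robot))
    for unit in candidates:            # near-to-far scan
        if has_clear_line(unit, pet):  # no obstacle unit on the line to the pet
            return unit
    return candidates[-1]              # fallback: farthest passed unit

passed = [(1, 0), (2, 0), (3, 0)]
clear = lambda u, pet: u != (1, 0)     # pretend (1, 0) has an occluded view
print(choose_monitoring_unit((0, 0), (5, 0), passed, clear))  # -> (2, 0)
```

The fallback branch corresponds to the "farthest passed unit" alternative: when every candidate is occluded, the cell farthest from the obstacle-ridden current position is the best available guess at an open area.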
- by monitoring the pet in combination with the grid map, the method of the present invention can control the robot to find a better monitoring position, avoiding the problem that the view is easily blocked by obstacles and the monitoring effect is impaired, thereby improving the effect of monitoring the pet.
- the step of determining, based on the grid map constructed by the robot in Step 1, the current position point of the robot in the grid map and the corresponding grid unit includes the following steps: constructing a grid map in an XY coordinate system with (X0, Y0) as the origin, according to the data detected by the robot during walking; determining the side length of a grid unit in the grid map as L; and determining, based on the robot's own positioning data, that the coordinates of the robot's current position point are (X1, Y1).
- while walking, the robot records the path it has traveled based on the data detected by its own odometer and gyroscope, and determines its position and direction (i.e., its positioning data) in real time.
- the grid map is composed of grid units as its basic elements, and each grid unit contains many position points.
- the robot walks in terms of position points, that is, it moves from the current position point to an adjacent next position point. Therefore, when determining the coordinates of the grid unit in which the robot is currently located, the coordinates of the current position point must be converted into the coordinates of the grid unit, as shown in FIG. 2.
- in FIG. 2, each small square represents a grid unit with side length L.
- within the same coordinate system, this method can accurately calculate the grid coordinates of the grid unit corresponding to the current position point from the positional relationship between the current position point and the coordinate origin, together with the side length of the grid unit, providing reliable data for subsequent processing and improving the accuracy of data analysis.
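Converting a position point into grid-unit coordinates can be sketched as below, using the relation given later in the document (grid coordinate = (position coordinate − origin coordinate) / L, taking the integer part). Function and parameter names are illustrative; the default cell size of 0.2 m matches the 0.2 m square grid unit described above.

```python
def point_to_grid(x, y, origin=(0.0, 0.0), cell_size=0.2):
    """Map a position point (x, y) to the coordinates of its grid unit."""
    x0, y0 = origin
    # int() truncates toward zero; for maps extending to negative coordinates,
    # math.floor would be the safer choice.
    return (int((x - x0) / cell_size), int((y - y0) / cell_size))

print(point_to_grid(1.15, 0.47))  # -> (5, 2)
```

This is the same computation applied below to the UWB positioning tag's position point (S21, S22).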
- the step of determining, in Step 2, the mutual positional relationship between the pet and the robot based on wireless communication between the wireless signal devices on the robot and the pet's body, and determining the current position point of the pet and the corresponding grid unit according to that relationship, includes the following steps: determining that the distance between the first UWB positioning base station and the second UWB positioning base station on the robot body is W; determining that the coordinates of the first UWB positioning base station are (X11, Y11) and the coordinates of the second UWB positioning base station are (X12, Y12); determining, based on the wireless communication between the two UWB positioning base stations and the UWB positioning tag on the pet,
- that the first distance from the UWB positioning tag to the first UWB positioning base station is R1,
- and that the second distance from the UWB positioning tag to the second UWB positioning base station is R2;
- and then determining a first angle with the first UWB positioning base station as the vertex, formed between the direction toward the second UWB positioning base station and the direction toward the UWB positioning tag.
- UWB stands for Ultra Wide Band.
- the UWB positioning tag and the UWB positioning base stations are communication devices using UWB communication technology.
- A is a first UWB positioning base station
- B is a second UWB positioning base station
- C is a UWB positioning tag.
- the value of W should be smaller than the diameter of the robot body.
- the first angle can be obtained from the three sides of the triangle (△ABC) by the law of cosines.
- since the robot can determine its own coordinate position (i.e., the coordinates of the center point of the robot body) from the detection data of sensors such as the odometer and the gyroscope, and the positions of the two UWB positioning base stations on the robot body relative to the center point are fixed, their coordinate values can also be determined; that is, the coordinates of the first UWB positioning base station are (X11, Y11) and the coordinates of the second UWB positioning base station are (X12, Y12). The specific calculation is described in the following embodiments.
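With the three sides of triangle ABC known (AB = W between the two base stations, AC = R1, BC = R2), the angle at vertex A follows from the law of cosines, and a candidate position for the tag C can then be placed by rotating the direction A→B by that angle. This is a hedged illustration, not the patent's exact formulation: the mirror solution on the other side of AB has to be resolved by additional information (the patent's third base station would serve), and here one side is simply chosen.

```python
import math

def first_angle(W, R1, R2):
    """Angle at vertex A of triangle ABC, from its three side lengths."""
    return math.acos((W**2 + R1**2 - R2**2) / (2 * W * R1))

def tag_position(A, B, R1, R2, side=1):
    """One of the two candidate positions of tag C given stations A and B."""
    W = math.dist(A, B)
    alpha = first_angle(W, R1, R2)
    base = math.atan2(B[1] - A[1], B[0] - A[0])  # direction A -> B
    ang = base + side * alpha                     # side = +1 or -1 picks the half-plane
    return (A[0] + R1 * math.cos(ang), A[1] + R1 * math.sin(ang))

# 3-4-5 right triangle: the angle at A is 90 degrees
print(round(math.degrees(first_angle(3.0, 4.0, 5.0)), 1))  # -> 90.0
```

For example, with A = (0, 0), B = (3, 0), R1 = 4 and R2 = 5, `tag_position` places C at (0, 4), which is indeed 5 units from B.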
- the grid coordinates (S21, S22) of the grid unit corresponding to the current position point (Xc, Yc) of the UWB positioning tag are calculated in the manner described in the foregoing embodiment:
- S21 = (Xc − X0)/L
- S22 = (Yc − Y0)/L
- where both S21 and S22 take the integer part.
- the method described in this embodiment is applicable when the height of the UWB positioning tag worn by the pet is consistent with the height of the UWB positioning base stations of the robot (i.e., the three communication devices are in the same horizontal plane) or differs only slightly.
- substituting the detected parameters as they change yields the pet's position point and the corresponding grid coordinates quickly; the data processing is fast and the output result is accurate.
- if the height difference is large, a third UWB positioning base station needs to be set on the robot body, and the three-dimensional coordinates of the UWB positioning tag are determined by introducing a height parameter, thereby determining the corresponding grid unit.
- the specific implementation follows the same principle as this embodiment and is not described again here.
- the method of pet positioning by UWB communication technology has a larger positioning range, higher precision and better stability than other existing positioning methods.
- the step of determining that the coordinates of the first UWB positioning base station on the robot body are (X11, Y11) and the coordinates of the second UWB positioning base station are (X12, Y12) includes the following steps: determining that the coordinates of the center point of the robot body are the coordinates of the robot's current position point, namely (X1, Y1); determining that the center point of the robot body is at the midpoint of the line connecting the first UWB positioning base station and the second UWB positioning base station; determining that the distance between the first UWB positioning base station and the second UWB positioning base station is W, so that the distance from the center point of the robot body to each of the two base stations is W/2; determining that the current direction of the robot detected by the robot's gyroscope is θ; and determining the coordinates (X11, Y11) of the first UWB positioning base station and, likewise, the coordinates (X12, Y12) of the second UWB positioning base station from (X1, Y1), W and θ.
- the method of this embodiment for determining the coordinates of the first and second UWB positioning base stations simplifies the coordinate algorithm for the two base stations by constraining the positional relationship of the base stations on the robot body relative to the center point.
- if a third base station is used, it is set on the perpendicular bisector of AB, which likewise simplifies the algorithm and improves the data processing speed of the system.
- the specific implementation follows the same principle as this embodiment and is not described again here.
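Placing the two base-station coordinates from the robot center (X1, Y1), the heading θ and the separation W can be sketched as below. The patent only fixes both stations at W/2 from the center on the line through it; the added assumption here is that this line is perpendicular to the heading, so the exact formula is illustrative rather than quoted.

```python
import math

def base_station_coords(x1, y1, theta, W):
    """Coordinates of the two UWB base stations, assumed mounted on the line
    through the robot center perpendicular to its heading theta (radians)."""
    px, py = -math.sin(theta), math.cos(theta)   # unit vector perpendicular to heading
    a = (x1 + (W / 2) * px, y1 + (W / 2) * py)   # first base station A
    b = (x1 - (W / 2) * px, y1 - (W / 2) * py)   # second base station B
    return a, b

A, B = base_station_coords(1.0, 2.0, 0.0, 0.2)
print(A, B)  # robot facing along +x: stations sit 0.1 m to either side
```

Because the offsets are fixed in the robot frame, only (X1, Y1) and θ need updating as the robot moves; the base-station coordinates follow by this one rotation.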
- the step of determining that the first distance from the UWB positioning tag to the first UWB positioning base station is R1 and the second distance from the UWB positioning tag to the second UWB positioning base station is R2 includes the following steps: determining the propagation speed of the radio wave as c; determining that the time from the first UWB positioning base station sending ranging data to the UWB positioning tag until it receives the tag's acknowledgment signal is T11; determining that the time from the UWB positioning tag receiving the ranging data sent by the first UWB positioning base station until it sends the acknowledgment signal is T12; determining that the time from the UWB positioning tag sending ranging data to the first UWB positioning base station until it receives that base station's acknowledgment signal is T13; determining that the time from the first UWB positioning base station receiving the ranging data sent by the tag until it sends the acknowledgment signal is T14; and defining the corresponding times T21, T22, T23 and T24 in the same way for the exchange between the UWB positioning tag and the second UWB positioning base station.
- For example, the first UWB positioning base station A sends ranging data to the UWB positioning tag C at time t1; the tag C receives the ranging data at time t2 and sends an acknowledgment signal at time t3; and base station A receives the acknowledgment signal at time t4.
- The UWB positioning tag C then sends ranging data to the first UWB positioning base station A at time t5; base station A receives the ranging data at time t6 and sends an acknowledgment signal at time t7; and tag C receives the acknowledgment signal at time t8.
- The specific implementation for the second base station is similar to this embodiment and is not repeated here.
- By taking the average of the data-signal transmission times, the method of this embodiment for measuring the distance between a base station and the positioning tag obtains a more accurate transmission time and hence a more accurate ranging result, providing a more reliable reference for subsequently determining the pet's location and ensuring better monitoring of the pet.
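The two-way ranging average described above (formula per claim 5) can be sketched as follows; the function name is illustrative, and the point of the averaging is that the devices' internal processing delays (T12, T14) cancel out:

```python
def tof_distance(c, t11, t12, t13, t14):
    """Two-way time-of-flight distance between a base station and the tag.
    t11: station sends ranging data -> receives tag's acknowledgment
    t12: tag receives ranging data  -> sends acknowledgment
    t13: tag sends ranging data     -> receives station's acknowledgment
    t14: station receives data      -> sends acknowledgment
    (t11 - t12) and (t13 - t14) are each twice the one-way flight time,
    so their sum divided by 4 is the one-way time of flight."""
    return c * (t11 - t12 + t13 - t14) / 4
```

For a tag 3 m away (one-way flight time 3/c), plugging in round-trip times that include, say, 2 µs and 3 µs of processing delay still recovers 3 m, because the delays subtract out.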
- Judging whether an obstacle unit exists among the grid units within the preset range covered by the shooting angle of the camera with which the robot monitors the pet includes the following steps: determining the direction in which the camera faces the pet as the shooting direction; determining, based on the shooting direction, the shooting area covered by the camera's shooting angle in the grid map; determining the grid units corresponding to the coverage area, in the grid map, of the angular range formed by a first corner edge and a second corner edge extending outward from the camera as the angle vertex, where the coverage area is smaller than and located within the shooting area; and analyzing whether an obstacle unit exists among the grid units corresponding to the coverage area.
- In the figure, a small square indicates a grid unit; a square marked with X indicates that the square is an obstacle unit; and a square without any mark (or marked with other letters) indicates a unit the robot has already traversed.
- Point G is the position of the robot, that is, the position of the camera, and point C is the position of the pet.
- GZ is the shooting direction; the angle formed by the two lines GB1 and GB2 is the shooting angle; and GZ is the angle bisector of the shooting angle.
- GU1 is the first corner edge.
- GU2 is the second corner edge.
- The method of this embodiment determines whether an obstacle unit lies between two position points by consulting the grid map, thereby determining whether an obstacle occludes the line between the robot and the pet. The method makes full use of the robot's existing data; the judgment process is simple, practical and effective.
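The "obstacle unit between two position points" test can be sketched as a straight-line grid-path check. This is an illustrative simplification of the claimed sector test, not the patent's own code; it assumes integer grid coordinates and uses Bresenham's line algorithm to enumerate the cells the segment between robot and pet passes through:

```python
def line_cells(a, b):
    """Grid cells on the straight-line grid path from cell a to cell b
    (Bresenham's line algorithm over integer grid coordinates)."""
    (x0, y0), (x1, y1) = a, b
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy
    cells = []
    while True:
        cells.append((x0, y0))
        if (x0, y0) == (x1, y1):
            return cells
        e2 = 2 * err
        if e2 >= dy:
            err += dy; x0 += sx
        if e2 <= dx:
            err += dx; y0 += sy

def occluded(robot_cell, pet_cell, obstacles):
    """True if any obstacle unit lies on the straight-line grid path
    between the robot's cell and the pet's cell."""
    return any(c in obstacles for c in line_cells(robot_cell, pet_cell))
```

For instance, with an obstacle unit at (2, 0), the view from (0, 0) to (4, 0) is occluded, while a diagonal view from (0, 0) to (4, 4) past an off-line obstacle at (1, 3) is not.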
- Determining the preset area centered on the grid unit where the pet is located, and taking the traversed units one by one as pending monitoring units in order of their distance from the robot, from near to far, includes the following steps: determining a circular area whose center is the center of the grid unit where the pet is located and whose radius is a preset length; taking the traversed unit in the circular area closest to the robot as the pending monitoring unit; if the straight-line grid path between the pending monitoring unit and the pet's grid unit contains an obstacle unit, and the traversed unit second closest to the robot in the circular area is not the farthest from the robot, taking that second-closest traversed unit as the pending monitoring unit; if the straight-line grid path between this pending monitoring unit and the pet's grid unit also contains an obstacle unit, and the traversed unit third closest to the robot is not the farthest from the robot, taking the third-closest traversed unit as the pending monitoring unit; and so on.
- In the figure, a small square indicates a grid unit; a square marked with X indicates that the square is an obstacle unit; and a square without any mark (or marked with other letters) indicates a unit the robot has already traversed.
- Point G is the position of the robot, that is, the position of the camera, and point C is the position of the pet.
- GZ is the shooting direction.
- GU1 is the first corner edge.
- GU2 is the second corner edge. Since there are obstacle units (i.e., squares marked with X) within the range of ∠U1GU2, the robot's shot may be blocked by obstacles, so the robot needs to adjust its shooting position.
- A circle is drawn with the center of the grid cell where point C is located as the center and the preset length as the radius; the area enclosed by the circle is the preset area.
- The preset length may be set according to specific design requirements; preferably, it is any value in the range of 1 meter to 2 meters, and in this embodiment it is set to 1.5 meters. It should be noted that the circular area shown in FIG. 7 is only schematic: the radius or diameter of the circle cannot be measured against the side length of the grid cells in the figure. In addition, if the circular area covers only part of a grid cell, that grid cell is still within the scope of the circular area.
- The grid unit S1 is the traversed unit in the circular area closest to the robot and is first taken as the pending monitoring unit. Because the straight-line grid path between S1 and C (i.e., the path formed by the grid cells that the line connecting S1 and C passes through) contains an obstacle unit marked X, S1 cannot be determined as the monitoring unit.
- Next, the traversed unit S2, second closest to the robot, is analyzed.
- S2 is taken as the pending monitoring unit; since the straight-line grid path between S2 and C contains no obstacle unit marked X, i.e., no obstacle blocks the robot from photographing the pet, S2 is determined as the monitoring unit, and the robot navigates to the traversed unit S2 to monitor the pet.
- Otherwise, the traversed unit S3, third closest to the robot, would be analyzed next; the method is the same as above and is not described again.
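The near-to-far candidate selection walked through above (S1, then S2, then S3, …) can be sketched as follows. This is an illustrative sketch under my own assumptions: integer grid coordinates, squared-Euclidean distance ordering, and a Bresenham straight-line grid path; the claimed fallback of directly taking the farthest traversed unit is omitted for brevity:

```python
def line_cells(a, b):
    """Grid cells on the straight-line grid path from cell a to cell b
    (Bresenham's line algorithm over integer grid coordinates)."""
    (x0, y0), (x1, y1) = a, b
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy
    cells = []
    while True:
        cells.append((x0, y0))
        if (x0, y0) == (x1, y1):
            return cells
        e2 = 2 * err
        if e2 >= dy:
            err += dy; x0 += sx
        if e2 <= dx:
            err += dx; y0 += sy

def choose_monitoring_unit(pet_cell, robot_cell, passed, obstacles, radius):
    """Try traversed cells within `radius` cells of the pet as pending
    monitoring units, from nearest to farthest from the robot; accept
    the first whose straight-line grid path to the pet is obstacle-free."""
    d2 = lambda a, b: (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    candidates = [c for c in passed if d2(c, pet_cell) <= radius * radius]
    for cand in sorted(candidates, key=lambda c: d2(c, robot_cell)):
        if not any(cell in obstacles for cell in line_cells(cand, pet_cell)):
            return cand
    return None  # no traversed cell in the area has a clear view
```

In the figure's terms: the nearest candidate is rejected when an X-marked cell sits on its line to C, and the next-nearest clear candidate plays the role of S2.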
- Step 5, controlling the robot to walk from the current position point to the monitoring unit and monitor the pet, includes the following steps: searching the grid map in the direction of the monitoring unit, starting from the robot's current position point; determining, among the grid paths between the robot's current position point and the center point of the monitoring unit that are directly connected by traversed units, the grid path with the shortest path length as the navigation grid path; determining the center points of the grid units in the navigation grid path as navigation position points, and connecting the navigation position points to form a navigation path; controlling the robot to walk along the navigation path from the current position point to the monitoring position point; and adjusting the robot's direction so that the camera faces the pet.
- As shown in the figure, the robot has to travel from point G to the monitoring unit S2 and first needs to search for a walking path.
- In the figure, a square marked with X indicates an obstacle unit, and squares without marks (or marked with other letters) are traversed units.
- First, the grid map is searched in the direction of the monitoring unit, starting from point G, the robot's current position. Searching in the direction of the monitoring unit is not limited to searching along a straight line toward the monitoring unit; rather, that direction serves as the general search trend: starting from point G, grid units are searched one by one in directions away from G, converging from the surroundings toward the monitoring unit.
- The first grid path reaches the monitoring unit from its lower left;
- the second reaches the monitoring unit from its upper right.
- The two grid paths are separated by obstacle units. Since the length of the first grid path is smaller than that of the second, the first grid path is taken as the navigation grid path.
- The center points of the grid units in the first grid path are taken as navigation position points, and the navigation position points are connected to form the navigation path, i.e., the dotted line indicated by L1 (the dotted line indicated by L2 is the route of the second grid path).
- The robot is controlled to start from point G and walk along route L1 to the center point of monitoring unit S2 (i.e., the monitoring position point). Finally, the robot body is rotated in place so that the shooting direction of the camera faces point C (i.e., the direction in which the pet is located).
- By searching the grid map in the direction of the monitoring unit, the method described in this embodiment can quickly determine which grid paths reach the monitoring unit, and by comparing the lengths of the paths and taking the shortest as the navigation path, it shortens the time for the robot to reach the monitoring unit.
- Taking the center points of the grid units as navigation position points, the navigation path formed by connecting the navigation position points is the best path to the monitoring position point. Walking along this navigation path not only shortens the time to reach the destination but also reduces the risk of hitting obstacles while walking, improving the efficiency with which the robot reaches the monitoring position.
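One standard way to realize the shortest traversed-unit path search described above is a breadth-first search over the grid restricted to traversed cells. The patent does not prescribe BFS; this is an illustrative sketch assuming 4-connected grid units, returning the sequence of cells whose centers form the navigation position points:

```python
from collections import deque

def navigation_path(start, goal, passed):
    """Shortest 4-connected grid path from start to goal that only uses
    traversed ('passed') cells. Returns the cell sequence, or None if
    the monitoring unit is unreachable through traversed cells."""
    walkable = set(passed) | {start, goal}
    prev = {start: None}            # also serves as the visited set
    queue = deque([start])
    while queue:
        cur = queue.popleft()
        if cur == goal:             # reconstruct path back to start
            path = []
            while cur is not None:
                path.append(cur)
                cur = prev[cur]
            return path[::-1]
        x, y = cur
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nxt in walkable and nxt not in prev:
                prev[nxt] = cur
                queue.append(nxt)
    return None
```

BFS on a uniform grid returns a minimum-length path, matching the embodiment's choice of the shortest of the candidate grid paths (L1 over L2).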
- The side length of the grid units shown in the diagrams of this embodiment is equal to the diameter of one robot body.
- The chip of the present invention is used to store a program for controlling a robot to execute the above grid-map-based method of monitoring a pet.
- Through wireless communication with the wireless signal device on the pet, the robot determines the positional relationship between the pet and itself, and then judges whether an obstacle unit lies between the grid units corresponding to the robot and the pet in the grid map, i.e., whether an obstacle lies between the robot and the pet. If not, the robot's current position and shooting direction can effectively capture the pet, and neither needs to change. If the pet runs, the robot rotates its body to keep the camera facing the pet at all times; during this process the robot can also walk to other locations, unless it is blocked by obstacles.
- In that case the robot needs to reselect the monitoring position by judging the state of the grid units around the pet: within the preset area around the pet, the traversed units are selected one by one as pending monitoring units in order of their distance from the robot, from near to far, and it is then judged whether the straight-line grid path between the pending monitoring unit and the pet's grid unit contains an obstacle unit, i.e., whether the pet can be effectively monitored from the pending monitoring unit's position. If there is no obstacle occlusion, the pending monitoring unit is determined as the monitoring unit; if there is, the next traversed unit is analyzed.
- By analyzing the traversed units in the preset area around the pet from near to far, the robot can quickly find a position from which it can effectively monitor the pet, improving the efficiency of monitoring.
- Alternatively, the traversed unit farthest from the robot may be used as the monitoring unit.
- The distribution of obstacles has a characteristic: obstacles are generally concentrated in one or several areas, so if one obstacle unit is detected in an area, other obstacle units are likely to be present there as well.
- When the robot detects an obstacle at its current position, the probability of obstacle units appearing decreases, within a certain range, with distance from that position; by taking the traversed unit in the preset area farthest from the robot as the monitoring unit, the robot can be placed in a relatively empty area.
- There the monitoring position or monitoring angle can be adjusted more conveniently, with less interference from adjacent obstacles, which improves the monitoring efficiency.
- By the above method of monitoring a pet in combination with the grid map, the chip of the present invention can control the robot to find a better monitoring position, avoiding the problem of the view being easily blocked by obstacles and the monitoring effect being affected, thereby improving the effect of monitoring the pet.
Landscapes
- Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Physics & Mathematics (AREA)
- Multimedia (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Aviation & Aerospace Engineering (AREA)
- Automation & Control Theory (AREA)
- Remote Sensing (AREA)
- Life Sciences & Earth Sciences (AREA)
- Environmental Sciences (AREA)
- Biodiversity & Conservation Biology (AREA)
- Animal Husbandry (AREA)
- Biophysics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Electromagnetism (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Manipulator (AREA)
Abstract
Description
Claims (10)
- A method for a robot to monitor a pet based on a grid map, characterized by comprising the following steps: step 1, based on the grid map constructed by the robot, determining the current position point of the robot in the grid map and the corresponding grid unit; step 2, based on wireless communication between the robot and a wireless signal device on the pet, determining the mutual positional relationship between the pet and the robot, and determining the current position point of the pet and the corresponding grid unit according to that relationship; step 3, judging whether, between the grid unit where the robot is located and the grid unit where the pet is located, an obstacle unit exists among the grid units within the preset range covered by the shooting angle of the camera with which the robot monitors the pet; if not, keeping the robot's camera facing the pet's shooting direction and returning to step 2; if so, proceeding to step 4; step 4, determining a preset area centered on the grid unit where the pet is located, and, according to the near-to-far distance relationship between the traversed units in the preset area and the robot, taking the traversed units one by one as pending monitoring units and judging whether an obstacle unit exists in the straight-line grid path between the pending monitoring unit and the grid unit where the pet is located; if not, determining the pending monitoring unit as the monitoring unit and proceeding to step 5; if so, judging whether the next traversed unit is the farthest from the robot; if not, returning to step 4; if so, directly determining the next traversed unit as the monitoring unit and proceeding to step 5; step 5, controlling the robot to walk from the current position point to the monitoring unit and monitor the pet; wherein an obstacle unit is a grid unit corresponding to a detected obstacle, and a traversed unit is a grid unit the robot has already walked through.
- The method according to claim 1, characterized in that determining, in step 1, the current position point of the robot in the grid map and the corresponding grid unit based on the grid map constructed by the robot comprises the following steps: constructing, from the data detected by the robot while walking, a grid map in an XY coordinate system with (X0, Y0) as the origin; determining the side length of the grid units in the grid map as L; and determining, based on the robot's own positioning data, the coordinates of the robot's current position point as (X1, Y1), so that the grid coordinates of the corresponding grid unit are (S11, S12), with S11 = (X1-X0)/L and S12 = (Y1-Y0)/L, S11 and S12 both rounded to integers.
- The method according to claim 2, characterized in that determining, in step 2, the mutual positional relationship between the pet and the robot based on wireless communication between the robot and the wireless signal device on the pet, and determining the pet's current position point and corresponding grid unit according to that relationship, comprises the following steps: determining the distance between the first UWB positioning base station and the second UWB positioning base station on the robot body as W; determining the coordinates of the first UWB positioning base station as (X11, Y11) and of the second UWB positioning base station as (X12, Y12); determining, based on the wireless communication of the two base stations with the UWB positioning tag on the pet, the first distance from the tag to the first base station as R1 and the second distance from the tag to the second base station as R2; determining the angle formed, with the first UWB positioning base station as vertex, by the lines pointing respectively to the second base station and to the tag as the first angle α1, with α1 = arccos((W²+R2²-R1²)/(2*W*R2)); determining the angle formed, with the second UWB positioning base station as vertex, by the lines pointing respectively to the first base station and to the tag as the second angle α2, with α2 = arccos((W²+R1²-R2²)/(2*W*R1)); determining the coordinates of the tag's current position point as (Xc, Yc), with Xc = X12+R2*cos(180°-α1-arccos((X12-X11)/W)) and Yc = Y11+R1*cos(180°-α2-arcsin((X12-X11)/W)); and determining the grid coordinates of the grid unit corresponding to the tag's current position point as (S21, S22), with S21 = (Xc-X0)/L and S22 = (Yc-Y0)/L, S21 and S22 both rounded to integers.
- The method according to claim 3, characterized in that determining the coordinates of the first UWB positioning base station on the robot body as (X11, Y11) and of the second as (X12, Y12) comprises the following steps: determining the coordinates of the center point of the robot body as the coordinates of the robot's current position point, (X1, Y1); determining that the center point of the robot body is at the midpoint of the line connecting the first and second UWB positioning base stations; determining the distance between the two base stations as W, so that the distance from the center point of the robot body to each base station is W/2; determining the robot's current direction detected by its gyroscope as α; determining the coordinates of the first base station as (X11, Y11), with X11 = X1-((W*cosα)/2) and Y11 = Y1+((W*sinα)/2); and determining the coordinates of the second base station as (X12, Y12), with X12 = X1+((W*cosα)/2) and Y12 = Y1-((W*sinα)/2).
- The method according to claim 3, characterized in that determining the first distance R1 from the UWB positioning tag to the first UWB positioning base station and the second distance R2 from the tag to the second base station comprises the following steps: determining the propagation speed of the radio wave as c; determining the time from the first base station sending ranging data to the tag until it receives the tag's acknowledgment signal as T11; determining the time from the tag receiving the ranging data sent by the first base station until it sends an acknowledgment signal as T12; determining the time from the tag sending ranging data to the first base station until it receives that base station's acknowledgment signal as T13; determining the time from the first base station receiving the ranging data sent by the tag until it sends an acknowledgment signal as T14; determining the first distance as R1 = c*(T11-T12+T13-T14)/4; determining the time from the second base station sending ranging data to the tag until it receives the tag's acknowledgment signal as T21; determining the time from the tag receiving the ranging data sent by the second base station until it sends an acknowledgment signal as T22; determining the time from the tag sending ranging data to the second base station until it receives that base station's acknowledgment signal as T23; determining the time from the second base station receiving the ranging data sent by the tag until it sends an acknowledgment signal as T24; and determining the second distance as R2 = c*(T21-T22+T23-T24)/4.
- The method according to claim 1, characterized in that judging, in step 3, whether an obstacle unit exists among the grid units within the preset range covered by the shooting angle of the camera with which the robot monitors the pet, between the grid unit where the robot is located and the grid unit where the pet is located, comprises the following steps: determining the direction in which the camera faces the pet as the shooting direction; determining, based on the shooting direction, the shooting area covered by the camera's shooting angle in the grid map; determining the grid units corresponding to the coverage area, in the grid map, of the angular range formed by a first corner edge and a second corner edge extending outward from the camera as the angle vertex, wherein the coverage area is smaller than and located within the shooting area; and analyzing whether an obstacle unit exists among the grid units corresponding to the coverage area.
- The method according to claim 1, characterized in that determining, in step 4, the preset area centered on the grid unit where the pet is located and taking the traversed units one by one as pending monitoring units according to their near-to-far distance from the robot comprises the following steps: determining a circular area whose center is the center of the grid unit where the pet is located and whose radius is a preset length; taking the traversed unit in the circular area closest to the robot as the pending monitoring unit; if the straight-line grid path between the pending monitoring unit and the pet's grid unit contains an obstacle unit, and the traversed unit second closest to the robot in the circular area is not the farthest from the robot, taking that second-closest traversed unit as the pending monitoring unit; if the straight-line grid path between this pending monitoring unit and the pet's grid unit contains an obstacle unit, and the traversed unit third closest to the robot in the circular area is not the farthest from the robot, taking the third-closest traversed unit as the pending monitoring unit; and so on.
- The method according to claim 7, characterized in that the preset length is any value in the range of 1 meter to 2 meters.
- The method according to claim 1, characterized in that controlling the robot, in step 5, to walk from the current position point to the monitoring unit and monitor the pet comprises the following steps: searching the grid map in the direction of the monitoring unit, starting from the robot's current position point; determining, among the grid paths between the robot's current position point and the center point of the monitoring unit that are directly connected by traversed units, the grid path with the shortest path length as the navigation grid path; determining the center points of the grid units in the navigation grid path as navigation position points and connecting the navigation position points to form a navigation path; controlling the robot to walk along the navigation path from the current position point to the monitoring position point; and adjusting the robot's direction so that the shooting direction of its camera is aligned with the direction in which the pet is located.
- A chip for storing a program, characterized in that the program is used to control a robot to execute the method for a robot to monitor a pet based on a grid map according to any one of claims 1 to 9.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/768,697 US11470821B2 (en) | 2017-12-07 | 2018-07-06 | Method for monitoring pet by robot based on grid map and chip |
KR1020207019551A KR102320370B1 (ko) | 2017-12-07 | 2018-07-06 | 격자 지도에 기반한 로봇의 애완동물 감시 방법 및 칩 |
JP2020531027A JP7136898B2 (ja) | 2017-12-07 | 2018-07-06 | 格子地図に基づくロボットのペット監視方法及びチップ |
EP18886384.9A EP3723423B1 (en) | 2017-12-07 | 2018-07-06 | Method and chip for monitoring pet on the basis of robot employing grid map |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711281586.3A CN108012326B (zh) | 2017-12-07 | 2017-12-07 | 基于栅格地图的机器人监视宠物的方法及芯片 |
CN201711281586.3 | 2017-12-07 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019109635A1 true WO2019109635A1 (zh) | 2019-06-13 |
Family
ID=62057124
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2018/094744 WO2019109635A1 (zh) | 2017-12-07 | 2018-07-06 | 基于栅格地图的机器人监视宠物的方法及芯片 |
Country Status (6)
Country | Link |
---|---|
US (1) | US11470821B2 (zh) |
EP (1) | EP3723423B1 (zh) |
JP (1) | JP7136898B2 (zh) |
KR (1) | KR102320370B1 (zh) |
CN (1) | CN108012326B (zh) |
WO (1) | WO2019109635A1 (zh) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112212863A (zh) * | 2019-07-09 | 2021-01-12 | 苏州科瓴精密机械科技有限公司 | 栅格地图的创建方法及创建系统 |
CN112947435A (zh) * | 2021-02-04 | 2021-06-11 | 沈阳仪表科学研究院有限公司 | 一种用于爬壁机器人的导航控制方法 |
CN114302326A (zh) * | 2021-12-24 | 2022-04-08 | 珠海优特电力科技股份有限公司 | 定位区域的确定方法、定位方法、装置和定位设备 |
Families Citing this family (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108012326B (zh) | 2017-12-07 | 2019-06-11 | 珠海市一微半导体有限公司 | 基于栅格地图的机器人监视宠物的方法及芯片 |
CN108366343B (zh) * | 2018-03-20 | 2019-08-09 | 珠海市一微半导体有限公司 | 机器人智能监视宠物的方法 |
US11126198B2 (en) * | 2018-12-30 | 2021-09-21 | Ubtech Robotics Corp | Robot movement control method, apparatus and robot using the same |
US11981032B2 (en) * | 2019-03-15 | 2024-05-14 | Omron Corporation | Robot control device, method and program for a recovery after an obstruction |
CN109901590B (zh) * | 2019-03-30 | 2020-06-05 | 珠海市一微半导体有限公司 | 桌面机器人的回充控制方法 |
CN110488876A (zh) * | 2019-08-20 | 2019-11-22 | 斯威方德(深圳)智能科技有限公司 | 宠物喂食的方法、装置、存储介质以及计算机设备 |
CN112415524A (zh) * | 2019-08-23 | 2021-02-26 | 深圳市优必选科技股份有限公司 | 机器人及其定位导航方法和装置 |
CN111121754A (zh) * | 2019-12-31 | 2020-05-08 | 深圳市优必选科技股份有限公司 | 移动机器人定位导航方法、装置、移动机器人及存储介质 |
CN113532436B (zh) * | 2021-07-12 | 2024-05-10 | 中通服咨询设计研究院有限公司 | 一种室内相对位置定位方法 |
JP7467400B2 (ja) | 2021-09-22 | 2024-04-15 | Kddi株式会社 | 移動店舗管理システム、移動店舗管理装置、移動店舗管理方法及びコンピュータプログラム |
CN114037807B (zh) * | 2021-11-24 | 2023-03-28 | 深圳市云鼠科技开发有限公司 | 低内存占用的链式栅格地图构建方法、装置及计算机设备 |
CN114265412B (zh) * | 2021-12-29 | 2023-10-24 | 深圳创维数字技术有限公司 | 车辆控制方法、装置、设备及计算机可读存储介质 |
KR102478305B1 (ko) | 2022-02-04 | 2022-12-19 | 주식회사 디디케어스 | 웨어러블 디바이스로 수집된 정보를 기초로 반려동물의 보험금 지급을 결정하는 방법 및 관리서버 |
KR102597771B1 (ko) | 2022-04-14 | 2023-11-06 | 주식회사 디디케어스 | 반려동물의 약품구매를 중개하는 방법 및 서버 |
CN115019167B (zh) * | 2022-05-26 | 2023-11-07 | 中国电信股份有限公司 | 基于移动终端的融合定位方法、系统、设备及存储介质 |
KR102453797B1 (ko) | 2022-06-16 | 2022-10-14 | 주식회사 디디케어스 | 효율적인 건강검진을 통한 반려동물의 건강관리 방법 및 서버 |
KR102500380B1 (ko) | 2022-06-28 | 2023-02-16 | 주식회사 디디케어스 | 반려견 견주전용 대출 심사 방법 및 서버 |
CN115437299A (zh) * | 2022-10-10 | 2022-12-06 | 北京凌天智能装备集团股份有限公司 | 一种伴随运输机器人行进控制方法及系统 |
CN115685223B (zh) * | 2022-12-15 | 2023-03-21 | 深圳市智绘科技有限公司 | 位置识别方法、装置、电子设备及可读存储介质 |
CN116068956B (zh) * | 2023-03-07 | 2023-07-14 | 深圳华龙讯达信息技术股份有限公司 | 基于plc的生产线过程监视系统及方法 |
KR102574972B1 (ko) | 2023-05-04 | 2023-09-06 | 주식회사 디디케어스 | 맞춤형 반려견 사료 추천 방법 및 그 서버 |
KR102574975B1 (ko) | 2023-05-12 | 2023-09-06 | 주식회사 디디케어스 | Ai기반 반려견에 대한 건강검진대상 판별 방법 및 그 서버 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040093219A1 (en) * | 2002-11-13 | 2004-05-13 | Ho-Chul Shin | Home robot using home server, and home network system having the same |
CN101278654A (zh) * | 2007-09-26 | 2008-10-08 | 深圳先进技术研究院 | 一种宠物看护机器人系统 |
CN106172059A (zh) * | 2016-08-31 | 2016-12-07 | 长沙长泰机器人有限公司 | 宠物喂养机器人 |
CN106584472A (zh) * | 2016-11-30 | 2017-04-26 | 北京贝虎机器人技术有限公司 | 用于控制自主移动式设备的方法及装置 |
CN106982741A (zh) * | 2017-04-06 | 2017-07-28 | 南京三宝弘正视觉科技有限公司 | 一种宠物监控机器人和系统 |
CN108012326A (zh) * | 2017-12-07 | 2018-05-08 | 珠海市微半导体有限公司 | 基于栅格地图的机器人监视宠物的方法及芯片 |
Family Cites Families (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07193742A (ja) * | 1993-12-27 | 1995-07-28 | Hitachi Ltd | ビデオカメラ |
NL1012137C2 (nl) * | 1999-05-25 | 2000-11-28 | Lely Res Holding | Onbemand voertuig dat inzetbaar is in een stal of een weide. |
JP3913186B2 (ja) | 2003-03-28 | 2007-05-09 | 株式会社東芝 | 移動撮影機器 |
US20090021367A1 (en) * | 2007-07-19 | 2009-01-22 | Davies Daniel F | Apparatus, system, and method for tracking animals |
US20100179689A1 (en) * | 2009-01-09 | 2010-07-15 | National Taiwan University Of Science And Technology | Method of teaching robotic system |
US8493407B2 (en) * | 2009-09-03 | 2013-07-23 | Nokia Corporation | Method and apparatus for customizing map presentations based on user interests |
US11249495B2 (en) * | 2012-09-19 | 2022-02-15 | Botsitter, Llc | Method and system for remote monitoring, care and maintenance of animals |
KR102165437B1 (ko) * | 2014-05-02 | 2020-10-14 | 한화디펜스 주식회사 | 이동 로봇의 경로 계획 장치 |
KR101583376B1 (ko) * | 2014-06-05 | 2016-01-08 | 김명환 | 동물 휴대폰 서비스 시스템 및 방법 |
US9475195B2 (en) * | 2014-09-12 | 2016-10-25 | Toyota Jidosha Kabushiki Kaisha | Anticipatory robot navigation |
WO2016065625A1 (en) | 2014-10-31 | 2016-05-06 | SZ DJI Technology Co., Ltd. | Systems and methods for walking pets |
KR20170053351A (ko) * | 2015-11-06 | 2017-05-16 | 삼성전자주식회사 | 청소 로봇 및 그 제어 방법 |
US10409292B2 (en) * | 2015-12-10 | 2019-09-10 | Panasonic Intellectual Property Corporation Of America | Movement control method, autonomous mobile robot, and recording medium storing program |
JP2017114270A (ja) | 2015-12-24 | 2017-06-29 | 株式会社ナカヨ | 特定ビーコン追跡機能を有する無人飛行体および追跡ビーコン発信ユニット |
CN105706951B (zh) | 2016-04-18 | 2019-03-08 | 宁波力芯科信息科技有限公司 | 一种智能宠物项圈及其实现方法 |
CN106125730B (zh) * | 2016-07-10 | 2019-04-30 | 北京工业大学 | 一种基于鼠脑海马空间细胞的机器人导航地图构建方法 |
US10354515B2 (en) * | 2016-07-21 | 2019-07-16 | Vivint, Inc. | Methods and system for providing an alarm trigger bypass |
KR102559745B1 (ko) * | 2016-10-13 | 2023-07-26 | 엘지전자 주식회사 | 공항 로봇 및 그를 포함하는 공항 로봇 시스템 |
CN106577345B (zh) * | 2016-10-27 | 2023-08-29 | 重庆掌中花园科技有限公司 | 一种智能宠物互动系统 |
CN107368079B (zh) * | 2017-08-31 | 2019-09-06 | 珠海市一微半导体有限公司 | 机器人清扫路径的规划方法及芯片 |
US11016491B1 (en) * | 2018-01-26 | 2021-05-25 | X Development Llc | Trajectory planning for mobile robots |
US20190286145A1 (en) * | 2018-03-14 | 2019-09-19 | Omron Adept Technologies, Inc. | Method and Apparatus for Dynamic Obstacle Avoidance by Mobile Robots |
CN108366343B (zh) * | 2018-03-20 | 2019-08-09 | 珠海市一微半导体有限公司 | 机器人智能监视宠物的方法 |
KR102466940B1 (ko) * | 2018-04-05 | 2022-11-14 | 한국전자통신연구원 | 로봇 주행용 위상 지도 생성 장치 및 방법 |
EP3627250B1 (en) * | 2018-09-21 | 2023-12-06 | Tata Consultancy Services Limited | Method and system for free space detection in a cluttered environment |
CN109540142B (zh) * | 2018-11-27 | 2021-04-06 | 达闼科技(北京)有限公司 | 一种机器人定位导航的方法、装置、计算设备 |
CN109946715B (zh) * | 2019-04-09 | 2021-06-25 | 云鲸智能科技(东莞)有限公司 | 探测方法、装置、移动机器人及存储介质 |
KR102302575B1 (ko) * | 2019-07-16 | 2021-09-14 | 엘지전자 주식회사 | 이동 로봇 및 그 제어방법 |
CN110703747B (zh) * | 2019-10-09 | 2021-08-03 | 武汉大学 | 一种基于简化广义Voronoi图的机器人自主探索方法 |
KR20210056694A (ko) * | 2019-11-11 | 2021-05-20 | 엘지전자 주식회사 | 충돌을 회피하는 방법 및 이를 구현한 로봇 및 서버 |
CN111006666B (zh) * | 2019-11-21 | 2021-10-29 | 深圳市优必选科技股份有限公司 | 机器人路径规划方法、装置、存储介质和机器人 |
CN111024100B (zh) * | 2019-12-20 | 2021-10-29 | 深圳市优必选科技股份有限公司 | 一种导航地图更新方法、装置、可读存储介质及机器人 |
US11822340B2 (en) * | 2020-03-06 | 2023-11-21 | Edda Technology, Inc. | Method and system for obstacle avoidance in robot path planning using depth sensors |
US11454974B2 (en) * | 2020-06-29 | 2022-09-27 | Baidu Usa Llc | Method, apparatus, device, and storage medium for controlling guide robot |
US20220206505A1 (en) * | 2020-12-30 | 2022-06-30 | Southeast University | Geometric folding full coverage path for robot and method for generating same |
- 2017
- 2017-12-07 CN CN201711281586.3A patent/CN108012326B/zh active Active
- 2018
- 2018-07-06 JP JP2020531027A patent/JP7136898B2/ja active Active
- 2018-07-06 US US16/768,697 patent/US11470821B2/en active Active
- 2018-07-06 KR KR1020207019551A patent/KR102320370B1/ko active IP Right Grant
- 2018-07-06 EP EP18886384.9A patent/EP3723423B1/en active Active
- 2018-07-06 WO PCT/CN2018/094744 patent/WO2019109635A1/zh unknown
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040093219A1 (en) * | 2002-11-13 | 2004-05-13 | Ho-Chul Shin | Home robot using home server, and home network system having the same |
CN101278654A (zh) * | 2007-09-26 | 2008-10-08 | 深圳先进技术研究院 | 一种宠物看护机器人系统 |
CN106172059A (zh) * | 2016-08-31 | 2016-12-07 | 长沙长泰机器人有限公司 | 宠物喂养机器人 |
CN106584472A (zh) * | 2016-11-30 | 2017-04-26 | 北京贝虎机器人技术有限公司 | 用于控制自主移动式设备的方法及装置 |
CN106982741A (zh) * | 2017-04-06 | 2017-07-28 | 南京三宝弘正视觉科技有限公司 | 一种宠物监控机器人和系统 |
CN108012326A (zh) * | 2017-12-07 | 2018-05-08 | 珠海市微半导体有限公司 | 基于栅格地图的机器人监视宠物的方法及芯片 |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112212863A (zh) * | 2019-07-09 | 2021-01-12 | 苏州科瓴精密机械科技有限公司 | 栅格地图的创建方法及创建系统 |
CN112947435A (zh) * | 2021-02-04 | 2021-06-11 | 沈阳仪表科学研究院有限公司 | 一种用于爬壁机器人的导航控制方法 |
CN114302326A (zh) * | 2021-12-24 | 2022-04-08 | 珠海优特电力科技股份有限公司 | 定位区域的确定方法、定位方法、装置和定位设备 |
CN114302326B (zh) * | 2021-12-24 | 2023-05-23 | 珠海优特电力科技股份有限公司 | 定位区域的确定方法、定位方法、装置和定位设备 |
Also Published As
Publication number | Publication date |
---|---|
EP3723423A4 (en) | 2021-08-25 |
EP3723423B1 (en) | 2023-09-06 |
KR20200096606A (ko) | 2020-08-12 |
EP3723423A1 (en) | 2020-10-14 |
JP2021505150A (ja) | 2021-02-18 |
KR102320370B1 (ko) | 2021-11-02 |
US20210169049A1 (en) | 2021-06-10 |
JP7136898B2 (ja) | 2022-09-13 |
CN108012326B (zh) | 2019-06-11 |
US11470821B2 (en) | 2022-10-18 |
EP3723423C0 (en) | 2023-09-06 |
CN108012326A (zh) | 2018-05-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2019109635A1 (zh) | 基于栅格地图的机器人监视宠物的方法及芯片 | |
WO2019179001A1 (zh) | 机器人智能监视宠物的方法 | |
CN105115497B (zh) | 一种可靠的室内移动机器人精确导航定位系统及方法 | |
US20200037498A1 (en) | Moving robot, method for controlling moving robot, and moving robot system | |
US20070271011A1 (en) | Indoor map building apparatus, method, and medium for mobile robot | |
US20220161430A1 (en) | Recharging Control Method of Desktop Robot | |
WO2019037668A1 (zh) | 自移动机器人及其行走方法、显示障碍物分布的方法 | |
CN112214015A (zh) | 自移动机器人及其回充的方法、系统及计算机存储介质 | |
US11465275B2 (en) | Mobile robot and method of controlling the same and mobile robot system | |
KR20200015880A (ko) | 스테이션 장치 및 이동 로봇 시스템 | |
CN103472434B (zh) | 一种机器人声音定位方法 | |
WO2018228254A1 (zh) | 一种移动电子设备以及该移动电子设备中的方法 | |
CN107356902B (zh) | 一种WiFi定位指纹数据自动采集方法 | |
Cho et al. | Localization of a high-speed mobile robot using global features | |
JP6699034B2 (ja) | 自律移動ロボット | |
CN110231627A (zh) | 基于可见光定位的服务机器人运行路径计算方法 | |
CN104848852B (zh) | 一种环形传感阵列的定位系统和方法 | |
CN204649206U (zh) | 一种环形传感阵列的定位系统 | |
KR101339899B1 (ko) | 스마트폰 기반 로봇위치 자가 측위 방법 | |
Chen et al. | Multi-Mobile Robot Localization and Navigation based on Visible Light Positioning | |
KR102100478B1 (ko) | 복수의 자율주행 이동 로봇 | |
Saim et al. | A localization approach in a distributed multiagent environment | |
TW201444515A (zh) | 清掃機器人及清掃機器人的定位方法 | |
KR20220144477A (ko) | 실내 자동 이동 시스템 및 이를 이용한 서빙 로봇 | |
CN108459591A (zh) | 自主移动设备 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18886384 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2020531027 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 20207019551 Country of ref document: KR Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2018886384 Country of ref document: EP Effective date: 20200707 |