CN116974275A - Creating virtual boundaries for robotic garden tools - Google Patents


Info

Publication number
CN116974275A
Authority
CN
China
Prior art keywords
garden tool
target
electronic processor
robotic
robotic garden
Prior art date
Legal status
Pending
Application number
CN202310444612.9A
Other languages
Chinese (zh)
Inventor
D·G·福特
李希文
黎学深
蔡文浩
吴灏林
李承轩
Current Assignee
Techtronic Cordless GP
Original Assignee
Techtronic Cordless GP
Priority date
Filing date
Publication date
Application filed by Techtronic Cordless GP filed Critical Techtronic Cordless GP
Publication of CN116974275A


Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

A communication system may include a robotic garden tool having an electronic processor that may be configured to (i) determine a plurality of relative distances between the robotic garden tool and a target as the target moves in a work area and (ii) determine one or more positions of the robotic garden tool as the target moves in the work area. The electronic processor may be further configured to determine, for each of the plurality of relative distances, the respective position of the robotic garden tool at the time at which the data used to determine that relative distance was captured. The virtual boundary may be generated using each relative distance in combination with the respective position of the robotic garden tool at that time.

Description

Creating virtual boundaries for robotic garden tools
Cross Reference to Related Applications
The present application claims priority from U.S. provisional application number 63/370,628 (attorney docket number 206737-9054-US02), filed in May 2022, and U.S. provisional application number 63/335,944 (attorney docket number 206737-9054-US01), filed in April 2022, the entire contents of each of which are incorporated herein by reference.
Technical Field
The present disclosure relates to robotic garden tools, and in particular, to methods and systems for creating one or more virtual boundaries for robotic garden tools within a work area.
Disclosure of Invention
One embodiment includes a communication system that may include a robotic garden tool. The robotic garden tool may include a housing and a set of wheels coupled to the housing and configured to rotate to propel the robotic garden tool over a work surface in a work area. The robotic garden tool may also include at least one wheel motor coupled to one or more wheels of the set of wheels. The at least one wheel motor may be configured to drive rotation of the one or more wheels. The robotic garden tool may also include an electronic processor that may be configured to determine a plurality of relative distances between the robotic garden tool and a target as the target moves in the work area. The electronic processor may be further configured to determine one or more positions of the robotic garden tool as the target moves in the work area. The electronic processor may be further configured to determine, for each of the plurality of relative distances, the respective position of the one or more positions of the robotic garden tool at the time at which the data used to determine that relative distance was captured. The virtual boundary may be generated using each of the plurality of relative distances in combination with the respective position of the robotic garden tool at that time. The electronic processor may be further configured to control the robotic garden tool to be constrained by the virtual boundary to remain within the work area during operation of the robotic garden tool.
In addition to any combination of the above features, the electronic processor may be configured to determine a plurality of waypoints. Each of the plurality of waypoints may be based on a respective one of the plurality of relative distances and the respective location of the robotic garden tool at the time at which the data used to determine that relative distance was captured. The electronic processor may also be configured to generate the virtual boundary using the waypoints.
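As an illustration of the waypoint computation described above, the following Python sketch projects a waypoint from the tool's position using a relative distance and a bearing. The bearing input is an assumption (the patent suggests direction information could come from, e.g., the target's position in an image); this is an illustrative interpretation, not the patent's implementation.

```python
import math

def compute_waypoint(robot_pos, rel_distance, bearing_rad):
    """Project a waypoint from the tool's position along the bearing
    to the target. The bearing argument is a hypothetical input; the
    patent derives direction data from, e.g., camera images."""
    x, y = robot_pos
    return (x + rel_distance * math.cos(bearing_rad),
            y + rel_distance * math.sin(bearing_rad))

# Tool at the origin, target 3 m away at bearing 0 rad:
waypoint = compute_waypoint((0.0, 0.0), 3.0, 0.0)  # → (3.0, 0.0)
```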
In addition to any combination of the above features, the communication system may include a network interface configured to allow the electronic processor to communicate with a base station device. The base station device may be configured to receive a position signal from a satellite and transmit calibration information regarding the position signal to the robotic garden tool. The electronic processor may be configured to receive the position signal from the satellite, receive the calibration information from the base station device, and determine one or more positions of the robotic garden tool based on: (i) the position signal; and (ii) the calibration information.
In addition to any combination of the above features, the electronic processor may be configured to receive the position signal via a first real-time kinematic global navigation satellite system (RTK GNSS) receiver of the robotic garden tool. The electronic processor may be configured to receive the calibration information via a first radio frequency transceiver of the robotic garden tool. The base station apparatus may be configured to receive the position signal via a second RTK GNSS receiver of the base station apparatus. The base station device may be further configured to transmit the calibration information via a second radio frequency transceiver of the base station device.
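The differential-correction idea behind the RTK GNSS arrangement above can be sketched as follows. This is a deliberate simplification (real RTK processing works on carrier-phase measurements, not position offsets), and all names are illustrative:

```python
def apply_rtk_correction(rover_raw_fix, base_known_pos, base_raw_fix):
    """The base station occupies a surveyed position, so the offset
    between its raw satellite fix and its known position approximates
    the error shared with a nearby rover; subtracting that offset
    refines the rover's fix. Positions are (x, y) tuples in metres."""
    err_x = base_raw_fix[0] - base_known_pos[0]
    err_y = base_raw_fix[1] - base_known_pos[1]
    return (rover_raw_fix[0] - err_x, rover_raw_fix[1] - err_y)

# Base at surveyed (0, 0) reports a raw fix of (0.5, -0.2):
corrected = apply_rtk_correction((10.5, 4.8), (0.0, 0.0), (0.5, -0.2))
# corrected == (10.0, 5.0)
```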
In addition to any combination of the above features, the electronic processor may be configured to control operation of the at least one wheel motor to control movement of the robotic garden tool such that the robotic garden tool moves towards the target when the target moves in the work area.
In addition to any combination of the above features, the electronic processor may be configured to control operation of the at least one wheel motor to control movement of the robotic garden tool such that the robotic garden tool moves towards the target when the target moves in the work area, by at least one of: (i) determining a Received Signal Strength Indication (RSSI) of a signal output by the target, and controlling movement of the robotic garden tool such that the RSSI of the signal output by the target is equal to or higher than a predetermined RSSI threshold; (ii) determining that a relative distance of the plurality of relative distances is greater than or equal to a predetermined threshold, and controlling movement of the robotic garden tool towards the target until the relative distance between the robotic garden tool and the target decreases below the predetermined threshold; and (iii) receiving a command from the target, wherein the command includes instructions for controlling the robotic garden tool to move towards the target.
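The three follow-the-target triggers listed in this claim can be expressed as a small decision function. The threshold values below are assumptions chosen purely for illustration:

```python
RSSI_THRESHOLD_DBM = -70.0   # assumed value, not from the patent
DISTANCE_THRESHOLD_M = 2.0   # assumed value, not from the patent

def should_move_toward_target(rssi_dbm=None, rel_distance_m=None, command=None):
    """Return True when any of the claim's triggers fires:
    (i) the target's signal RSSI has dropped below the threshold,
    (ii) the relative distance meets or exceeds the distance threshold,
    (iii) the target has sent an explicit movement command."""
    if command is not None:
        return True
    if rssi_dbm is not None and rssi_dbm < RSSI_THRESHOLD_DBM:
        return True
    if rel_distance_m is not None and rel_distance_m >= DISTANCE_THRESHOLD_M:
        return True
    return False
```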
In addition to any combination of the above features, the electronic processor may be configured to time stamp each of the plurality of relative distances with the respective time at which the data used to determine that relative distance was captured. The electronic processor may be further configured to time stamp each of the one or more locations with a second respective time corresponding to the time at which the location was determined. The electronic processor may be further configured to transmit the plurality of relative distances, the one or more locations, and the respective timestamp of each of the plurality of relative distances and the one or more locations to a remote device. The remote device may be configured to generate the virtual boundary using the plurality of relative distances, the one or more locations, and the respective timestamps. The electronic processor may also be configured to receive the virtual boundary from the remote device.
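The timestamp bookkeeping described above amounts to pairing each relative-distance sample with the tool location whose timestamp is closest. A minimal sketch, assuming distances arrive as (timestamp, distance) tuples and locations as (timestamp, (x, y)) tuples:

```python
def pair_by_timestamp(distances, locations):
    """For each (t, d) distance sample, select the tool location whose
    timestamp is nearest to t, mirroring how a remote device could
    reassemble the transmitted samples into (distance, location) pairs."""
    pairs = []
    for t_dist, dist in distances:
        _, nearest_loc = min(locations, key=lambda item: abs(item[0] - t_dist))
        pairs.append((dist, nearest_loc))
    return pairs
```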
In addition to any combination of the above features, the electronic processor may be configured to receive a plurality of images captured by a camera as the target moves in the work area. Each image of the plurality of images may include the target. The electronic processor may be further configured to determine each of the plurality of relative distances based on the position and orientation of the target in a respective image of the plurality of images.
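One common way to turn a target's appearance in an image into a relative distance is the pinhole-camera relation distance ≈ f · H / h. The patent does not specify this formula; the focal length and real-world target height below are assumed calibration inputs:

```python
def estimate_distance(focal_px, real_height_m, pixel_height):
    """Pinhole-camera estimate: an object of real height H metres that
    spans h pixels under a focal length of f pixels is roughly
    f * H / h metres from the camera."""
    return focal_px * real_height_m / pixel_height

# A 1.8 m target spanning 360 px under an 800 px focal length:
d = estimate_distance(800.0, 1.8, 360.0)  # → 4.0 m
```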
In addition to any combination of the above features, the camera may be integrated into the housing of the robotic garden tool.
In addition to any combination of the above features, the camera may be integrated into an external device, and the robotic garden tool may comprise a securing device for securing the external device to the robotic garden tool. The robotic garden tool may be configured to receive the plurality of images from the external device.
In addition to any combination of the above features, the target may include a human user, and the electronic processor may be configured to identify the target within each of the plurality of images using image analysis techniques based on an expected shape of the human user.
In addition to any combination of the above features, the target may include a fiducial marker, and the electronic processor may be configured to identify the target within each of the plurality of images using image analysis techniques based on an expected design of the fiducial marker.
In addition to any combination of the above features, the communication system may include a millimeter wave radar device. The electronic processor may be configured to receive a plurality of data samples captured by the millimeter wave radar device as the target moves in the work area. Each of the plurality of data samples may include data indicative of a respective location of the target. The electronic processor may be configured to determine each of the plurality of relative distances based on a respective location of the target in each of the plurality of data samples.
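A radar data sample might reduce to a list of (range, height) detections, from which the target's relative distance is the nearest detection consistent with a person. The height gate below is a crude, assumed stand-in for the shape-based identification the claim describes:

```python
def relative_distance_from_sample(detections, height_range=(1.4, 2.0)):
    """detections: list of (range_m, height_m) radar returns.
    Return the smallest range whose height falls inside the assumed
    human height band, or None when nothing matches."""
    candidates = [r for r, h in detections
                  if height_range[0] <= h <= height_range[1]]
    return min(candidates) if candidates else None
```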
In addition to any combination of the above features, the target may include a human user, and the electronic processor, the millimeter wave radar device, or both the electronic processor and the millimeter wave radar device may be configured to identify the target within each of the plurality of data samples based on an expected shape of the human user.
In addition to any combination of the above features, the communication system may include a server device configured to receive the plurality of relative distances and, for each of the plurality of relative distances, the respective location of the robotic garden tool at the time at which the data used to determine that relative distance was captured. The server device may be configured to generate the virtual boundary using each of the plurality of relative distances in combination with the respective location of the robotic garden tool at that time.
Another embodiment includes a method of creating a virtual boundary. The method may include determining, with an electronic processor of a robotic garden tool, a plurality of relative distances between the robotic garden tool and a target as the target moves in a work area. The robotic garden tool may include a housing and a set of wheels coupled to the housing and configured to rotate to propel the robotic garden tool over a work surface in the work area. The robotic garden tool may further comprise at least one wheel motor coupled to one or more wheels of the set of wheels, the at least one wheel motor configured to drive rotation of the one or more wheels. The method may further include determining, with the electronic processor, one or more positions of the robotic garden tool as the target moves in the work area. The method may further include determining, with the electronic processor, for each of the plurality of relative distances, the respective position of the one or more positions of the robotic garden tool at the time at which the data used to determine that relative distance was captured. The method may further include generating the virtual boundary using each of the plurality of relative distances in combination with the respective position of the robotic garden tool at that time. The method may further include controlling, with the electronic processor, the robotic garden tool to be constrained by the virtual boundary to remain within the work area during operation of the robotic garden tool.
In addition to any combination of the above features, the method may further include receiving, with the electronic processor, a plurality of images captured by a camera as the target moves in the work area. Each image of the plurality of images may include the target. The method may also include determining, with the electronic processor, each of the plurality of relative distances based on: (i) The position of the target in a respective image of the plurality of images; and (ii) a respective location of the robotic garden tool at a respective time of capturing the respective image.
In addition to any combination of the above features, the camera may be integrated into the housing of the robotic garden tool. In addition to any combination of the above features, the camera may be integrated into an external device, and the robotic garden tool may comprise a securing device for securing the external device to the robotic garden tool. In some examples, receiving the plurality of images includes receiving the plurality of images from the external device with the electronic processor via a network interface of the robotic garden tool.
In addition to any combination of the above features, the target may include a fiducial marker, and the method may further include identifying, with the electronic processor, the target within each of the plurality of images using image analysis techniques based on an expected design of the fiducial marker.
In addition to any combination of the above features, the method may further include receiving, with the electronic processor, a plurality of data samples captured by a millimeter wave radar device as the target moves in the work area. Each of the plurality of data samples may include data indicative of a respective location of the target. The method may also include determining, with the electronic processor, each of the plurality of relative distances based on: (i) A respective location of the target in each of the plurality of data samples; and (ii) a respective location of the robotic garden tool at a respective time of capturing each data sample. In some examples, the target comprises a human user, and the method further comprises identifying the target within each of the plurality of data samples based on an expected shape of the human user using the electronic processor, the millimeter wave radar device, or both the electronic processor and the millimeter wave radar device.
In addition to any combination of the above features, generating the virtual boundary may include generating the virtual boundary with a server device remote from the robotic garden tool.
Another embodiment includes a communication system that may include a robotic garden tool, which may include a housing, a fiducial marker, and a set of wheels coupled to the housing and configured to rotate to propel the robotic garden tool over a work surface in a work area. The robotic garden tool may also include at least one wheel motor coupled to one or more wheels of the set of wheels. The at least one wheel motor may be configured to drive rotation of the one or more wheels. The robotic garden tool may also include a first electronic processor, which may be configured to determine a position of the robotic garden tool. The communication system may also include an external device, which may include a data capture device (e.g., a camera, an object detection device, etc.). The external device may also include a second electronic processor that may be configured to control the data capture device to capture a plurality of data samples (e.g., images) as the external device moves in the work area. Each of the plurality of data samples may include data indicative of a relative position of the robotic garden tool with respect to the external device (e.g., each image may include the fiducial marker of the robotic garden tool to allow the relative distance to be determined). A plurality of locations of the external device may be determined from the plurality of images based on: (i) the position of the fiducial marker in a respective image of the plurality of images; and (ii) the location of the robotic garden tool at the time at which the respective image was captured. Each of the plurality of locations of the external device may be stored as a waypoint. These waypoints may be used to generate the virtual boundary. The robotic garden tool may be configured to be constrained by the virtual boundary to remain within the work area during operation of the robotic garden tool.
In addition to any combination of the above features, the communication system may further comprise a base station device, which may be configured to communicate with the robotic garden tool. The base station device may be configured to receive a position signal from a satellite and transmit calibration information regarding the position signal to the robotic garden tool. The first electronic processor of the robotic garden tool may be configured to receive the position signal from the satellite, receive the calibration information from the base station device, and determine the current position of the robotic garden tool based on: (i) the position signal; and (ii) the calibration information.
In addition to any combination of the above features, the first electronic processor may be configured to receive the position signal via a first real-time kinematic global navigation satellite system (RTK GNSS) receiver of the robotic garden tool. The first electronic processor may be further configured to receive the calibration information via a first radio frequency transceiver of the robotic garden tool. The base station apparatus may be configured to receive the position signal via a second RTK GNSS receiver of the base station apparatus. The base station device may be further configured to transmit the calibration information via a second radio frequency transceiver of the base station device.
In addition to any combination of the above features, the first electronic processor of the robotic garden tool may be configured to control operation of the at least one wheel motor to control movement of the robotic garden tool such that the robotic garden tool moves towards the external device when the external device moves in the work area.
In addition to any combination of the above features, the first electronic processor of the robotic garden tool may be configured to control the operation of the at least one wheel motor to control the movement of the robotic garden tool by at least one of: (i) receiving a current location of the external device from the external device and controlling the robotic garden tool to move towards the current location of the external device, wherein the external device comprises a first Global Positioning System (GPS) receiver configured to receive data to be used for determining the current location of the external device, (ii) determining a Received Signal Strength Indication (RSSI) of a signal output by the external device and controlling the movement of the robotic garden tool such that the RSSI of the signal output by the external device is equal to or higher than a predetermined RSSI threshold, and (iii) receiving a command from the external device, wherein the command comprises instructions on how to control the robotic garden tool to move towards the external device.
In addition to any combination of the above features, the robotic garden tool may comprise a first Global Positioning System (GPS) receiver, and the first electronic processor may be configured to determine the position of the robotic garden tool, at least in part, using the first GPS receiver. The external device may include a second GPS receiver, and the second electronic processor may be configured to determine a location of the external device using the second GPS receiver. In some examples, the first GPS receiver and the second GPS receiver are different types of GPS receivers such that the first GPS receiver of the robotic garden tool is able to achieve a more accurate positioning determination than the second GPS receiver of the external device.
In addition to any combination of the above features, the second electronic processor may be configured to receive a plurality of locations of the robotic garden tool from the robotic garden tool. Each of the plurality of locations of the robotic garden tool may comprise a respective timestamp. The second electronic processor may be further configured to determine the plurality of locations of the external device based on: (i) the position of the fiducial marker in a respective image of the plurality of images; and (ii) the location of the robotic garden tool at the time at which the respective image was captured. The plurality of locations of the robotic garden tool received from the robotic garden tool may be used to determine the location of the robotic garden tool at the time at which the respective image was captured. The second electronic processor may be further configured to store each of the plurality of locations of the external device as a waypoint. The second electronic processor may be further configured to generate the virtual boundary using the waypoints and transmit information indicative of the virtual boundary to the robotic garden tool.
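Matching an image timestamp against the tool's transmitted, timestamped positions typically requires interpolation, since the two devices sample at different instants. A linear-interpolation sketch (an assumed approach; the patent only says the transmitted positions may be used for this determination):

```python
def position_at(timestamped_positions, t):
    """timestamped_positions: list of (timestamp, (x, y)) reports from
    the tool, in any order. Linearly interpolate the tool's position
    at time t; clamp to the first/last report outside the range."""
    pts = sorted(timestamped_positions)
    if t <= pts[0][0]:
        return pts[0][1]
    if t >= pts[-1][0]:
        return pts[-1][1]
    for (t0, p0), (t1, p1) in zip(pts, pts[1:]):
        if t0 <= t <= t1:
            a = (t - t0) / (t1 - t0)
            return (p0[0] + a * (p1[0] - p0[0]),
                    p0[1] + a * (p1[1] - p0[1]))
```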
In addition to any combination of the above features, the second electronic processor may be configured to time stamp each of the plurality of images with the time at which the respective image was captured, and transmit the plurality of images and the respective timestamps to at least one of the robotic garden tool and a remote device. In some examples, at least one of the first electronic processor of the robotic garden tool and another electronic processor of the remote device is configured to determine a plurality of locations of the external device, store each of the plurality of locations of the external device as a waypoint, and generate the virtual boundary using the waypoints.
Another embodiment includes a method of creating a virtual boundary. The method may include determining, with a first electronic processor of a robotic garden tool, a position of the robotic garden tool. The robotic garden tool may include a housing, a fiducial marker, and a set of wheels coupled to the housing and configured to rotate to propel the robotic garden tool over a work surface in a work area. The robotic garden tool may also include at least one wheel motor coupled to one or more wheels of the set of wheels. The at least one wheel motor may be configured to drive rotation of the one or more wheels. The method may further include controlling, with a second electronic processor of an external device, a data capture device (e.g., a camera, an object detection device, etc.) of the external device to capture a plurality of data samples (e.g., images) as the external device moves in the work area. Each of the plurality of data samples may include data indicative of a relative position of the robotic garden tool with respect to the external device (e.g., each image may include the fiducial marker of the robotic garden tool to allow the relative distance to be determined). The method may also include determining a plurality of locations of the external device based on: (i) the position of the fiducial marker in a respective image of the plurality of images; and (ii) the location of the robotic garden tool at the time at which the respective image was captured. The method may further include storing each of the plurality of locations of the external device as a waypoint. The method may also include generating the virtual boundary using the waypoints. The robotic garden tool may be configured to be constrained by the virtual boundary to remain within the work area during operation of the robotic garden tool.
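The final waypoint-to-boundary step above can be illustrated as thinning the recorded waypoints and closing them into a loop. The minimum-spacing parameter is an assumed tuning knob, not something the patent specifies:

```python
import math

def build_virtual_boundary(waypoints, min_spacing=0.5):
    """Keep only waypoints at least min_spacing metres from the last
    kept point (discarding jittery near-duplicates), then append the
    first point so the boundary forms a closed polygon."""
    boundary = [waypoints[0]]
    for point in waypoints[1:]:
        last = boundary[-1]
        if math.hypot(point[0] - last[0], point[1] - last[1]) >= min_spacing:
            boundary.append(point)
    if boundary[0] != boundary[-1]:
        boundary.append(boundary[0])
    return boundary
```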
In addition to any combination of the above features, the method may include receiving a position signal from a satellite with a base station device. The method may further comprise transmitting calibration information regarding the position signal to the robotic garden tool using the base station device. The method may also include receiving, with the first electronic processor of the robotic garden tool, the position signal from the satellite. The method may further include receiving, with the first electronic processor of the robotic garden tool, the calibration information from the base station device. The method may further include determining, with the first electronic processor of the robotic garden tool, a current position of the robotic garden tool based on: (i) the position signal; and (ii) the calibration information.
In addition to any combination of the above features, the method may further comprise controlling operation of the at least one wheel motor with the first electronic processor of the robotic garden tool to control movement of the robotic garden tool such that the robotic garden tool moves towards the external device when the external device moves in the work area.
In addition to any combination of the above features, in some examples, controlling operation of the at least one wheel motor to control movement of the robotic garden tool such that the robotic garden tool moves toward the external device when the external device moves in the work area comprises at least one of: (i) receiving a current location of the external device from the external device with the first electronic processor and controlling movement of the robotic garden tool towards the current location of the external device with the first electronic processor, wherein the external device comprises a first Global Positioning System (GPS) receiver configured to receive data to be used to determine the current location of the external device, (ii) determining a Received Signal Strength Indication (RSSI) of a signal output by the external device with the first electronic processor and controlling movement of the robotic garden tool with the first electronic processor such that the RSSI of the signal output by the external device is equal to or higher than a predetermined RSSI threshold, and (iii) receiving a command from the external device with the first electronic processor, wherein the command comprises instructions on how to control the robotic garden tool to move the robotic garden tool towards the external device.
In addition to any combination of the above features, the method may include receiving, with the second electronic processor, a plurality of locations of the robotic garden tool from the robotic garden tool. Each of the plurality of locations of the robotic garden tool may comprise a respective timestamp. The method may also include determining, with the second electronic processor, the plurality of locations of the external device based on: (i) the position of the fiducial marker in a respective image of the plurality of images; and (ii) the location of the robotic garden tool at the time at which the respective image was captured. The plurality of locations of the robotic garden tool received from the robotic garden tool may be used to determine the location of the robotic garden tool at the time at which the respective image was captured. The method may further include storing, with the second electronic processor, each of the plurality of locations of the external device as a waypoint. The method may also include generating, with the second electronic processor, the virtual boundary using the waypoints; and transmitting, with the second electronic processor, information indicative of the virtual boundary to the robotic garden tool.
In addition to any combination of the above features, the method may include time stamping, with the second electronic processor, each image of the plurality of images with the time at which the respective image was captured. The method may also include transmitting, with the second electronic processor, the plurality of images and the respective timestamps to at least one of the robotic garden tool and a remote device. The method may further include determining a plurality of locations of the external device with at least one of the first electronic processor of the robotic garden tool and another electronic processor of the remote device. The method may further include storing each of the plurality of locations of the external device as a waypoint with at least one of the first electronic processor of the robotic garden tool and another electronic processor of the remote device. The method may further include generating the virtual boundary using the waypoints with at least one of the first electronic processor of the robotic garden tool and another electronic processor of the remote device.
Another embodiment includes a robotic garden tool that may include a housing, a fiducial marker, and a set of wheels coupled to the housing and configured to rotate to propel the robotic garden tool over a work surface in a work area. The robotic garden tool may also include at least one wheel motor coupled to one or more wheels of the set of wheels. The at least one wheel motor may be configured to drive rotation of one or more wheels. The robotic garden tool may further comprise a first electronic processor, which may be configured to control the operation of the at least one wheel motor to control the movement of the robotic garden tool such that the robotic garden tool moves towards the external device when the external device moves in the work area during creation of a virtual boundary. The first electronic processor may be further configured to determine a plurality of positions of the robotic garden tool as the robotic garden tool moves. The first electronic processor may be further configured to time stamp each of a plurality of locations of the robotic garden tool. The plurality of locations of the robotic garden tool may be used in combination with information captured by the external device to generate the virtual boundary. The robotic garden tool may be configured to be constrained by the virtual boundary to remain within the work area during operation of the robotic garden tool.
In addition to any combination of the above features, a plurality of data samples (e.g., images) captured by a data capture device (e.g., a camera, an object detection device, etc.) of the external device may be used to determine a plurality of locations of the external device based on: (i) data indicative of a relative position of the robotic garden tool with respect to the external device (e.g., a position of the fiducial marker in a respective image of the plurality of images); and (ii) the location of the robotic garden tool at a time corresponding to the time at which the respective data sample was captured. Each of the plurality of locations of the external device may be stored as a waypoint. The virtual boundary may be generated using the waypoints.
In addition to any combination of the above features, the first electronic processor may be configured to transmit the locations of the robotic garden tool and the associated respective time stamps to the external device. The second electronic processor of the external device may be configured to determine the plurality of locations of the external device based on: (i) the position of the fiducial marker in a respective image of the plurality of images; and (ii) the location of the robotic garden tool at a time corresponding to the time at which the respective image was captured. The location of the robotic garden tool at the time corresponding to the time at which each respective image was captured may be determined using the plurality of locations of the robotic garden tool transmitted by the robotic garden tool. The second electronic processor may be further configured to generate the virtual boundary based on the plurality of locations of the external device. The first electronic processor of the robotic garden tool may be configured to receive information indicative of the virtual boundary from the external device, and to control operation of the at least one wheel motor to control movement of the robotic garden tool based at least in part on the virtual boundary.
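The timestamp-matching step described above (pairing each captured image with the robot location logged closest in time, then offsetting that location by the device's position relative to the fiducial marker) can be sketched as follows. This is a simplified 2-D illustration, not the patent's implementation; the function names and the flat `(dx, dy)` offset representation are assumptions.

```python
import bisect

def nearest_position(timestamps, positions, t):
    """Return the logged robot position whose timestamp is closest to t.
    timestamps must be sorted ascending, parallel to positions."""
    i = bisect.bisect_left(timestamps, t)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(timestamps)]
    best = min(candidates, key=lambda j: abs(timestamps[j] - t))
    return positions[best]

def device_waypoints(samples, timestamps, positions):
    """samples: list of (t, dx, dy), where (dx, dy) is the external
    device's offset from the robot, derived from the fiducial marker's
    pose in the image captured at time t. Returns one waypoint per
    sample: robot position at t plus the device offset."""
    waypoints = []
    for t, dx, dy in samples:
        rx, ry = nearest_position(timestamps, positions, t)
        waypoints.append((rx + dx, ry + dy))
    return waypoints
```

The waypoints returned by such a routine would then be joined (e.g., as polygon vertices) to form the virtual boundary.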
In addition to any combination of the above features, the first electronic processor may be configured to receive the position signal from the satellite, receive calibration information from the base station device, and determine the current position of the robotic garden tool based on: (i) the position signal; and (ii) the calibration information.
In addition to any combination of the above features, the first electronic processor may be configured to receive the position signal via a first real-time kinematic global navigation satellite system (RTK GNSS) receiver of the robotic garden tool. The first electronic processor may be configured to receive the calibration information via a first radio frequency transceiver of the robotic garden tool.
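The idea of combining a satellite position signal with base station calibration information can be illustrated with a position-domain differential correction. This is a deliberate simplification and an assumption on my part: real RTK GNSS operates on carrier-phase observables, not on finished position fixes, and the function name is hypothetical.

```python
def apply_rtk_correction(raw_fix, base_measured, base_known):
    """Illustrative differential-GNSS correction: the base station sits
    at a precisely surveyed location (base_known), so the difference
    between its measured fix and its known location estimates the
    shared atmospheric/clock error, which is subtracted from the
    rover's raw fix. Coordinates are (x, y) tuples in meters."""
    err = tuple(m - k for m, k in zip(base_measured, base_known))
    return tuple(r - e for r, e in zip(raw_fix, err))
```

Under this simplification, any bias common to both receivers cancels, which is why a stationary base station device near the work area can tighten the mower's position estimate.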
In addition to any combination of the above features, the first electronic processor may be configured to control operation of the at least one wheel motor to control movement of the robotic garden tool such that the robotic garden tool moves towards the external device when the external device moves in the work area, by at least one of: (i) receiving a current location of the external device from the external device and controlling the robotic garden tool to move toward the current location of the external device, wherein the external device comprises a first Global Positioning System (GPS) receiver configured to receive data to be used to determine the current location of the external device; (ii) determining a Received Signal Strength Indication (RSSI) of a signal output by the external device and controlling the movement of the robotic garden tool such that the RSSI of the signal output by the external device is equal to or higher than a predetermined RSSI threshold; and (iii) receiving a command from the external device, wherein the command comprises instructions on how to control the robotic garden tool to move toward the external device.
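Option (ii) above — advancing until the beacon's RSSI meets the threshold — can be sketched as a differential-drive control step. The cruise speed of 0.6, the 90° bearing normalization, and the wheel-mixing gains are illustrative assumptions, not values from the disclosure.

```python
def wheel_command(rssi_dbm, threshold_dbm, bearing_deg):
    """Return (left, right) wheel speed fractions in [-1, 1].
    RSSI is in dBm, so a higher (less negative) value means the
    external device is closer. Below the threshold, steer toward the
    device's bearing (degrees, 0 = straight ahead); at or above the
    threshold, the mower is close enough and holds position."""
    if rssi_dbm >= threshold_dbm:
        return (0.0, 0.0)                      # close enough; wait
    turn = max(-1.0, min(1.0, bearing_deg / 90.0))
    base = 0.6                                 # assumed cruise speed
    return (base + 0.4 * turn, base - 0.4 * turn)
```

Called each control cycle, this keeps the mower trailing the walking user while the user traces the desired boundary.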
Other aspects of the disclosure will become apparent by consideration of the detailed description and accompanying drawings.
Drawings
Fig. 1A illustrates a communication system including robotic garden tools, according to some example embodiments.
Fig. 1B illustrates an example implementation of the communication system of fig. 1A, according to some example embodiments.
Fig. 1C illustrates a bottom perspective view of the robotic garden tool of fig. 1A, according to some example embodiments.
Fig. 2 is a block diagram of the robotic garden tool of fig. 1A and 1B, according to some example embodiments.
Fig. 3 is a block diagram of the external device of fig. 1A according to some example embodiments.
Fig. 4 is a block diagram of the base station apparatus of fig. 1A according to some example embodiments.
Fig. 5 illustrates a flowchart of a method that may be performed by the robotic garden tool and base station device of fig. 1A to create a virtual boundary for the robotic garden tool, according to some example embodiments.
FIG. 6 illustrates an example use case for creating virtual boundaries according to some example embodiments.
Fig. 7 illustrates a flowchart of another method that may be performed by the robotic garden tool of fig. 1A to create a virtual boundary for the robotic garden tool, according to some example embodiments.
FIG. 8 illustrates another example use case for creating virtual boundaries according to some example embodiments.
Fig. 9 illustrates a perspective view of the robotic garden tool of fig. 1A, and an enlarged view of an interface removably attached to the robotic garden tool, according to some example embodiments.
Fig. 10 illustrates a perspective view of a compartment on a housing of the robotic garden tool of fig. 1A, according to some example embodiments.
Fig. 11A-11D illustrate perspective views of the removable attachment interface of fig. 9, according to some example embodiments.
Fig. 12 illustrates a perspective view of the robotic garden tool of fig. 1A, wherein the interface of fig. 9 comprises a fixture for holding another device, according to some example embodiments.
Detailed Description
Before any embodiments of the invention are explained in detail, it is to be understood that the invention is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the following drawings. The invention is capable of other embodiments and of being practiced or of being carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of "including," "comprising," or "having" and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. The terms "mounted," "connected," and "coupled" are used broadly and encompass both direct and indirect mounting, connecting, and coupling. Further, "connected" and "coupled" are not restricted to physical or mechanical connections or couplings, and may include electrical connections or couplings, whether direct or indirect.
It should be noted that the present application may be implemented using a plurality of hardware- and software-based devices as well as a plurality of different structural components. Furthermore, and as described in subsequent paragraphs, the specific configurations shown in the drawings are intended to exemplify embodiments of the application, and other alternative configurations are possible. Unless otherwise stated, the terms "processor," "central processing unit," and "CPU" are interchangeable. Where the term "processor" or "central processing unit" or "CPU" is used as an element to identify a particular function, it should be understood that those functions may be implemented by a single processor or by a plurality of processors arranged in any form, including parallel processors, serial processors, or cloud processing/cloud computing configurations, unless otherwise stated.
In the present application, the term "approximately" may be used to describe the dimensions of various components and/or the path of travel of the robotic garden tool. In some cases, the term "approximately" means that the recited dimension is within 1% of the stated value, within 5% of the stated value, within 10% of the stated value, etc. When the term "and/or" is used in the present application, it is intended to include any combination of the listed components. For example, if a component includes A and/or B, the component may include A alone, B alone, or both A and B.
Fig. 1A illustrates a communication system 100 according to some example embodiments, which may include a robotic garden tool 105 (e.g., a robotic lawnmower 105, which may also be referred to as a robotic mower 105), a docking station 110 for the robotic lawnmower 105, an external device 115, a base station device 145, a satellite 150, and a server 152. The robotic garden tool 105 is mainly described herein as a robotic lawnmower 105. However, in other embodiments, robotic garden tool 105 may include tools for sweeping debris, vacuuming debris, clearing debris, collecting debris, moving debris, and the like. The debris may include plants (e.g., grass, leaves, flowers, stems, weeds, twigs, branches, etc., and cuttings thereof), dust, dirt, jobsite debris, snow, etc. For example, other embodiments of robotic garden tool 105 may include a vacuum cleaner, a trimmer, a string trimmer, a hedge trimmer, a sweeper, a cutter, a plow, a blower, a snow blower, and the like.
In some embodiments, the lawn may include any type of property, including grass, crops, some other material to be trimmed, cleaned, collected, etc., and/or including some material to be treated by robotic garden tools (e.g., fertilizer for treating grass on the lawn). In some embodiments, for example, when robotic garden tools are used to shovel snow/remove, the lawn may include a paved portion of the property (e.g., a roadway).
In some embodiments, the docking station 110 may be installed in a yard/lawn using stakes 120. The robotic lawnmower 105 may be configured to mow in a yard and to dock at the docking station 110 to charge a battery 245 (see fig. 2) of the robotic lawnmower 105. In some embodiments, docking station 110 is configured to electrically connect with a power source (e.g., via wires and plugs connected to a wall outlet that is connected to a power grid) to provide a charging current to robotic lawnmower 105 when robotic lawnmower 105 is electrically coupled with docking station 110.
In some embodiments, the docking station 110 may also be electrically connected to a border cable (i.e., boundary wire). In some embodiments, the docking station 110 provides power to the border cable to control the border cable to provide/transmit electromagnetic signals that may be detected, for example, by the robotic lawnmower 105. In some embodiments, the border cable may be any cable, wire, etc. configured to transmit a signal and configured to be mounted on a work surface (e.g., a yard including grass) in a discreet and unobtrusive manner (e.g., secured to the bottom of grass blades, against the ground/soil where grass is growing, so that the border cable does not physically obstruct the robotic lawnmower 105 or other persons or objects). For example, a plurality of pegs/stakes may be used to secure the border cable to the ground/soil. As another example, the border cable may be buried in the ground/soil beneath the grass (e.g., if the border cable is installed while a piece of land is being developed). In some embodiments, in response to detecting the electromagnetic signal from the border cable, the robotic lawnmower 105 is configured to control its movement such that the robotic lawnmower 105 remains within the border defined by the border cable. For example, in response to detecting the border cable, the robotic lawnmower 105 may be configured to stop moving forward and turn in a random direction to begin traveling in an approximately straight line until the robotic lawnmower 105 again detects the border cable.
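The "stop and turn in a random direction" reaction described above can be sketched as a single heading update. The ±60° turn range is an assumption chosen so the new heading always points back into the work area; the disclosure does not specify the range.

```python
import random

def boundary_reaction(heading_deg, rng=random):
    """On detecting the boundary cable, pick a new heading roughly
    reversing course, plus a random offset so repeated bounces cover
    the work area rather than retracing the same line."""
    offset = rng.uniform(-60.0, 60.0)   # assumed turn range
    return (heading_deg + 180.0 + offset) % 360.0
```

After computing the new heading, the mower would resume driving in an approximately straight line until the cable is detected again.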
In some embodiments, robotic lawnmower 105 operates without a boundary cable. Rather, the robotic lawnmower 105 may include mapping capabilities, position tracking capabilities, and the like, that allow the robotic lawnmower 105 to remain within a predefined boundary (e.g., a virtual boundary) without using a boundary cable. In some embodiments, robotic lawnmower 105 may determine its location (and/or may help allow base station device 145 and/or external device 115 to determine their respective locations) by communicating with other devices, such as base station device 145 and/or satellite 150, as described in detail below. For example, the robotic lawnmower 105 and the base station device 145 may communicate with each other using a Radio Frequency (RF) communication protocol (e.g., WiFi™, Bluetooth™, Bluetooth™ Low Energy (BLE), etc.). The creation/generation of virtual boundaries according to some example embodiments is also described in detail below.
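Staying within a virtual boundary implies repeatedly testing whether the mower's current position lies inside the boundary. A minimal sketch, assuming the boundary is stored as a polygon of waypoint vertices (the disclosure does not specify a representation), is the classic ray-casting point-in-polygon test:

```python
def inside_boundary(point, polygon):
    """Ray-casting test: count how many polygon edges a horizontal ray
    from the point crosses; an odd count means the point is inside.
    polygon is a list of (x, y) vertices forming a closed loop."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):                      # edge spans the ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```

A navigation loop could call this each cycle and trigger a stop-and-turn maneuver whenever the projected next position falls outside.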
In some embodiments, docking station 110 includes a docking cable loop, a magnet configured to be sensed by a magnetic sensor of robotic lawnmower 105, and/or another transmitting device configured to transmit a docking signal that may be detected by robotic lawnmower 105. For example, the dock signal may indicate that the robotic lawnmower 105 is near the docking station 110 and may allow the robotic lawnmower 105 to take some action in response thereto, such as to dock the robotic lawnmower 105 at the docking station 110.
As indicated in fig. 1A, in some embodiments, robotic lawnmower 105 is configured to wirelessly communicate (e.g., via Bluetooth™, WiFi™, etc.) with external device 115 and/or base station device 145 when robotic lawnmower 105 is within communication range of external device 115 and/or base station device 145. The external device 115 may be, for example, a smart phone (as shown), a laptop computer, a tablet computer, a Personal Digital Assistant (PDA), a wireless communication router that allows another external device 115 remote from the robotic lawnmower 105 to communicate with the robotic lawnmower 105, or another electronic device capable of communicating with the robotic lawnmower 105. The external device 115 may generate a user interface and allow a user to access and interact with information of the robotic lawnmower 105. The external device 115 may receive user inputs to determine operating parameters/instructions of the robotic lawnmower 105, enable or disable features of the robotic lawnmower 105, and so forth. In some embodiments, the communication between the external device 115 and the robotic lawnmower 105 may be wired (e.g., via a Universal Serial Bus (USB) cable configured to connect to respective USB ports of the external device 115 and the robotic lawnmower 105).
In some embodiments, base station device 145 is considered an external device 115. The base station device 145 may be placed in a stationary manner at a base station location to assist the robotic lawnmower 105 in determining the current position of the robotic lawnmower 105 as the robotic lawnmower 105 moves within the work area, as described in more detail below. For example, the base station device 145 may be placed on the roof of a building adjacent to the work area 155 where the robotic lawnmower 105 performs tasks (see fig. 1B). As other examples, the base station device 145 may be located at a different location on a building or at a location within or near the work area 155 (e.g., at the same location as the docking station 110, on a pole/stake inserted into the ground within or near the work area 155, etc.). While base station device 145 may be configured to remain stationary during operation of robotic lawnmower 105 within work area 155, in some embodiments base station device 145 may be removed from the base station location to define or revise the virtual boundary, to change the base station location when robotic lawnmower 105 is not in operation, and the like.
As indicated in fig. 1A and 1B, in some embodiments, robotic lawnmower 105, external device 115, and/or base station device 145 are configured to communicate wirelessly and bi-directionally with each other and/or with one or more satellites 150 and/or one or more servers 152. For example, the robotic lawnmower 105, the external device 115, and/or the base station device 145 may include a Global Positioning System (GPS) receiver configured to communicate with one or more satellites 150 to determine the location of the respective robotic lawnmower 105, external device 115, and/or base station device 145. As another example, robotic lawnmower 105, external device 115, and/or base station device 145 may transmit information to and/or receive information from server 152, e.g., over a cellular network. Additional details of the communication between (i) the robotic lawnmower 105, the external device 115, and/or the base station device 145 and (ii) the one or more satellites 150 and/or the one or more servers 152 will be described below. Although fig. 1A illustrates one satellite 150 and one server 152, in some embodiments, the communication system 100 includes additional satellites 150 and/or servers 152. In some embodiments, communication system 100 may not include any server 152.
As shown in fig. 1A, in some embodiments, robotic lawnmower 105 includes a fiducial marker 160 (e.g., a binary square fiducial marker, such as a University of Córdoba Augmented Reality (ArUco) marker). Fiducial marker 160 may be a removable marker placed on a predetermined area of outer housing 125A to be visible from outside the robotic lawnmower 105. Fiducial marker 160 may be removed from robotic lawnmower 105 when boundary creation is not being performed and may be placed on robotic lawnmower 105 when boundary creation is being performed. For example, outer housing 125A may include a predetermined area as indicated by a notch or label indicating where fiducial marker 160 should be placed on outer housing 125A. In some embodiments, fiducial marker 160 may be configured to be included in a plurality of images captured by camera 330 of external device 115 to create a virtual boundary, as explained in detail below. For example, the position and orientation of fiducial marker 160 in each captured image may allow for camera pose estimation to determine the position of the external device 115 at the time the respective image was captured.
Fig. 1C illustrates a bottom perspective view of robotic lawnmower 105 according to some example embodiments. The robotic lawnmower 105 may include a housing 125, which may include an outer housing 125A (i.e., an outer shell) and an inner housing 125B. The outer housing 125A may be coupled to the inner housing 125B. The robotic lawnmower 105 may also include wheels 130 (i.e., a set of wheels 130) coupled to the inner housing 125B and configured to rotate relative to the housing 125 to propel the robotic lawnmower 105 over a work surface (e.g., a yard to be mowed). The wheels 130 may include motorized drive wheels 130A and non-motorized drive wheels 130B. In the embodiment shown in fig. 1C, the two rear wheels 130A are motorized drive wheels 130A and the two front wheels 130B are non-motorized drive wheels 130B. In other embodiments, robotic lawnmower 105 may include different wheel arrangements (e.g., a different total number of wheels, a different number of wheels of each type, different motorized or non-motorized drive wheels, etc.). In some embodiments, the housing 125 may not include an outer housing 125A and an inner housing 125B. Rather, the housing 125 may comprise a single integrated body/housing to which the wheels 130 are attached.
In some embodiments, robotic lawnmower 105 includes a wheel motor 235 (see fig. 2) coupled to one or more wheels 130 and configured to drive rotation of one or more wheels 130. In some embodiments, the robotic lawnmower 105 includes a plurality of wheel motors 235, wherein each wheel motor 235 is configured to drive rotation of a respective motor drive wheel 130A (see fig. 2).
In some embodiments, robotic lawnmower 105 includes a cutting blade assembly 135 coupled to inner housing 125B and configured to rotate relative to housing 125 to cut grass on a work surface. The cutting blade assembly 135 may include a rotating disk to which a plurality of cutting blades 140 configured to cut grass are attached. In some embodiments, robotic lawnmower 105 includes a cutting blade assembly motor 240 (see fig. 2) coupled to inner housing 125B and cutting blade assembly 135. The cutting blade assembly motor 240 may be configured to drive rotation of the cutting blade assembly 135 to cut grass on a work surface.
In some embodiments, robotic lawnmower 105 and/or docking station 110 include more components and functions than shown and described herein.
Fig. 2 is a block diagram of robotic lawnmower 105 according to some example embodiments. In the illustrated embodiment, the robotic lawnmower 105 includes a first electronic processor 205 (e.g., a microprocessor or other electronic device). The first electronic processor 205 includes an input interface and an output interface (not shown) and is electrically coupled to a first memory 210, a first network interface 215, an optional first input device 220, an optional first display 225, one or more sensors 230, a left rear wheel motor 235A, a right rear wheel motor 235B, a cutting blade assembly motor 240, and a battery 245. In some embodiments, robotic lawnmower 105 includes fewer or more components in a different configuration than that shown in fig. 2. For example, the robotic lawnmower 105 may not include the first input device 220 and/or the first display 225. As another example, the robotic lawnmower 105 may include a height adjustment motor configured to adjust the height of the cutting blade assembly 135. As yet another example, robotic lawnmower 105 may include more or fewer sensors than the sensors 230 described herein. As yet another example, the robotic lawnmower 105 may include a camera 250 (e.g., similar to the camera 330 described below with respect to fig. 3), and/or may include a removable attachment interface 905 for securing a camera, a sensor (e.g., a millimeter wave radar device), and/or a device with a camera (e.g., the external device 115) to the housing 125 of the robotic lawnmower 105 (see fig. 9-12). In some embodiments, robotic lawnmower 105 performs functions other than those described below.
The first memory 210 may include Read Only Memory (ROM), Random Access Memory (RAM), other non-transitory computer-readable media, or a combination thereof. The first electronic processor 205 is configured to receive instructions and data from the first memory 210 and to execute the instructions. Specifically, the first electronic processor 205 executes instructions stored in the first memory 210 to perform the methods described herein.
The first network interface 215 is configured to transmit data to and receive data from other devices in the communication system 100 (e.g., the external device 115, the base station device 145, the satellite 150, and/or the server 152). In some embodiments, the first network interface 215 includes one or more transceivers for wireless communication with the external device 115, the docking station 110, and/or the base station device 145 (e.g., a first RF transceiver configured to communicate via Bluetooth™, WiFi™, etc.). The first network interface 215 may include an additional transceiver for wireless communication with the server 152 via, for example, cellular communication. The first network interface 215 may also include a first GPS receiver (e.g., a first real-time kinematic global navigation satellite system (RTK GNSS) receiver) configured to receive position signals from one or more satellites 150. In some embodiments, at least some of the transceivers and/or receivers of robotic lawnmower 105 may combine or share some elements (e.g., antennas and/or other hardware). Alternatively or additionally, the first network interface 215 may include a connector or port for receiving a wired connection (such as a USB cable) with the external device 115. In some embodiments, one or more transceivers of the first network interface 215 may act as sensors and may be configured to receive wireless signals from the external device 115. The wireless signal from the external device 115 may be processed by the first electronic processor 205 to determine the strength of the wireless signal (e.g., a Received Signal Strength Indicator (RSSI)).
For example, a portion of the first network interface 215 that serves as a sensor may include a wireless communication receiver (e.g., a radio frequency transceiver), and the wireless signals received by the sensor may include beacon signals of a common data communication protocol, including Wi-Fi™ signals and Bluetooth™ signals (e.g., Bluetooth™ Low Energy (BLE) signals).
The first user input device 220 is configured to allow the first electronic processor 205 to receive user input from a user, for example, to set/adjust operating parameters of the robotic lawnmower 105. The first display 225 is configured to display a user interface to a user. Similar to the user interface of the external device 115 previously described herein, the user interface displayed on the first display 225 may allow a user to access and interact with robotic lawnmower information. In some embodiments, the first display 225 may also serve as the first input device 220. For example, a touch-sensitive input interface may be incorporated into the first display 225 to allow a user to interact with content provided on the first display 225. The first display 225 may be a Liquid Crystal Display (LCD) screen, an Organic Light-Emitting Diode (OLED) display screen, or an electronic ink display. In some embodiments, the first display 225 includes future-developed display technologies.
In some embodiments, the first electronic processor 205 communicates with a plurality of sensors 230, which may include electromagnetic field sensors, radio frequency sensors (e.g., radio Frequency Identification (RFID) interrogator/sensors), hall sensors, other magnetic sensors, transceivers/receivers of the first network interface 215, and so forth.
In some embodiments, the sensors 230 include one or more object detection devices 255 (i.e., object detection sensors 255). The one or more object detection devices 255 may use a first positioning technique to determine the relative positioning of one or more detected targets with respect to the position/location of the robotic lawnmower 105 at one or more different times (e.g., as explained below in the examples regarding millimeter wave radar devices/sensors). The object detection sensor 255 may include a millimeter wave radar device/sensor. A millimeter wave radar device may transmit millimeter waves and receive echoes of the millimeter waves from targets (e.g., a user walking along the boundary during virtual boundary creation, obstacles in a yard, etc.). The use of a millimeter wave radar device on robotic lawnmower 105 may be particularly advantageous because millimeter waves may be able to penetrate most objects that robotic lawnmower 105 may encounter, such as grass, rain, plastic, and the like. Thus, the millimeter wave radar device may detect targets located behind other objects to determine a more complete picture of the targets within the detection angle range/detection area of the millimeter wave radar device.
In some examples, the millimeter wave radar device determines data regarding each point (e.g., a three-dimensional point in an x-y-z coordinate system) within a detection area/space of the millimeter wave radar device. For example, for each point, the millimeter wave radar device may determine a point identification, its respective x, y, and z coordinates, a speed of a target located at the point in each of the x, y, and z directions relative to robotic lawnmower 105, and a signal strength of the echo signal reflected from the target located at the point. In some examples, the millimeter wave radar device and/or the first electronic processor 205 includes a built-in algorithm configured to group a plurality of adjacent points (i.e., a cluster of data points) into a single target/obstacle.
In some examples, the millimeter wave radar device provides target detection data (e.g., processed data samples) to first electronic processor 205, where the target detection data indicates information about each target within a detection region of the millimeter wave radar device (e.g., a size and/or shape of each target, which is based on a number of clustered/adjacent data points corresponding to the target; a location of each target, including x-y or x-y-z coordinates of each target; and so forth). In other words, the target detection data may indicate a respective positioning of each of the one or more targets relative to the robotic garden tool 105 (i.e., a relative distance between the target and the robotic garden tool 105). As indicated by the above explanation, the first electronic processor 205 may determine a respective distance between the robotic lawnmower 105 and each target within the detection area of the millimeter wave radar device based on target detection data received from the millimeter wave radar device. The target detection data may also indicate a respective size and/or shape of each of the one or more targets based on the number of data points in the cluster that make up each target. For example, the more data points detected within a cluster of data points representing a target, the larger the target is determined to be by the first electronic processor 205. In some examples, the first electronic processor 205, the millimeter wave radar device, or both are configured to identify an object (e.g., a human user 605 moving along the boundary during the virtual boundary creation process) within each of a plurality of data samples captured by the millimeter wave radar device based on an expected shape of the object (e.g., an expected shape of human user 605).
While some of the explanations above for millimeter wave radar devices indicate that millimeter wave radar devices provide processed target detection data (i.e., processed data samples) to first electronic processor 205, in some examples, millimeter wave radar devices provide raw data samples to first electronic processor 205 and first electronic processor 205 processes the raw data to identify targets, for example, by grouping a plurality of adjacent points (i.e., clusters of data points) into a single target/obstacle. In some examples, the object detection device 255 additionally or alternatively includes other types of object detection devices 255. For example, the object detection device 255 may be a laser rangefinder or other range finding device, the data of which may be used to determine the relative distance between the robotic lawnmower 105 and the object.
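The grouping of adjacent radar points into a single target described above can be illustrated with a simple single-linkage clustering sketch. This is illustrative only: the actual built-in algorithm, the distance threshold `eps`, and the function name are assumptions.

```python
def cluster_points(points, eps):
    """Group 3-D radar returns into targets: any point within eps of a
    cluster member joins that cluster (single-linkage flood fill).
    Returns a list of clusters, each a list of point indices."""
    def close(a, b):
        return sum((p - q) ** 2 for p, q in zip(points[a], points[b])) <= eps ** 2

    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        cluster, frontier = [seed], [seed]
        while frontier:
            cur = frontier.pop()
            neighbors = [j for j in unvisited if close(cur, j)]
            for j in neighbors:
                unvisited.remove(j)
            cluster.extend(neighbors)
            frontier.extend(neighbors)
        clusters.append(cluster)
    return clusters
```

Consistent with the text above, the number of points in each returned cluster would then serve as a proxy for the target's size, and the cluster centroid as its location.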
In examples where robotic lawnmower 105 includes camera 250, the camera may be built into housing 125. For example, the camera 250 may be located on top of the housing 125A and/or in front of the housing 125A. The cameras 250 may include multiple cameras 250 to provide 360 degree coverage around the robotic lawnmower 105. Camera 250 may be a depth of field (DoF) camera. In some examples, camera 250 is a 360 degree camera. In some examples, camera 250 may be considered as sensor 230.
In some examples (e.g., examples where the robotic lawnmower 105 does not include the integrated camera 250), the robotic lawnmower 105 may include a removable attachment interface 905 for securing the camera, the sensor (e.g., millimeter wave radar device), and/or the camera-bearing device (e.g., the external device 115) to the housing 125 of the robotic lawnmower 105 (see fig. 9-12). For example, the interface 905 includes an interface housing 1105 removably attached (e.g., using screws) to a top surface of the housing 125 of the robotic lawnmower 105.
As indicated in the example shown in fig. 10, the top surface of the housing 125 may include a compartment 1005 configured to receive a bottom portion of the interface housing 1105, while a top portion of the interface housing 1105 protrudes upward from the compartment 1005 (see fig. 9). The compartment 1005 may include a stud 1010 configured to receive a screw to secure the interface 905 in the compartment 1005. The compartment 1005 may also include a first projection/recess 1015 configured to engage with a second projection/recess 1110 of the bottom surface of the interface 905 to ensure that the interface 905 is properly installed in the compartment 1005 (e.g., facing in a desired direction). The compartment 1005 may also include through holes (not shown) (e.g., on a bottom surface thereof) to allow wires and/or connectors from the robotic lawnmower 105 to connect to one or more components within the interface 905 when the interface 905 is installed in the compartment 1005.
Fig. 11A-11D illustrate examples of removable attachment interfaces 905 according to some example embodiments. The interface 905 may include an interface housing 1105 to house, for example, a camera, a sensor (e.g., a millimeter wave radar device), and the like. In the illustrated embodiment, the interface 905 includes a millimeter wave radar device 1115 (see fig. 11C). The bottom surface of interface housing 1105 may include a second protrusion/recess 1110 to assist in properly mounting and securing interface 905 to housing 125 of robotic lawnmower 105. The interface housing 1105 may also include screw holes 1120 configured to receive screws to secure the interface 905 to the housing 125 of the robotic lawnmower 105. In some examples, the interface 905 may be otherwise secured to the robotic lawnmower 105 in addition to or instead of using screws. The bottom surface of the interface housing 1105 may also include a through hole 1125 to allow wires and/or connectors from the robotic lawnmower 105 to connect to one or more components within the interface 905 when the interface 905 is installed in the compartment 1005.
Fig. 11C and 11D illustrate the interface 905 with the interface housing 1105 shown transparently to allow the internal components of the interface 905 to be seen. As shown in the example of figs. 11C and 11D, the interface 905 may include a Printed Circuit Board (PCB) 1130 that is mounted in an upright orientation and held by a bracket 1135 on a base 1140 of the interface 905. The millimeter wave radar device 1115 may be mounted on a forward surface of the PCB 1130. An interface connector 1145 may be mounted on a rearward surface of the PCB 1130. In some examples, the interface connector 1145 may instead be mounted on the other side of the PCB 1130 (i.e., the forward surface of the PCB 1130). Additional components may be mounted on both sides of the PCB 1130 as shown.
Although the examples of the interface 905 shown in figs. 11A-11D include the millimeter wave radar device 1115, in other examples the interface 905 may additionally or alternatively include a camera and/or another type of sensor. In some examples, any one of a plurality of removable attachment interfaces 905 with different sensing devices may be electrically coupled (e.g., electrically connected to the first electronic processor 205 of the robotic lawnmower 105) and mechanically coupled to the robotic lawnmower 105.
In some examples, the interface 905 may include an attachment point for a fixation device 1205 configured to hold/secure a device having a camera or other sensor (e.g., the external device 115) to the robotic lawnmower 105. For example, the external device 115 may be secured to the robotic lawnmower 105 during a virtual boundary creation procedure as explained in more detail below. The fixation device 1205 may include a rod 1210 that is inserted into the attachment point (e.g., a threaded compartment, a snap-fit compartment, etc.) on the interface 905. The rod 1210 may include an adjustable retention bracket having a clamping device 1215 configured to adjustably clamp/retain the external device 115. In some embodiments, the rod 1210 and/or the retention bracket may fold/telescope into an area of the housing 125 such that the rod 1210 does not protrude (or protrudes less) from the housing 125 when the fixation device 1205 is not in use. In some embodiments, the rod 1210 and/or the retention bracket may be removably attached to the interface 905, which is itself attached to the housing 125. Other structures and ways of securing a device with a camera (e.g., the external device 115) to the robotic lawnmower 105 may be used. For example, the housing 125 (e.g., the compartment 1005) of the robotic lawnmower 105 may include an attachment structure other than the attachment point located on the interface 905. As another example, the fixation device 1205 on the interface 905 may include a cavity configured to receive and hold/secure the external device 115 in an upright position without the use of the rod 1210.
The interface 905 secured to the robotic lawnmower 105 and/or the device (e.g., the external device 115) secured to the robotic lawnmower 105 directly or via the interface 905 may be in two-way communication with the robotic lawnmower 105 (e.g., with the first electronic processor 205) via a wired or wireless connection as previously explained herein. For example, the external device 115 may provide image data of captured images to the robotic lawnmower 105 for use by the robotic lawnmower 105 in generating the virtual boundary. As another example, the external device 115 may receive location information from the robotic lawnmower 105 for use by the external device 115 in generating the virtual boundary. In some examples, any device 105, 115, 145, 152, or a combination thereof, may generate the virtual boundary based on information captured by the device 105, 115, 145, 152 itself and/or based on information received from other devices 105, 115, 145, 152. In some instances, information may be shared between the devices 105, 115, 145, 152 of the communication system 100 to allow any one or a combination of the devices 105, 115, 145, 152 to generate the virtual boundary. In some examples, when the interface 905 is mounted on the robotic lawnmower 105, the sensing device(s) of the interface 905 may be considered part of the robotic lawnmower 105.
In some embodiments, the inner housing 125B includes at least two boundary cable sensors in the form of electromagnetic field sensors configured to detect electromagnetic signals emitted by the boundary cable. For example, an electromagnetic field sensor may be capable of detecting the strength and/or polarity of an electromagnetic signal from a border cable.
In some embodiments, the inner housing 125B includes an odometry sensor (e.g., one or more Hall sensors or other types of sensors) for each motor-driven wheel 130A. The first electronic processor 205 may use data from the odometry sensors to determine how far each wheel 130A has rotated and/or the rotational speed of each wheel 130A in order to accurately control the movement (e.g., turning ability) of the robotic lawnmower 105. For example, the first electronic processor 205 may control the robotic lawnmower 105 to move in an approximately straight line by controlling the two wheel motors 235A and 235B to rotate at approximately the same speed. As another example, the first electronic processor 205 may control the robotic lawnmower 105 to turn and/or pivot in a particular direction by controlling one of the wheel motors 235A or 235B to rotate faster than, or in an opposite direction to, the other of the wheel motors 235A or 235B. Similarly, rotating only one of the wheel motors 235A or 235B while the other remains stationary causes the robotic lawnmower 105 to turn/pivot.
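The differential-drive rules above (equal wheel speeds for straight travel, unequal or opposite speeds for turning/pivoting) can be expressed as a simple mapping. The command names and the base speed below are hypothetical, chosen only to illustrate the relationships the paragraph describes.

```python
def wheel_speeds(command, base_speed=1.0):
    """Map a high-level motion command to (left, right) wheel motor
    speeds. Positive = forward, negative = reverse; values are
    illustrative, not from the patent."""
    if command == "straight":
        return (base_speed, base_speed)        # equal speeds -> straight line
    if command == "turn_right":
        return (base_speed, 0.5 * base_speed)  # left faster -> veer right
    if command == "pivot_right":
        return (base_speed, -base_speed)       # opposite directions -> pivot in place
    if command == "swing_right":
        return (base_speed, 0.0)               # one wheel stopped -> turn about it
    raise ValueError(f"unknown command: {command}")
```

Closed-loop control would compare the commanded speeds against the odometry-sensor readings and adjust the motor drive accordingly; that feedback loop is omitted here.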
In some embodiments, the inner housing 125B includes a cutting blade assembly motor sensor (e.g., one or more hall sensors or other types of sensors). The first electronic processor 205 may use data from the cutting blade assembly motor sensor to determine the rotational speed of the cutting blade assembly 135.
In some embodiments, the battery 245 provides power to the first electronic processor 205 and to other components of the robotic lawnmower 105 (such as the motors 235A, 235B, 240 and the first display 225). In some embodiments, components other than the first electronic processor 205 may receive power through the first electronic processor 205 or directly from the battery 245. In some embodiments, when power is provided directly from the battery 245 to other components, the first electronic processor 205 may control whether power is provided to one or more of the other components using, for example, a respective switch (e.g., a field effect transistor) or a respective switching network including a plurality of switches. In some embodiments, the robotic lawnmower 105 includes active and/or passive conditioning circuitry (e.g., a buck controller, a voltage converter, a rectifier, a filter, etc.) to condition or control power received by components of the robotic lawnmower 105 (e.g., the first electronic processor 205, the motors 235A, 235B, 240, etc.) from the battery 245. In some embodiments, the battery 245 is a removable battery pack. In some embodiments, the battery 245 is configured to receive a charging current from the docking station 110 when the robotic lawnmower 105 is docked at and electrically connected to the docking station 110.
Fig. 3 is a block diagram of an external device 115 according to some example embodiments. In the illustrated example, the external device 115 includes a second electronic processor 305 electrically connected to a second memory 310, a second network interface 315, a second user input device 320, a second display 325, and a camera 330. These components are similar to the similarly named components of the robotic lawnmower 105 explained above with respect to fig. 2 and function in a similar manner as described above. For example, the second display 325 may also be used as an input device (e.g., when the second display 325 is a touch screen). In some embodiments, the second network interface 315 includes one or more transceivers (e.g., configured to communicate via Bluetooth™, WiFi™, and the like) for wirelessly communicating with the robotic lawnmower 105. The second network interface 315 may include an additional transceiver for wireless communication with the server 152 via, for example, cellular communication. The second network interface 315 may also include a second GPS receiver configured to receive position signals from one or more satellites 150. In some embodiments, at least some of the transceivers and/or receivers of the external device 115 may combine or share some elements (e.g., antennas and/or other hardware). In some embodiments, the second electronic processor 305 sends and receives data to and from the robotic lawnmower 105 and/or other devices of the communication system 100 via the second network interface 315.
In some embodiments, the second GPS receiver of the external device 115 may be a different type of GPS receiver than the first type of GPS receiver of the robotic lawnmower 105. For example, the second GPS receiver may not be an RTK GNSS receiver, while the first type of GPS receiver of the robotic lawnmower 105 is an RTK GNSS receiver. In such an embodiment, the first type of GPS receiver (i.e., the RTK GNSS receiver) of the robotic lawnmower 105 may enable a more accurate position determination than the second GPS receiver of the external device 115. Thus, incorporating the position of the robotic lawnmower 105, as determined by the robotic lawnmower 105 itself, into the method of creating a virtual boundary results in a more accurate virtual boundary than a method in which the external device 115 merely tracks its own position. The methods described below (e.g., the method 500 of fig. 5) may utilize the more accurate position determinations made by the robotic lawnmower 105 without requiring the user 605 to manually move the robotic lawnmower 105 around the boundary of the work area 155 during creation of the virtual boundary. Instead, the handheld external device 115 may be moved around the boundary of the work area 155, or the user may walk around the boundary of the work area 155 with or without fiducial markers or other devices, to create the virtual boundary, as explained in more detail below. Thus, both the accuracy and the user-friendliness of virtual boundary creation are improved.
In some embodiments, the camera 330 is an optical camera (e.g., a depth of field (DoF) camera) configured to capture multiple images (e.g., multiple single images, each captured in response to user input received via the second input device 320) or a series of consecutive images/frames in the form of video. In some embodiments, each image of the plurality of images may include the fiducial marker 160 of the robotic lawnmower 105 and may be captured to create a virtual boundary for the robotic lawnmower 105 as described in detail below. In some embodiments, the camera 330 may capture an image of the user and/or a fiducial marker carried by the user while the external device 115 is mounted/secured to the robotic lawnmower 105, as described in more detail herein. In some examples, the camera 330 may be a 360 degree camera that can capture 360 degree images around the robotic lawnmower 105 without the robotic lawnmower 105 moving. In some examples, the camera 330 may not be a 360 degree camera, and the robotic lawnmower 105 may control itself to move such that the camera 330 is able to capture images of a moving target 360 degrees around the robotic lawnmower 105 during the virtual boundary creation process (e.g., by performing a "follow" action described in more detail herein).
In some embodiments, the external device 115 includes fewer or more components in a different configuration than that shown in fig. 3. For example, the external device 115 may include a battery, another GPS receiver, and the like. In some embodiments, the external device 115 performs functions other than those described below.
Fig. 4 is a block diagram of a base station device 145 according to some example embodiments. In the illustrated example, the base station device 145 includes a third electronic processor 405 electrically connected to a third memory 410, a third network interface 415, and a third user input device 420. These components are similar to the similarly named components of the robotic lawnmower 105 explained above with respect to fig. 2 and function in a similar manner as described above. In some embodiments, the third network interface 415 includes one or more transceivers (e.g., a third RF transceiver configured to communicate via Bluetooth™, WiFi™, and the like) for wirelessly transmitting information (e.g., calibration information) to the robotic lawnmower 105 to assist the robotic lawnmower 105 in determining its current position during a mowing operation, as explained in more detail below. The third network interface 415 may include an additional transceiver for wireless communication with the server 152 via, for example, cellular communication. The third network interface 415 may also include a third GPS receiver (e.g., a second RTK GNSS receiver) configured to receive position signals from one or more satellites 150. In some embodiments, at least some of the transceivers and/or receivers of the base station device 145 may combine or share some elements (e.g., antennas and/or other hardware). In some embodiments, the third electronic processor 405 sends and receives data to and from the robotic lawnmower 105 and/or other devices of the communication system 100 via the third network interface 415. In some embodiments, the third input device 420 is a button or switch configured to be actuated by a user.
In some embodiments, base station device 145 includes fewer or more components in a different configuration than that shown in fig. 4. For example, the base station device 145 may include a battery, a display or indicator (e.g., a light emitting diode) for providing information to the user, and the like. As another example, in some embodiments, base station device 145 may not include input device 420. As yet another example, the base station device 145 may include a camera (e.g., similar to the camera 330 described above with respect to fig. 3). For example, the camera of the base station device 145 may include a 360 degree camera. In some embodiments, base station device 145 performs functions other than those described below.
In some embodiments, the satellite 150 and the server 152 include elements similar to those described above with respect to the devices 105, 115, and 145, and these elements function in a similar manner. For example, the satellite 150 and the server 152 may each include an electronic processor, a memory, a network interface, and other elements.
In some embodiments, robotic lawnmower 105 travels within the virtual boundary of work area 155 to perform tasks (e.g., cut lawns). The robotic lawnmower 105 can travel randomly within the work area 155 defined by the virtual boundary. For example, the robotic lawnmower 105 may be configured to travel along an approximately straight line until the robotic lawnmower 105 determines that it has reached a virtual boundary. In response to detecting the virtual boundary, the robotic lawnmower 105 may be configured to turn in a random direction and continue traveling in an approximately straight line along the new path until the robotic lawnmower 105 again determines that it has reached the virtual boundary, at which point the process is repeated. In some embodiments, robotic lawnmower 105 may travel in a predetermined pattern within work area 155 defined by the virtual boundary (e.g., along adjacent rows or columns between two sides of the virtual boundary) to more efficiently and uniformly cut lawns within work area 155. In such embodiments, robotic lawnmower 105 may determine and track its current location within work area 155.
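The random-coverage behavior described above (travel straight until the virtual boundary is reached, then turn in a random direction and repeat) can be simulated in a few lines. The `inside` predicate below is a stand-in for the mower's virtual-boundary test; the step size and step count are arbitrary illustration values.

```python
import math
import random

def random_bounce(inside, start, heading, step=0.1, n_steps=1000):
    """Simulate the random mowing pattern: advance along the current
    heading while inside the virtual boundary; on reaching the
    boundary, pick a new random heading and continue."""
    x, y = start
    path = [(x, y)]
    for _ in range(n_steps):
        nx = x + step * math.cos(heading)
        ny = y + step * math.sin(heading)
        if inside(nx, ny):
            x, y = nx, ny                          # keep going straight
        else:
            heading = random.uniform(0.0, 2.0 * math.pi)  # bounce: random new direction
        path.append((x, y))
    return path
```

A predetermined row-by-row pattern, by contrast, would require the mower to track its current location within the work area 155, as the paragraph notes.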
For example, as indicated in figs. 1A and 1B, both the robotic lawnmower 105 and the stationary base station device 145 may be configured to communicate with each other and with one or more satellites 150. In some embodiments, both the robotic lawnmower 105 and the base station device 145 may include an RTK GNSS receiver. During a mowing operation, as the robotic lawnmower 105 moves within the work area 155, the robotic lawnmower 105 may determine its current position based on position signals received from one or more satellites 150 via its RTK GNSS receiver and based on calibration information received from the base station device 145 regarding the same position signals received by the RTK GNSS receiver of the stationary base station device 145.
For example, during a mowing operation, the base station device 145 can be stationary (i.e., act as a stationary base station) while the robotic lawnmower 105 moves within the work area 155. Both the robotic lawnmower 105 and the base station device 145 may receive one or more position signals from one or more satellites 150 (e.g., from at least four common satellites 150). The base station device 145 may determine calibration information about the received position signal, such as phase information of the position signal received by the base station device 145. The base station device 145 may transmit the calibration information to the robotic lawnmower 105, which receives the same one or more position signals from the one or more satellites 150. The robotic lawnmower 105 can then compare the phase information of the position signal received by the base station device 145 with the phase information of the position signal received by the robotic lawnmower 105 to help determine the current position of the robotic lawnmower 105 (e.g., using RTK GNSS principles). Thus, the stationary base station device 145 provides a reference that allows the robotic lawnmower 105 to determine its position more accurately than if it determined its position based solely on the position signals received from the one or more satellites 150. More accurately determining the position of the robotic lawnmower 105 allows the robotic lawnmower 105 to better navigate itself within the work area 155 (e.g., within or along a virtual boundary).
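A full RTK carrier-phase solution is beyond a short sketch, but the underlying idea — the stationary base station at a known position measures the shared error in the satellite signals and the mower subtracts that error from its own measurements — can be illustrated with a simplified, DGPS-style range correction in 2-D. This is an assumed simplification for illustration, not the patent's RTK math.

```python
import math

def base_corrections(true_pos, measured_pseudoranges, sat_positions):
    """Base station at a known, fixed 2-D position computes a
    per-satellite range correction: measured range minus the
    geometrically true range (the shared clock/atmospheric error)."""
    corr = {}
    for sat, measured in measured_pseudoranges.items():
        sx, sy = sat_positions[sat]
        true_range = math.hypot(sx - true_pos[0], sy - true_pos[1])
        corr[sat] = measured - true_range
    return corr

def apply_corrections(rover_pseudoranges, corrections):
    """Rover (the mower) subtracts the base's per-satellite
    corrections from its own measurements, removing the shared error."""
    return {sat: r - corrections[sat] for sat, r in rover_pseudoranges.items()}
```

Because both receivers see nearly the same satellites through nearly the same atmosphere, the error measured at the base cancels the error at the rover, which is why the stationary reference yields the accuracy improvement described above.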
In some examples, robotic lawnmower 105 and base station device 145 may operate in opposite roles to those described immediately above. For example, in the event that the base station device 145 moves from its resting position (e.g., during some type of virtual boundary creation), the base station device 145 may receive calibration information (i.e., position calibration information) from the robotic lawnmower 105 placed in the resting position. Based on its own received position signals from the satellites 150 and calibration information from the robotic lawnmower 105 (i.e., based on the position signals received by the stationary robotic lawnmower 105 from the satellites 150), the base station device 145 may determine its position (e.g., using RTK GNSS principles).
Although the robotic lawnmower 105 may determine its current location based at least in part on the calibration information received from the base station device 145 as described above, in some embodiments the robotic lawnmower 105 may determine its current location without using the information received from the base station device 145. In some embodiments, base station device 145 may not be included in system 100. In some embodiments, robotic lawnmower 105 may determine its current location based solely on the position signals received from one or more satellites 150.
There are a number of ways to create/generate virtual boundaries for robotic tools. For example, a virtual boundary may be established by manually moving the robotic tool over a desired path (i.e., "walking the dog") while the robotic tool stores the desired path. However, this approach is not very efficient because the user has to manually move the robotic tool around the work area. As another example, the virtual boundary may be created automatically by letting the robotic tool move randomly across the work surface and collecting multiple trajectories as it moves. However, this approach requires complex calculations and, in many cases (such as lawns with bodies of water (e.g., lakes or ponds) or other segmented/separated areas), may not generate virtual boundaries accurately. Accordingly, there is a technical problem of creating accurate virtual boundaries for robotic garden tools in an efficient manner that does not burden the user.
The systems, methods, and devices described herein address the above-stated technical problem by using multiple devices to determine the exact location of the device used to create the virtual boundary. In addition, some of the systems, methods, and devices described herein use fiducial markers 160 on the robotic garden tool 105 configured to be captured in a plurality of images taken by a user using a camera of the external device 115 to create a virtual boundary as the user moves along the boundary of the work area 155. Some systems, methods, and devices described herein involve the robotic garden tool 105 determining relative distances (i.e., vectors) to a plurality of waypoints along a virtual boundary by detecting a user, a fiducial marker carried by the user, or a signal emitted by a device carried by the user. The embodiments described herein are able to create virtual boundaries more efficiently because, for example, the robotic garden tool 105 does not need to be manually moved by a user during creation of the virtual boundaries. Instead, a more user-friendly device that is easier for the user to carry and move (e.g., an external device 115 such as the smartphone 115), or the user himself or herself, moves around the work area 155 to create the virtual boundary while still taking advantage of the increased positional accuracy that the robotic lawnmower 105 can provide.
Fig. 5 illustrates a flow chart of a method 500 that may be performed by the first electronic processor 205 of the robotic lawnmower 105 and another electronic processor (e.g., the second electronic processor 305 of the external device 115) to create a virtual boundary that confines the robotic lawnmower 105 during operation of the robotic lawnmower 105. Although a particular order of processing steps, signal receptions, and/or signal transmissions is indicated in fig. 5 as an example, the timing and order of such steps, receptions, and transmissions may be varied where appropriate without negating the objects and advantages of the examples set forth in detail throughout the remainder of this disclosure. The following explanation is primarily directed to the robotic lawnmower 105 and an external device 115, such as the smartphone 115, performing the steps of the method 500 to create a virtual boundary. In addition, the base station device 145 may facilitate performance of the method 500 by providing a reference location to the robotic lawnmower 105 to allow the robotic lawnmower 105 to more accurately determine its location. As described above, the base station device 145 can be regarded as one type of external device 115. However, in some embodiments, the base station device 145 is configured to remain stationary at the base station location during creation of the virtual boundary. Accordingly, an external device 115 (e.g., the smartphone 115) other than the base station device 145 may be configured to perform the steps of the method 500, explained below, that are performed by the external device 115.
At block 505, the first electronic processor 205 of the robotic lawnmower 105 determines a position of the robotic lawnmower 105. For example, as previously explained herein with respect to movement of the robotic lawnmower 105 during operation (e.g., during a mowing operation), the robotic lawnmower 105 uses the position signals received by its RTK GNSS receiver from the one or more satellites 150 and calibration information received from the stationary base station device 145 regarding the position signals received by the base station device's RTK GNSS receiver to determine the current position of the robotic lawnmower 105. As another example, the robotic lawnmower 105 may determine its location based solely on position signals received from one or more satellites 150.
In embodiments including the base station device 145, the base station device 145 may be configured to receive the position signal from the satellite 150 (e.g., via the second RTK GNSS receiver of the third network interface 415) and transmit calibration information regarding the position signal to the robotic lawnmower 105 (e.g., via the third RF transceiver of the third network interface 415). In some embodiments, the third electronic processor 405 of the base station device 145 determines calibration information about the first position signal (or about a plurality of first position signals). The calibration information may include phase information of the first position signal (e.g., a phase of a carrier of the first position signal) and a clock signal of the first position signal. In some embodiments, the first position signal comprises a continuous signal transmitted by the satellite 150 for receipt by one or more devices including an RTK GNSS receiver. In some embodiments, the first position signal comprises periodic transmissions of a plurality of individual signals. In some embodiments, the plurality of satellites 150 each transmit position signals that are received by the base station device 145 and the robotic lawnmower 105. In such an embodiment, the third electronic processor 405 may determine the location of the base station device 145 by averaging the results of the plurality of position signals from the plurality of satellites 150. Such averaging may increase the reliability of the location determination of the stationary base station device 145 by the third electronic processor 405. In some embodiments, the third electronic processor 405 may additionally or alternatively determine the location of the base station device 145 by averaging a plurality of position signals received from the respective satellites 150 over a period of time.
Such additional or alternative time averaging may increase the reliability of the location determination of the stationary base station apparatus 145 by the third electronic processor 405.
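The averaging described above — across several satellites and/or across time — amounts to taking the mean of repeated position fixes; a minimal sketch, where each fix is an assumed (x, y) result rather than a raw satellite signal:

```python
def average_fix(position_fixes):
    """Average repeated (x, y) position fixes (e.g., one per satellite
    solution and/or one per sampling instant) to steady the stationary
    base station's location estimate."""
    n = len(position_fixes)
    return (sum(p[0] for p in position_fixes) / n,
            sum(p[1] for p in position_fixes) / n)
```

Because the base station is stationary, every fix estimates the same point, so averaging suppresses the zero-mean measurement noise in the individual fixes.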
The first electronic processor 205 of the robotic lawnmower 105 may be configured to receive the position signal from the satellite 150 (e.g., via the first RTK GNSS receiver of the first network interface 215). The first position signal received by the first electronic processor 205 may be the same signal received by the third electronic processor 405 of the base station device 145, but there may be a phase difference between the two received position signals based on the difference in position between the base station device 145 and the robotic lawnmower 105. The first electronic processor 205 may also be configured to receive the calibration information from the base station device 145 (e.g., via the first RF transceiver of the first network interface 215). The first electronic processor 205 may be further configured to determine the current position of the robotic lawnmower 105 based on (i) the position signal and (ii) the calibration information. For example, the first electronic processor 205 may determine the amount of time it takes for the position signal to travel from the satellite 150 to the robotic lawnmower 105. In some embodiments, the first electronic processor 205 is configured to compare first phase information included in the calibration information from the base station device 145 to second phase information of the first position signal received by the robotic lawnmower 105 in accordance with RTK GNSS principles to help determine the current position of the robotic lawnmower 105.
In some embodiments, the first electronic processor 205 may determine/track multiple locations of the robotic lawnmower 105 as the robotic lawnmower 105 moves in the work area 155 during creation of the virtual boundary. In some embodiments, the first electronic processor 205 may determine the current location of the robotic lawnmower 105 and time stamp each of the plurality of locations of the robotic lawnmower 105 continuously or periodically (at predetermined time intervals, such as every 100 milliseconds, every 500 milliseconds, every second, etc.). The plurality of locations and associated time stamps of robotic lawnmower 105 can be saved by first electronic processor 205 in first memory 210 and/or can be transferred to another device (e.g., server 152, external device 115, etc.) for storage and/or use.
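The periodic, time-stamped position tracking described above might look like the following sketch, where `get_position` is a hypothetical stand-in for the RTK-based position determination and the sampling interval is configurable (e.g., 100 ms, 500 ms, 1 s):

```python
import time

def track_positions(get_position, interval_s=0.5, n_samples=5):
    """Periodically sample and time-stamp the mower's position during
    virtual boundary creation. Returns a log of dicts that could be
    stored in memory or transferred to another device."""
    log = []
    for _ in range(n_samples):
        log.append({"t": time.monotonic(), "pos": get_position()})
        time.sleep(interval_s)  # wait for the next sampling instant
    return log
```

The time stamps are what later allow each relative-distance measurement to be matched with the mower position recorded at (or nearest to) the moment the measurement was captured.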
At block 510, the second electronic processor 305 of the external device 115 (e.g., smart phone 115) controls a data capture device (e.g., camera 330, object detection device, etc.) of the external device 115 to capture a plurality of data samples as the external device 115 moves in the work area 155. Each of the plurality of data samples may include data indicative of a relative position of the robotic lawnmower 105 with respect to the external device 115, which may be determined using a first positioning technique (e.g., image/data analysis based on an approximate expected shape and size of the robotic lawnmower 105 or an expected design of the fiducial marker 160, radar-based analysis technique, etc.). Multiple data samples may be captured by any one or combination of different devices. A number of example data capture devices and types of data samples captured by each example data capture device are provided immediately below.
In some examples, the second electronic processor 305 is configured to receive a plurality of images captured by the camera 330 as the target moves in the work area 155. Each of the plurality of images may include a fiducial marker 160 of the robotic lawnmower 105. The plurality of images may comprise a plurality of single images, each captured in response to user input, or may comprise a series of successive images/frames in the form of video.
In some examples, the second electronic processor 305 is configured to receive a plurality of data samples captured by a millimeter wave radar device of the external device 115 as the target moves in the work area 155. Each of the plurality of data samples may include data indicative of a respective position of the robotic lawnmower 105 relative to the external device 115 (i.e., a distance between the robotic lawnmower 105 and the external device 115).
In some examples, the second electronic processor 305 is configured to receive a plurality of data samples captured by a receiver of the second network interface 315 from the robotic lawnmower 105 configured to transmit a beacon signal for determining distance and direction as the user moves in the work area. Each of the plurality of data samples may include data indicative of a respective position of the robotic lawnmower 105 relative to the external device 115 (i.e., a distance between the robotic lawnmower 105 and the external device 115). For example, the receiver may include one or more directional antennas such that the second electronic processor 305 may determine a distance between the robotic lawnmower 105 and the external device 115 carried by the user (e.g., based on a Received Signal Strength Indication (RSSI) of the beacon signal) and a direction from which the beacon signal is received.
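Converting an RSSI reading into an approximate range, as described above, is commonly done with a log-distance path-loss model. A minimal sketch follows; the calibration constants (RSSI at one metre, path-loss exponent) are assumed values the patent does not specify.

```python
# Hedged sketch of RSSI-to-range estimation using the standard
# log-distance path-loss model. rssi_at_1m and the exponent n are
# assumed calibration values, not taken from the patent.
def rssi_to_distance(rssi_dbm, rssi_at_1m=-40.0, n=2.0):
    """Estimate distance in metres from received signal strength."""
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10.0 * n))
```

In practice, RSSI is noisy, so a real system would likely smooth several readings before estimating range; the directional-antenna bearing mentioned above would supply the second component of the relative position.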
In some examples, the second electronic processor 305 is configured to receive a plurality of data samples captured by a laser rangefinder or other ranging device of the external device 115 as the user moves in the work area 155. Each of the plurality of data samples may include data indicative of a respective position of the robotic lawnmower 105 relative to the external device 115 (i.e., a distance between the robotic lawnmower 105 and the external device 115). In some instances, the laser rangefinder may not be able to identify/authenticate a particular target by itself (as may be done using the data capture devices included in the examples above); instead, the user may obtain distance measurements of the robotic lawnmower 105 by aligning the laser rangefinder of the external device 115 so that it faces the robotic lawnmower 105.
While the external device 115 may use any one or combination of a number of data capture devices to capture data indicative of the respective positions of the robotic lawnmower 105 relative to the external device 115 as the user moves the external device 115 around the perimeter of the work area 155, the following explanation of fig. 5 and 6 primarily relates to use cases in which the data capture device is a camera 330 and the captured data samples are images/frames of video captured by the camera 330.
As shown in the example use case of fig. 6, user 605 may move around the boundary of work area 155 while capturing an image of fiducial marker 160 with camera 330 to create virtual boundary 610. Fig. 6 shows three example locations of user 605 where user 605 instructs external device 115 (via user input) to capture a single image while pointing camera 330 of external device 115 at fiducial marker 160 of robotic lawnmower 105. In some embodiments, camera 330 may be controlled to capture video of fiducial marker 160 as user 605 moves along the boundary of work area 155. In some embodiments, the second electronic processor 305 may continuously or periodically (at predetermined time intervals, such as every 100 milliseconds, every 500 milliseconds, every second, etc.) mark or store one or more still images/frames of video and time stamp the still images/frames to indicate when they were captured.
Blocks 505 and 510 may be repeated until user 605 has captured data around the boundary of the work area 155 as desired, as indicated by the dashed line returning from block 510 to block 505 in fig. 5. In some embodiments, the second electronic processor 305 receives user input from the user 605 via the second input device 320 indicating that all of the desired captured data has been captured (e.g., indicating that the user 605 has captured an image from a location around the entire closed boundary).
Once all of the desired data has been captured, at block 515, a plurality of locations of the external device 115 may be determined using a first positioning technique based on any one or a combination of captured data samples from the data capture devices previously described herein. In some examples, the plurality of locations of the external device 115 are determined using a plurality of data samples based on: (i) The relative position of robotic lawnmower 105 in a respective data sample of the plurality of data samples (e.g., as determined using a first positioning technique that can determine the relative positioning between two targets); and (ii) the absolute position of the robotic lawnmower 105 at a time corresponding to the time the corresponding data sample was captured (e.g., as determined by the RTK GNSS receiver of the robotic lawnmower 105). For example, the plurality of locations of the external device 115 are determined using the plurality of images based on: (i) The position of fiducial marker 160 in a respective image of the plurality of images; and (ii) the position of robotic lawnmower 105 at a time corresponding to the time at which the respective image was captured. For example, using the position of robotic lawnmower 105 determined at a given time and the position and orientation of fiducial marker 160 in the respective image corresponding to the given time, a camera pose estimate may be made to determine the position of external device 115 (i.e., the distance between robotic lawnmower 105 and external device 115) at the time the respective image was captured.
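The fusion step of block 515 can be illustrated with simple plane geometry. This is a minimal sketch under stated assumptions: the camera pose estimate is taken to yield the external device's offset in the mower's body frame, and the RTK GNSS fix is taken to yield the mower's absolute position and heading; the function and frame names are illustrative, not from the patent.

```python
import math

# Hedged sketch of block 515: combine the mower's absolute pose (from
# RTK GNSS) with the camera-derived relative offset of the external
# device to obtain the external device's absolute position. The
# body-frame offset is rotated into the world frame, then translated.
def device_position(mower_xy, mower_heading_rad, offset_in_mower_frame):
    mx, my = mower_xy
    dx, dy = offset_in_mower_frame
    c, s = math.cos(mower_heading_rad), math.sin(mower_heading_rad)
    return (mx + c * dx - s * dy, my + s * dx + c * dy)
```

A full implementation would obtain `offset_in_mower_frame` from a camera pose estimate against the fiducial marker (e.g., a PnP solve), which is beyond this sketch.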
In some instances, fiducial markers 160 may not be used or may not be identifiable by certain data capture devices. In such instances, the data capture device of the external device 115 may identify the robotic lawnmower 105 in the captured data samples using image/data analysis based on the approximate expected shape and size of the robotic lawnmower 105, radar-based analysis techniques, and so forth.
The server 152, the electronic processors 205, 305, 405 of any device, or a combination thereof may perform block 515 to determine one or more of the plurality of locations of the external device 115. In some embodiments, information may be shared between devices of communication system 100 to allow different devices to perform the steps of method 500, such as determining one or more of the plurality of locations of external device 115.
For example, the second electronic processor 305 of the external device 115 may perform block 515 by receiving from the robotic lawnmower 105 a plurality of positions of the robotic lawnmower 105 as determined by the robotic lawnmower 105 (at block 505). As previously described herein, each of the plurality of locations of the robotic lawnmower 105 may include a respective timestamp that is also received by the external device 115. The second electronic processor 305 may then determine the plurality of locations of the external device 115 based on: (i) The position of fiducial marker 160 in a respective image of the plurality of images; and (ii) the position of robotic lawnmower 105 at a time corresponding to the time at which the respective image was captured. As is apparent from the above explanation, the plurality of locations (and associated time stamps) of the robotic lawnmower 105 received from the robotic lawnmower 105 are used to determine the location of the robotic lawnmower 105 at a time corresponding to the time at which the respective image was captured.
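The timestamp matching described above amounts to a nearest-timestamp lookup into the mower's position log. A hedged sketch, assuming a simple list-of-tuples layout that the patent does not prescribe:

```python
# Sketch of matching a captured image's timestamp to the mower position
# whose timestamp is closest, as the second electronic processor might
# do when fusing the two data streams. The data layout is an assumption.
def position_at(mower_track, image_ts):
    """mower_track: list of (timestamp, position); return nearest position."""
    return min(mower_track, key=lambda sample: abs(sample[0] - image_ts))[1]
```

A production version might instead interpolate between the two bracketing samples when the mower is moving, rather than snapping to the nearest one.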
As another example, a device other than the external device 115 may perform block 515 to determine one or more of the plurality of locations of the external device 115. In some embodiments, the second electronic processor 305 of the external device 115 may timestamp each of the plurality of images with a time corresponding to the time the corresponding image was captured. The second electronic processor 305 may also transmit the plurality of images and the respective associated timestamps to at least one of the robotic lawnmower 105 and a remote device (e.g., the server 152, the base station device 145, another external device 115, etc.). In some embodiments, at least one of the first electronic processor 205 of the robotic lawnmower 105 and the other electronic processor of the remote device is configured to determine a plurality of positions of the external device 115 (at block 515).
In some embodiments, at block 515, one or more of the plurality of locations of the external device 115 may be discarded, for example, if they are the same as other locations. For example, a large number of images may be captured while the external device 115 remains at the same location. Accordingly, redundant locations may be discarded such that only one of the redundant locations is stored as a waypoint (at block 520).
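The de-duplication just described can be sketched as a minimum-spacing filter. The 0.5 m spacing is an assumed value for illustration; the patent does not specify a threshold.

```python
import math

# Hedged sketch of discarding redundant locations: keep a location only
# if it is farther than min_spacing from the last kept location.
def drop_redundant(locations, min_spacing=0.5):
    kept = []
    for x, y in locations:
        if not kept or math.hypot(x - kept[-1][0], y - kept[-1][1]) > min_spacing:
            kept.append((x, y))
    return kept
```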
At block 520, each of the plurality of locations of the external device 115 is stored as a waypoint. The server 152, the electronic processors 205, 305, 405 of any device, or a combination thereof may perform block 520 to store the location of the external device 115 as a waypoint.
At block 525, virtual boundary 610 is generated using the waypoints. The server 152, the electronic processors 205, 305, 405 of any device, or a combination thereof may execute block 525 to generate the virtual boundary 610 using the waypoints. For example, the waypoint data may be shared between the devices such that any device may generate virtual boundary 610. Virtual boundary 610 may be generated by connecting adjacent waypoints using approximately straight lines to create the enclosed work area 155. In some embodiments, a cubic spline is used to smooth the line between adjacent waypoints or through a plurality of waypoints. In some embodiments, the device generating virtual boundary 610 may determine that certain waypoints are redundant and/or that one or more waypoints lie inside or outside of the enclosed area defined by the remaining waypoints. In response to identifying such waypoints, the device may exclude those waypoints from the determination to generate virtual boundary 610. In addition, the user 605 may selectively remove waypoints as desired. For example, the smartphone 115 may retrieve the waypoints from the second memory 310 or receive the waypoints from another device and may display the locations of the waypoints on the second display 325. In response to user input on the second display 325, the smartphone 115 may delete the waypoint selected by the user 605.
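The simplest form of block 525, connecting adjacent waypoints with straight segments and closing the loop, can be sketched as below. Spline smoothing is omitted; this is an illustration of the closed-polygon case only.

```python
# Hedged sketch of block 525's straight-line case: connect each
# waypoint to the next and close the loop back to the first waypoint,
# yielding the virtual boundary as a list of polygon edges.
def boundary_segments(waypoints):
    n = len(waypoints)
    return [(waypoints[i], waypoints[(i + 1) % n]) for i in range(n)]
```

A spline-smoothed variant would fit a periodic cubic spline through the same waypoints instead of emitting straight edges.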
When virtual boundary 610 is generated by a device other than robotic lawnmower 105, the device generating virtual boundary 610 may transmit information indicative of virtual boundary 610 to robotic lawnmower 105. The robotic lawnmower 105 (specifically, the first electronic processor 205) may be configured to use the information indicative of the virtual boundary 610 and the determined current position of the robotic lawnmower 105 to control the robotic lawnmower 105 to remain in the work area 155 during operation of the robotic lawnmower 105 (e.g., during a mowing operation).
In some embodiments, the smartphone 115 may provide an extended user interface with respect to other devices in the system 100 (e.g., the robotic lawnmower 105). In some embodiments, the collected waypoints and/or virtual boundaries 610 may be displayed on the second display 325. In some embodiments, the second electronic processor 305 of the smartphone 115 may receive, via the second display 325, user input indicating whether certain waypoints and/or portions of the virtual boundary 610 correspond to obstacles or the like within the perimeter virtual boundary 610 around the edge of the work area 155.
In some embodiments, the method 500 may be repeated to generate more than one virtual boundary and/or modify existing virtual boundaries. For example, a perimeter virtual boundary 610 may be created at an outer edge of the work area 155 to define the work area 155 in which the robotic lawnmower 105 should operate. One or more additional virtual boundaries may be created within perimeter virtual boundary 610 in a similar manner, for example, to enclose an object/region within the main virtual boundary in which robotic lawnmower 105 should not operate. For example, such objects/regions may include one or more trees, swimming pools, garden boundaries, flower beds, and the like. As described above, in some embodiments, the second electronic processor 305 of the smartphone 115 may receive, via the second display 325, user input indicating whether certain waypoints and/or portions of virtual boundaries (e.g., additional virtual boundaries) correspond to obstacles within the perimeter virtual boundary 610. Additionally or alternatively, the device generating the virtual boundary may determine that the waypoints of the additional virtual boundary lie within the perimeter virtual boundary 610. In response to this determination, and based on the assumption that the user desires to define a "forbidden" zone, the device generating the virtual boundary may generate the additional virtual boundary such that robotic lawnmower 105 is configured to stay out of a second region within the additional virtual boundary. In other words, the virtual boundaries may be generated such that robotic lawnmower 105 stays within perimeter virtual boundary 610 and outside of the additional virtual boundary. In some embodiments, this area between virtual boundaries in which robotic lawnmower 105 is configured to travel may be referred to as work area 155.
In other embodiments, the job area 155 may include the entire area within the perimeter virtual boundary 610 (e.g., including the "forbidden" area).
Although the above examples explain generating additional virtual boundaries within perimeter virtual boundary 610 after perimeter virtual boundary 610 has been generated, in some embodiments, a device generating virtual boundaries may generally determine whether an area to be mowed by robotic lawnmower 105 is within a virtual boundary or outside a virtual boundary based on the relative relationship of the virtual boundaries that have been generated. For example, in contrast to the examples described above, an additional virtual boundary may be generated prior to perimeter virtual boundary 610. In this case, the device generating the virtual boundary may determine that the waypoints of the later-generated perimeter virtual boundary 610 are located outside the previously generated additional virtual boundary. In response to this determination, and based on the assumption that the user desires to define a "forbidden" zone using the set of inner waypoints, the device generating the virtual boundary may generate the additional virtual boundary using the inner waypoints and generate perimeter virtual boundary 610 using the outer waypoints. In some embodiments, the function/purpose of each set of waypoints may be adjusted/controlled via user input on the smartphone 115, as previously explained herein.
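The containment test underlying this inside/outside determination is a standard point-in-polygon check. A hedged sketch using ray casting follows; the vertex ordering and test points are illustrative only.

```python
# Sketch of the containment test used to decide that one set of
# waypoints lies inside another boundary: standard ray-casting
# point-in-polygon. Counts crossings of a horizontal ray from pt.
def point_in_polygon(pt, polygon):
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y) and x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
            inside = not inside
    return inside
```

Classifying a whole waypoint set as "inner" could then, for example, test whether all (or a majority) of its waypoints fall inside the other boundary.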
In some embodiments, a Graphical User Interface (GUI) on the second display 325 may display user selectable buttons that enable/disable image capture during creation of the virtual boundary 610. For example, actuation of a user selectable button may enable a "virtual boundary creation" mode of the external device 115 to perform at least some steps of the method 500. In response to entering the "virtual boundary creation" mode, the external device 115 may send a notification/command to other devices in the system 100 (e.g., the robotic lawnmower 105) to indicate that the user 605 intends to create the virtual boundary 610. In response to receiving the notification/command from the external device 115, the robotic lawnmower 105 may control operation of the at least one wheel motor 235 to control movement of the robotic lawnmower 105 such that the robotic lawnmower 105 moves toward the external device 115 as the external device 115 moves in the work area 155. For example, the robotic lawnmower 105 may be configured to "follow" the external device 115 as the user 605 moves the external device 115. The "follow-up" action of robotic lawnmower 105 may help ensure that fiducial markers 160 are sufficiently included in the plurality of images captured by external device 115 so that a camera pose estimate may be determined based on each of the plurality of images.
In some embodiments, the robotic lawnmower 105 may be configured to "follow" the external device 115 by receiving the current location of the external device 115 from the external device 115 (e.g., by RF communication via an RF transceiver). The current location of the external device 115 may be a relative positioning of the external device 115 with respect to the robotic lawnmower 105 (e.g., as determined by a data capture device of the external device 115 using a first positioning technique) and/or may be an absolute location of the external device 115 (e.g., as determined by a Global Positioning System (GPS) receiver of the external device 115). For example, as previously explained herein, the external device 115 may include a GPS receiver configured to receive data for determining a current location of the external device 115. The robotic lawnmower 105 may also include its own GPS receiver. Thus, the first electronic processor 205 of the robotic lawnmower 105 can control movement of the robotic lawnmower 105 toward the current position of the external device 115. In some embodiments, robotic lawnmower 105 controls itself to move a predetermined distance (e.g., 1-3 meters) from external device 115 to be close enough to external device 115 so that fiducial markers 160 are sufficiently visible in the captured image, but far enough away from external device 115 so as not to crowd user 605.
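The GPS-based "follow at a standoff distance" behaviour can be sketched as computing a goal point a fixed distance short of the external device. The 2 m default is an assumed value within the 1-3 m range mentioned above; the function name is illustrative.

```python
import math

# Hedged sketch of the GPS "follow" behaviour: aim for a point a fixed
# standoff distance short of the external device, so the mower stays
# close enough to see the fiducial marker but does not crowd the user.
def follow_goal(mower_xy, device_xy, standoff=2.0):
    dx, dy = device_xy[0] - mower_xy[0], device_xy[1] - mower_xy[1]
    dist = math.hypot(dx, dy)
    if dist <= standoff:
        return mower_xy  # already close enough; hold position
    scale = (dist - standoff) / dist
    return (mower_xy[0] + dx * scale, mower_xy[1] + dy * scale)
```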
In some embodiments, robotic lawnmower 105 may be configured to "follow" external device 115 by: a Received Signal Strength Indication (RSSI) of the signal output by the external device 115 is determined and movement of the robotic lawnmower 105 is controlled such that the RSSI of the signal output by the external device 115 is equal to or above a predetermined RSSI threshold. For example, when external device 115 enters a "virtual boundary creation" mode, external device 115 may receive user input indicating that user 605 intends to move external device 115 in a clockwise manner around the boundary of work area 155. The external device 115 may transmit information to the robotic lawnmower 105 to indicate that the user 605 intends to move the external device 115 in a clockwise manner around the boundary of the work area 155. In embodiments where fiducial marker 160 is located on a front top surface of housing 125, user 605 may initially set robotic lawnmower 105 to face user 605. When the user 605 moves the external device 115 around the boundary, the external device 115 may output a beacon signal, for example, from the second network interface 315. The robotic lawnmower 105 may remain stationary as long as the RSSI of the beacon signal received by the robotic lawnmower 105 is at or above the predetermined RSSI threshold. In some embodiments, robotic lawnmower 105 may include multiple sensors/receivers configured to receive the beacon signal and may be able to determine a direction from which the beacon signal is received. In such an embodiment, the first electronic processor 205 of the robotic lawnmower 105 can control the robotic lawnmower 105 to pivot in place to face in the direction of the external device 115.
In response to the first electronic processor 205 determining that the RSSI of the beacon signal has decreased below the predetermined RSSI threshold, the first electronic processor 205 controls the robotic lawnmower 105 to gradually turn clockwise and/or gradually move forward until the RSSI of the beacon signal increases to be equal to or above the predetermined RSSI threshold. This control of the robotic lawnmower 105 may be repeated during the virtual boundary creation process until the external device 115 exits the "virtual boundary creation" mode and sends a notification/command to the robotic lawnmower 105 indicating that the virtual boundary 610 is no longer created.
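One iteration of the RSSI follow loop described in the two paragraphs above can be sketched as below. The hardware actions are stubbed as injected callables, and the -70 dBm threshold is an assumed value; this is not the patented control law.

```python
# Hedged sketch of one step of the RSSI follow loop: hold position
# while the beacon's RSSI is at or above the threshold; otherwise turn
# clockwise and advance until the RSSI recovers. Hardware calls are
# injected so the logic can be exercised without a robot.
def rssi_follow_step(read_rssi, turn_clockwise, move_forward,
                     threshold_dbm=-70.0):
    """Return the action taken for one control step."""
    if read_rssi() >= threshold_dbm:
        return "hold"
    turn_clockwise()
    move_forward()
    return "pursue"
```

A real controller would repeat such steps, as the text describes, until the external device exits the "virtual boundary creation" mode.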
In some embodiments, the notification/command from the external device 115 to the robotic lawnmower 105 indicating that the external device 115 is being used to create the virtual boundary 610 includes instructions on how to control the robotic lawnmower 105 to move the robotic lawnmower 105 towards the external device 115. For example, when the GPS receiver of the external device 115 is disabled, the command may instruct the robotic lawnmower 105 to "follow" the external device 115 using the RSSI monitoring process explained above. However, when the GPS receiver of the external device 115 is enabled, the command may instruct the robotic lawnmower 105 to control itself to move to a predetermined distance from the GPS location of the external device 115. The GPS location of the external device 115 may also be included in the command sent to the robotic lawnmower 105. In addition, as indicated by the RSSI example above, the command may indicate whether the user 605 intends to move the external device 115 clockwise or counterclockwise around the boundary of the work area 155.
Although the fiducial marker 160 is described as being located on the housing 125 of the robotic lawnmower 105 in the above embodiments, in some embodiments the fiducial marker 160 may be located on, near, or connected to the base station device 145 (and/or the docking station 110 that may have its own RTK GNSS receiver). In some examples, the fiducial marker 160 may be placed at any other desired location, the location/position of which may be determined by placing (at least temporarily) a device (e.g., robotic lawnmower 105, base station device 145, etc.) with an RTK GNSS receiver at the desired location. In embodiments where fiducial marker 160 is in a stationary position (e.g., not on robotic lawnmower 105), fiducial marker 160 may be larger than fiducial marker 160 on robotic lawnmower 105 to ensure that fiducial marker 160 is adequately captured in the plurality of images by external device 115 as external device 115 moves around the boundary of work area 155. In instances where fiducial marker 160 is located at a location associated with base station device 145, base station device 145 may determine its location for determining a relative location between external device 115 and base station device 145 at block 505. Such an example may be similar in other respects to the example previously described herein with fiducial markers 160 located on robotic lawnmower 105. For example, blocks 510, 515, 520, and 525 of fig. 5 may be substantially similar to that previously described, except that the location of base station device 145 is used instead of the location(s) of robotic lawnmower 105.
In some examples, fiducial marker 160 may not be used. In such instances, the data capture device of the external device 115 may identify the target of interest (e.g., robotic lawnmower 105, base station device 145, docking station 110, etc.) using, for example, image/data analysis based on the approximate expected shape and size of the target of interest. Such an example may be similar in other respects to the example previously described herein using fiducial markers 160 of robotic lawnmower 105. For example, blocks 505, 510, 515, 520, and 525 of fig. 5 may be substantially similar to that previously described, except that the relative position of the object of interest with respect to the external device 115 is determined by using image/data analysis, radar-based analysis techniques, etc., rather than using fiducial markers 160 on the object of interest. The absolute position of the target of interest may be determined by placing (at least temporarily) a device with an RTK GNSS receiver (e.g., robotic lawnmower 105, base station device 145, etc.) at the position of the target of interest or by using the device with an RTK GNSS receiver as the target of interest.
Fig. 7 illustrates a flow chart of a method 700 that may be performed by the first electronic processor 205 of the robotic lawnmower 105 and/or another electronic processor of a device secured to the robotic lawnmower 105 (e.g., the second electronic processor 305 of the external device 115 when the external device 115 is secured to the robotic lawnmower 105 via the securing device 905) to create a virtual boundary to limit the robotic lawnmower during operation of the robotic lawnmower 105. Although a particular order of processing steps, signal reception and/or signal transmission is indicated by way of example in fig. 7, the timing and order of such steps, reception and transmission may be varied where appropriate without negating the objects and advantages of the examples set forth in detail throughout the remainder of this disclosure.
The following explanation is primarily directed to the robotic lawnmower 105 (and/or the external device 115 secured to the robotic lawnmower 105) performing the steps of the method 700 to create a virtual boundary. However, as explained below, in some examples, other devices may perform one or more steps of method 700. For example, the server 152, the electronic processors 205, 305, 405 of any device, or a combination thereof may perform block 720 to generate virtual boundaries using the collected data. For example, the collected data may be shared between devices such that any device may generate virtual boundaries. In addition, the base station device 145 may facilitate performance of the method 700 by providing a reference location to the robotic lawnmower 105 to allow the robotic lawnmower 105 to more accurately determine the location of the robotic lawnmower 105, as explained previously herein. In some examples, method 700 is substantially similar to method 500 of fig. 5, except that robotic lawnmower 105 (and/or external device 115 secured to robotic lawnmower 105) may be used to determine the waypoints for generating the virtual boundary instead of using external device 115 carried by user 605 to determine the waypoints for generating the virtual boundary. Accordingly, many aspects of the method 500 described above with respect to fig. 5 (e.g., the "follow-up" action of the robotic lawnmower 105) may be applied to the method 700 explained below.
At block 705, as the target moves in the work area 155 (e.g., as the user 605 moves the target around the boundary to define a virtual boundary), the first electronic processor 205 of the robotic lawnmower 105 determines a plurality of relative distances (i.e., vectors) between the robotic garden tool 105 and the target (e.g., the user 605). As shown in the example use case of fig. 8, user 605 (with or without fiducial markers and/or another device) may move around the boundary of work area 155 while robotic lawnmower 105 captures/determines the relative distance between robotic lawnmower 105 and user 605 in order to create virtual boundary 810. Fig. 8 shows three example locations of user 605 where robotic lawnmower 105 may determine a respective relative distance (i.e., vector) between robotic lawnmower 105 and user 605. In some examples, the first electronic processor 205 may determine the respective relative distances as the user 605 moves along the boundary of the work area 155. In some examples, the first electronic processor 205 may be configured to continuously or periodically (at predetermined time intervals, such as every 100 milliseconds, every 500 milliseconds, every second, etc.) mark or store one or more data samples and/or still images/frames of video and time stamp the data samples and/or still images/frames to indicate when the data samples and/or still images/frames were captured. In instances where the external device 115 is secured to the robotic lawnmower 105, the second electronic processor 305 of the external device 115 may be configured to continuously or periodically tag or store one or more data samples and/or still images/frames of video and time stamp the data samples and/or still images/frames to indicate when the data samples and/or still images/frames were captured.
One or more data samples and/or still images/frames of video may be captured by any one or combination of different devices. A number of example data capture devices and captured data types are provided immediately below.
In some examples, the first electronic processor 205 is configured to receive a plurality of images captured by the camera 250 or 330 as the target moves in the work area 155. Each image of the plurality of images may include the target. The first electronic processor 205 may be configured to determine each of the plurality of relative distances based on the position and orientation of the object in the respective one of the plurality of images. As previously explained herein, the camera 250 or 330 from which the plurality of images are received may be integrated into the housing 125 of the robotic lawnmower 105. Additionally or alternatively, the camera 250 or 330 from which the plurality of images are received may be integrated into an external device 115 that is secured to the robotic lawnmower 105 using a securing device 905.
In some examples, the first electronic processor 205 is configured to receive a plurality of data samples captured by a millimeter wave radar device (i.e., object detection device 255) as the target moves in the work area 155. Each of the plurality of data samples may include data indicative of a respective position of the target relative to the robotic lawnmower 105. The first electronic processor 205 may be configured to determine each of the plurality of relative distances based on a respective location of the target in each of the plurality of data samples.
In some examples, the first electronic processor 205 is configured to receive a plurality of data samples captured by a receiver of the first network interface 215 from a device carried by a user, the device configured to transmit beacon signals for range and direction determination as the target moves in the work area 155. Each of the plurality of data samples may include data indicative of a respective position of the target relative to the robotic lawnmower 105. For example, the receiver may include one or more directional antennas such that the first electronic processor 205 may determine a distance between the robotic lawnmower 105 and a device carried by the user (e.g., based on a Received Signal Strength Indication (RSSI) of the beacon signal) and a direction from which the beacon signal is received.
In some examples, the first electronic processor 205 is configured to receive a plurality of data samples captured by a laser rangefinder or other rangefinder device of the robotic lawnmower 105 (or fixed to the robotic lawnmower 105) as the target moves in the work area 155. Each of the plurality of data samples may include data indicative of a respective position of the target relative to the robotic lawnmower 105 (i.e., a distance between the robotic lawnmower 105 and the target). In some examples, the laser rangefinder may not be configured to identify/authenticate different targets by itself, but distance measurements of the target may nevertheless be obtained by controlling the robotic lawnmower 105 such that the laser rangefinder faces the target. For example, the user 605 may carry a beacon device as described in the previous examples to allow the robotic lawnmower 105 to determine the direction from which the beacon signal is received. The first electronic processor 205 may then control the robotic lawnmower 105 to move such that the laser rangefinder faces in the direction from which the beacon signal was received. In some examples, such control of the robotic lawnmower 105 is similar to the "follow" action of the robotic lawnmower 105 described previously herein with respect to fig. 5. In some examples, if the first electronic processor 205 determines that the difference between successive distance measurements of the target is above a predetermined amount (e.g., a distance that the user 605 is unlikely to have moved in the time between successive measurements), the first electronic processor 205 may determine that the laser rangefinder has accidentally measured a distance to a different target other than the target intended to be tracked around the boundary.
In such instances, the robotic lawnmower 105 may output a notification to indicate that the user 605 should check for other targets in the work area 155 and/or recalibrate/restart the virtual boundary generation process.
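The plausibility check on successive rangefinder measurements can be sketched as follows. This is a simplified illustration; the speed limit and function name are assumptions, not values from the source:

```python
def looks_like_wrong_target(prev_distance_m, new_distance_m, dt_s, max_speed_m_s=3.0):
    """Flag a distance sample whose implied target speed exceeds what a walking
    user could plausibly cover between consecutive measurements, suggesting the
    rangefinder has accidentally ranged a different object."""
    return abs(new_distance_m - prev_distance_m) > max_speed_m_s * dt_s
```

A flagged sample would trigger the notification described above rather than being folded into the boundary data.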
The target may be any of a number of different types of targets or combinations of targets. In some examples, the target may include a fiducial marker carried by the user (e.g., a passive target that does not emit a signal). For example, the fiducial marker may be similar to the fiducial marker 160 previously described herein, but may be printed on a piece of paper, cardboard, a flag, etc., carried by the user 605. As another example, the fiducial marker may be printed on a shirt or other wearable item worn by the user 605. As yet another example, the fiducial marker may be displayed on a tablet computer (i.e., the external device 115) carried by the user 605. In some examples, the target may include the human user 605 itself (e.g., a passive target that does not emit a signal), who may or may not carry a fiducial marker or other device. In some examples, the target may be the external device 115 or another device (e.g., an active target that transmits a signal) configured to transmit a beacon signal as the user 605 moves the external device 115 or other device around the boundary. In some examples, the robotic lawnmower 105 may not include the fiducial marker 160 when performing the method 700 (e.g., as indicated by the example use case shown in fig. 8).
In instances where the target comprises a human user 605 and the image is captured by the camera 250 or 330, the first electronic processor 205 may be configured to identify the target (i.e., the human user 605) within each of the plurality of images using image analysis techniques based on the expected shape of the human user 605. Similarly, in instances where the target comprises a human user 605 and the data samples are captured by the millimeter wave radar device, the first electronic processor 205, the millimeter wave radar device, or both the first electronic processor 205 and the millimeter wave radar device may be configured to identify the target within each of the plurality of data samples based on the expected shape of the human user 605. In instances where the target includes a fiducial marker and the image is captured by the camera 250 or 330, the first electronic processor 205 may be configured to identify the target (i.e., the fiducial marker) within each of the plurality of images using image analysis techniques based on the intended design of the fiducial marker.
In some examples, the first electronic processor 205 (or the second electronic processor 305 when the external device 115 is secured to the robotic lawnmower 105) is configured to timestamp each of the plurality of relative distances (and/or each piece of captured data) with a respective time corresponding to the time at which the data that allows the determination of the relative distance was captured. The relative distance and corresponding timestamp may be stored in the first memory 210 (or the second memory 310) and/or transmitted to another device (e.g., the external device 115, the robotic lawnmower 105, a remote device such as the server 152, etc.).
In the same or similar manner as previously described herein with respect to fig. 5, in some examples, the robotic lawnmower 105 may control operation of the at least one wheel motor 235 to control movement of the robotic lawnmower 105 such that the robotic lawnmower 105 moves toward the target (e.g., the user 605 itself and/or a target held by the user) as the target moves in the work area 155. For example, the robotic lawnmower 105 may be configured to "follow" the target as the target moves around the boundary. The "follow" action of the robotic lawnmower 105 may help ensure that the target is sufficiently included in the data captured by the robotic lawnmower 105 (e.g., the plurality of images, plurality of data samples, etc., captured by the robotic lawnmower 105 or by an external device 115 that is fixed to the robotic lawnmower 105) such that the captured data allows a corresponding relative distance between the robotic lawnmower 105 and the target to be determined.
In some examples, the first electronic processor 205 is configured to control operation of the at least one wheel motor 235 to control movement of the robotic lawnmower 105, such that the robotic lawnmower 105 moves toward the target as the target moves in the work area 155, by determining a Received Signal Strength Indication (RSSI) of the signal output by the target and controlling movement of the robotic lawnmower 105 such that the RSSI of the signal output by the target is equal to or above a predetermined RSSI threshold, as explained previously herein with respect to fig. 5.
In some examples, the first electronic processor 205 is configured to control operation of the at least one wheel motor 235 to control movement of the robotic lawnmower 105, such that the robotic lawnmower 105 moves toward the target as the target moves in the work area 155, by determining that a relative distance of the plurality of relative distances is greater than or equal to a predetermined threshold and controlling movement of the robotic lawnmower 105 toward the target until the relative distance between the robotic lawnmower 105 and the target decreases below the predetermined threshold. In some examples, the predetermined threshold may be preprogrammed based on a maximum detection range/viewing distance of the millimeter wave radar device, the cameras 250, 330, or another device configured to capture data for determining the relative distance between the robotic lawnmower 105 and the target. For example, the predetermined threshold may be preprogrammed to 60%, 70%, 80%, etc. of the maximum detection range/viewing distance of the millimeter wave radar device, the cameras 250, 330, or another device in an attempt to move the robotic lawnmower 105 such that the target always remains within the maximum detection range/viewing distance. In some examples, in response to the relative distance being greater than or equal to the predetermined threshold, the first electronic processor 205 controls the robotic lawnmower 105 to move in a direction toward the target based on captured data indicative of the target's direction relative to the robotic lawnmower 105 (e.g., using image analysis techniques, millimeter wave radar analysis techniques, etc.).
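A minimal sketch of this distance-threshold "follow" behavior follows. The 70% default fraction, fixed step size, and function names are illustrative assumptions:

```python
def should_follow(relative_distance_m, max_range_m, fraction=0.7):
    # Drive toward the target once it reaches a preprogrammed fraction
    # (e.g., 60-80%) of the sensor's maximum detection range.
    return relative_distance_m >= fraction * max_range_m

def follow_until_close(relative_distance_m, max_range_m, fraction=0.7, step_m=0.5):
    """Crude control loop: close in on the target by a fixed increment per
    iteration until the relative distance drops below the threshold."""
    steps = 0
    while should_follow(relative_distance_m, max_range_m, fraction):
        relative_distance_m = max(relative_distance_m - step_m, 0.0)
        steps += 1
    return relative_distance_m, steps
```

In practice the "step" would be a wheel-motor command issued while fresh range measurements arrive, not a fixed decrement.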
In some examples, the first electronic processor 205 is configured to control operation of the at least one wheel motor 235 to control movement of the robotic lawnmower 105 such that the robotic lawnmower 105 moves toward the target by receiving commands from the target (e.g., when the target is a device carried by the user 605) as the target moves in the work area 155. The command may include instructions on how to control the robotic lawnmower 105 to move the robotic lawnmower 105 towards the target. Controlling robotic lawnmower 105 in this manner has been previously explained herein with respect to fig. 5.
At block 710, the first electronic processor 205 determines one or more locations of the robotic garden tool 105 as the target moves in the work area 155 (and as the data that allows determining the plurality of relative distances is captured). In some instances (e.g., when the camera 250 of the robotic lawnmower 105 is a 360 degree camera and the work area 155 is relatively small), the robotic lawnmower 105 may not move during execution of blocks 705 and 710. In other words, the robotic lawnmower 105 may not perform a "follow" action because the robotic lawnmower 105 may be able to capture data about the target for determining the multiple relative distances without moving (e.g., the target remains within the field of view and the maximum detection range/viewing distance such that accurate analysis can be performed on the captured data). In such an instance, the one or more locations may include a single location of the robotic lawnmower 105 as the target moves around the boundary during execution of blocks 705 and 710. On the other hand, in many instances, the robotic lawnmower 105 performs the "follow" action as the target moves around the boundary to ensure that useful data is captured that allows the multiple relative distances to be determined. In such an example, the one or more locations include a plurality of locations of the robotic lawnmower 105 as the target moves around the boundary during execution of blocks 705 and 710.
In some examples, the first electronic processor 205 is configured to determine the one or more positions of the robotic lawnmower 105 (at block 710) in a similar manner as described above with respect to block 505 of fig. 5. For example, the robotic lawnmower 105 determines its current position using the position signals received by its RTK GNSS receiver from the one or more satellites 150 and calibration information, received from the stationary base station device 145, regarding the position signals received by the base station device's RTK GNSS receiver. As another example, the robotic lawnmower 105 may determine its location based solely on the position signals received from the one or more satellites 150.
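Real RTK corrections operate on carrier-phase observables, which is beyond a short sketch, but the underlying differential idea (a base station at a known position sharing its measured error with the rover) can be caricatured as follows. Everything here, including the function name and the use of final position fixes, is a simplifying assumption for illustration only:

```python
def differential_correction(base_true_xy, base_measured_xy, rover_measured_xy):
    """Shift the rover's raw GNSS fix by the base station's known-minus-measured
    offset. (Greatly simplified: actual RTK resolves carrier-phase ambiguities
    rather than differencing finished position fixes.)"""
    dx = base_true_xy[0] - base_measured_xy[0]
    dy = base_true_xy[1] - base_measured_xy[1]
    return (rover_measured_xy[0] + dx, rover_measured_xy[1] + dy)
```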
As explained above with respect to block 505 of fig. 5, in some embodiments, the first electronic processor 205 may determine/track one or more locations of the robotic lawnmower 105 as the robotic lawnmower 105 moves in the work area 155 during creation of the virtual boundary 810. In some embodiments, the first electronic processor 205 may determine the current location of the robotic lawnmower 105 continuously or periodically (at predetermined time intervals, such as every 100 milliseconds, every 500 milliseconds, every second, etc.) and time stamp each of the one or more locations of the robotic lawnmower 105 with a respective time corresponding to the time the location was determined. One or more locations of robotic lawnmower 105 and corresponding timestamps may be saved in first memory 210 by first electronic processor 205 and/or may be transferred to another device (e.g., server 152, external device 115, etc.) for storage and/or use.
In some examples, the first electronic processor 205 may be configured to determine the current location of the robotic lawnmower 105 at approximately the same time as corresponding data that allows determining the relative distance to the target is captured. In other words, data relating to the relative distance between the robotic lawnmower 105 and the target may be captured at approximately the same time as the respective current position of the robotic lawnmower 105 is determined (e.g., starting at the same time and according to the same periodic time interval).
In some instances where the external device 115 is secured to the robotic lawnmower 105 and configured to capture images for determining the multiple relative distances, the robotic lawnmower 105 may still determine its own current location or locations at block 710 because the GPS receiver of the robotic lawnmower 105 (e.g., the RTK GNSS receiver) may be more accurate than the GPS receiver of the external device 115.
Blocks 705 and 710 may be repeated, as indicated by the dashed line returning from block 710 to block 705 in fig. 7, until the user 605 has captured data around the boundary of the work area 155 as desired. In some embodiments, the first electronic processor 205 receives user input from the user 605 via the first input device 220 indicating that all of the desired data has been captured (e.g., indicating that the user 605 has moved around the entire closed boundary). Such user input may additionally or alternatively be received by the second input device 320 of the external device 115 and transmitted to the robotic lawnmower 105. In some embodiments, the first electronic processor 205 may be configured to determine the starting point of the user 605 based on the respective position of the robotic lawnmower 105 and the respective relative distance between the robotic lawnmower 105 and the target when the user 605 begins to move along the boundary. In some examples, the first electronic processor 205 may determine that all desired data has been captured in response to determining that the target has moved back to near the starting point (e.g., closure of the boundary loop is detected) after moving approximately 360 degrees around the robotic lawnmower 105.
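One way to detect the loop closure mentioned above is to compare the target's latest estimated position with its starting point once enough ground has been covered. A sketch under assumed thresholds (neither radius comes from the source, and it simplifies the ~360 degree sweep check to a distance check over waypoints):

```python
import math

def boundary_loop_closed(waypoints, close_radius_m=1.0, min_travel_m=10.0):
    """Return True once the tracked target has returned near its starting point
    after covering enough ground to have plausibly walked the full boundary."""
    if len(waypoints) < 2:
        return False
    travelled = sum(math.dist(a, b) for a, b in zip(waypoints, waypoints[1:]))
    return travelled >= min_travel_m and math.dist(waypoints[0], waypoints[-1]) <= close_radius_m
```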
Once all desired data is captured (or while the data is being captured), at block 715, the first electronic processor 205 determines a respective position of the one or more locations of the robotic garden tool 105 at a respective time at which the data that allows determining each of the plurality of relative distances was captured. For example, using the respective timestamp of each piece of captured data and of each determined location of the robotic lawnmower 105, the first electronic processor 205 may associate each relative distance with a respective location of the robotic lawnmower 105 based on the two pieces of information having been determined using simultaneously captured/received data.
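The timestamp association at block 715 can be sketched as a nearest-timestamp join. The tolerance value and data layout are illustrative assumptions:

```python
def pair_by_timestamp(distance_samples, position_samples, tolerance_s=0.25):
    """Match each (time, relative_distance) sample to the mower position whose
    timestamp is closest, dropping pairs whose timestamps differ too much to
    count as 'captured at approximately the same time'."""
    pairs = []
    for t_d, distance_m in distance_samples:
        t_p, position = min(position_samples, key=lambda s: abs(s[0] - t_d))
        if abs(t_p - t_d) <= tolerance_s:
            pairs.append((distance_m, position))
    return pairs
```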
At block 720, the first electronic processor 205 generates the virtual boundary 810 using each of the plurality of relative distances in combination with the respective location of the robotic garden tool 105 at the respective time that allows the data determining each of the plurality of relative distances to be captured. For example, the first electronic processor 205 may be configured to determine a plurality of waypoints. Each of the plurality of waypoints may correspond to a respective location of the target as the target moves around the boundary. The first electronic processor 205 may determine the location of each waypoint using: (i) a respective relative distance of the plurality of relative distances; and (ii) the respective location of the robotic garden tool 105 at the respective time that allows the data determining the respective one of the plurality of relative distances to be captured. For example, using the position of the robotic lawnmower 105 determined at a given time and the relative distance between the robotic lawnmower 105 and the target as determined based on the captured data corresponding to the given time, the absolute position (e.g., waypoint) of the target at the given time may be determined. In some examples, the first electronic processor 205 is configured to generate the virtual boundary 810 using the waypoints in the same or similar manner as previously described herein with respect to block 525 of fig. 5.
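The waypoint computation at block 720 combines the mower's absolute position with a range/bearing observation of the target; a minimal planar sketch (the coordinate convention and function name are assumptions):

```python
import math

def target_waypoint(mower_xy, relative_distance_m, bearing_rad):
    """Absolute target position from the mower's absolute position plus the
    observed range and bearing (bearing measured from the +x axis of the same
    planar coordinate frame)."""
    x, y = mower_xy
    return (x + relative_distance_m * math.cos(bearing_rad),
            y + relative_distance_m * math.sin(bearing_rad))
```

Running this over every matched (distance, position, bearing) sample yields the waypoint list from which the virtual boundary 810 is drawn.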
At block 725, the first electronic processor 205 may control the robotic garden tool 105 to be constrained by the virtual boundary 810 to remain in the work area 155 during operation of the robotic garden tool 105, as previously explained herein.
Although blocks 705, 710, 715, and 720 are primarily described above as being performed by the first electronic processor 205, in some examples, the server 152, the electronic processors 205, 305, 405 of any device, or a combination thereof may perform one or a combination of blocks 705, 710, 715, and 720. In some instances, and in a similar manner as previously described herein with respect to fig. 5, information may be shared between devices of the communication system 100 to allow different devices to perform the steps of the method 700, such as determining one or more of a plurality of absolute positions of the target as the target moves around the boundary during the virtual boundary creation process. For example, the robotic lawnmower 105 may be configured to transmit, to a remote device (e.g., the external device 115, the server 152, etc.), the plurality of relative distances (or the captured raw data allowing the plurality of relative distances to be determined), the one or more locations of the robotic lawnmower 105 at the respective times at which each of the plurality of relative distances was determined, and a respective timestamp for each of the plurality of relative distances and the one or more locations of the robotic lawnmower 105. The remote device may be configured to generate the virtual boundary 810 (at block 720) using the plurality of relative distances, the one or more locations of the robotic lawnmower 105, and the respective timestamps for each of the plurality of relative distances and the one or more locations of the robotic lawnmower 105 in the same or similar manner as previously described herein with respect to block 525 of fig. 5. The robotic lawnmower 105 may be configured to receive the virtual boundary 810 from the remote device in the same or similar manner as previously described herein with respect to block 525 of fig. 5.
In some examples, the base station device 145 may perform at least some of blocks 705, 710, 715, and 720 in place of the robotic lawnmower 105. For example, the base station device 145 may be placed near the center of the work area 155 and may include a 360 degree camera to capture data indicating a relative distance to the target as the target moves around the boundary. In such embodiments, the robotic lawnmower 105 may serve as a stationary base station to provide positional calibration information to the base station device 145. Additionally or alternatively, a second base station device 145 may act as a stationary base station to provide location calibration information to the base station device 145 near the center of the work area 155. In instances where two base station devices 145 are provided, a securing device 905 may be used to secure one of the base station devices 145 to the robotic lawnmower 105, and that base station device 145 may perform a similar function as the secured external device 115 (e.g., a data capture device such as a camera or millimeter wave radar device may be used to capture data indicative of the relative distance to the target as the target moves around the boundary). As explained previously herein with respect to similar examples, when the base station device 145 is secured to the robotic lawnmower 105, the robotic lawnmower 105 may perform a "follow" action to allow the data capture device of the base station device 145 to continue capturing data about the target for determining the plurality of relative distances as the target moves around the boundary.
In any of the above embodiments with respect to fig. 7, the absolute position of the target as the target moves around the boundary during the virtual boundary creation process may be determined by the robotic lawnmower 105 or a device (e.g., external device 115) fixed to the robotic lawnmower 105, rather than by a device carried by the user 605 around the boundary. In embodiments in which the external device 115 is fixed to the robotic lawnmower 105, the second electronic processor 305 of the external device 115 may perform at least some of the determinations described above as being performed by the first electronic processor 205. The data collected by each device 105, 115 and the calculations performed by each electronic processor 205, 305 may be shared between the two devices 105, 115 to facilitate the generation of the virtual boundary 810.
In the same or similar manner as previously described herein with respect to fig. 5, in some embodiments, method 700 may be repeated to generate more than one virtual boundary and/or modify an existing virtual boundary.
Although the description of fig. 7 and 8 above primarily relates to using the robotic lawnmower 105 (and/or the external device 115 secured to the robotic lawnmower 105) to capture data for determining the waypoints used to generate a virtual boundary by tracking movement of a target of interest (e.g., the user 605) as it moves around the perimeter of the work area 155, in some examples, another device in the system 100 (i) captures data of the target of interest to allow determination of the waypoints, and/or (ii) holds the external device 115 to allow the external device 115 to capture data of the target of interest. For example, the docking station 110 may include a camera or may include a fixture (e.g., similar to the fixture 1205) to hold the external device 115. In such an instance, the docking station 110 may be placed at a corner of the work area 155, and the camera may be a wide angle camera or a 360 degree camera configured to capture images over a wide field of view to enable capture of the target of interest (e.g., the user 605) as the target of interest moves around the perimeter of the work area 155. In some examples, the external device 115 may instead be mounted to a stake in the ground or another object configured to hold the external device 115. In some examples, the external device 115 may be held in a stationary position by a first user and panned/turned to face a second user acting as the target of interest and moving around the perimeter of the work area 155. In instances where the external device 115 includes an RTK GNSS receiver, the absolute position of the external device 115 may be determined by the external device 115 itself. In instances where the external device 115 does not include an RTK GNSS receiver, the absolute position of the external device 115 may be determined by placing (at least temporarily) a device having an RTK GNSS receiver (e.g., the robotic lawnmower 105, the base station device 145, etc.) at the location of the external device 115 (e.g., at the docking station 110, at the location where the external device 115 is mounted or held by the first user, etc.). As indicated by the above examples, in some examples of the method of fig. 7, the robotic lawnmower 105 may not capture data of the target of interest to allow determination of the waypoints. Rather, the external device 115 may not be attached to the robotic lawnmower 105, but may instead capture such data from, for example, a stationary position. Apart from the differences described above, the method 700 may otherwise be performed in these alternative examples in a manner similar to that previously described herein.
In some examples, the external device 115 is configured to generate data that may be used to generate a virtual boundary without capturing data about a target of interest as the target moves around the perimeter of the work area 155. For example, the user 605 may use the external device 115 to capture data (e.g., video, images, etc.) of the perimeter of the work area 155. For example, the user 605 may stand at a corner of the work area 155 and pan/rotate the external device 115 to capture image/video data corresponding to a desired virtual boundary. The external device 115 may then display the captured image/video data to allow the user 605 to select waypoints and/or draw a boundary line on the displayed image/video. Using image/video analysis, the external device 115 (or another device that receives the image/video data and user input data from the external device 115) may determine the location of the user input (e.g., the selected waypoints and/or the drawn boundary line) relative to the field of view/perspective of the external device 115 that captured the image/video data. Then, based on the absolute position of the external device 115, the external device 115 (or another device that receives the image/video data and user input data from the external device 115) may determine the position/coordinates of the virtual boundary in a similar manner as previously described herein. As indicated with respect to other examples explained herein, in instances where the external device 115 includes an RTK GNSS receiver, the absolute position of the external device 115 may be determined by the external device 115 itself. In instances where the external device 115 does not include an RTK GNSS receiver, the absolute position of the external device 115 may be determined by placing (at least temporarily) a device having an RTK GNSS receiver (e.g., the robotic lawnmower 105, the base station device 145, etc.) at the position of the external device 115 (e.g., at the position where the external device 115 is held and panned/turned by the user 605).
The embodiments described above and illustrated in the figures are presented by way of example only and are not intended as a limitation upon the concepts and principles of the present invention. It will thus be appreciated that various changes in the elements and their configuration and arrangement are possible without departing from the spirit and scope of the present invention.

Claims (20)

1. A communication system, comprising:
a robotic garden tool, comprising:
a housing;
a set of wheels coupled to the housing and configured to rotate to propel the robotic garden tool over a work surface in a work area;
at least one wheel motor coupled to one or more wheels of the set of wheels, the at least one wheel motor configured to drive rotation of the one or more wheels; and
an electronic processor configured to:
determine a plurality of relative distances between the robotic garden tool and a target as the target moves in the work area,
determine one or more positions of the robotic garden tool as the target moves in the work area,
determine a respective position of the one or more positions of the robotic garden tool at a respective time that allows data determining each of the plurality of relative distances to be captured, wherein a virtual boundary is generated using each of the plurality of relative distances in combination with the respective position of the robotic garden tool at the respective time that allows the data determining each of the plurality of relative distances to be captured, and
control the robotic garden tool to be constrained by the virtual boundary to remain within the work area during operation of the robotic garden tool.
2. The communication system of claim 1, wherein the electronic processor is configured to determine a plurality of waypoints, wherein each waypoint of the plurality of waypoints is based on a respective relative distance of the plurality of relative distances and a respective position of the robotic garden tool at a respective time that allows data determining the respective relative distance of the plurality of relative distances to be captured; and
wherein the electronic processor is configured to generate the virtual boundary using the waypoints.
3. The communication system of claim 1, further comprising a network interface configured to allow the electronic processor to communicate with a base station device, wherein the base station device is configured to receive a position signal from a satellite and transmit calibration information regarding the position signal to the robotic garden tool; and
wherein the electronic processor is configured to:
receive the position signal from the satellite,
receive the calibration information from the base station device, and
determine the one or more positions of the robotic garden tool based on: (i) the position signal; and (ii) the calibration information.
4. The communication system of claim 3, wherein the electronic processor is configured to receive the position signal via a first real-time kinematic global navigation satellite system (RTK GNSS) receiver of the robotic garden tool;
wherein the electronic processor is configured to receive the calibration information via a first radio frequency transceiver of the robotic garden tool;
wherein the base station device is configured to receive the position signal via a second RTK GNSS receiver of the base station device; and
wherein the base station device is configured to transmit the calibration information via a second radio frequency transceiver of the base station device.
5. The communication system of claim 1, wherein the electronic processor is configured to control operation of the at least one wheel motor to control movement of the robotic garden tool such that the robotic garden tool moves toward the target as the target moves in the work area.
6. The communication system of claim 5, wherein the electronic processor is configured to control operation of the at least one wheel motor to control movement of the robotic garden tool such that the robotic garden tool moves toward the target as the target moves in the work area, by at least one of:
determining a Received Signal Strength Indication (RSSI) of a signal output by the target, and controlling movement of the robotic garden tool such that the RSSI of the signal output by the target is equal to or higher than a predetermined RSSI threshold;
determining that a relative distance of the plurality of relative distances is greater than or equal to a predetermined threshold, and controlling movement of the robotic garden tool to move toward the target until the relative distance between the robotic garden tool and the target decreases below the predetermined threshold; and
receiving a command from the target, wherein the command includes instructions on how to control the robotic garden tool to move the robotic garden tool toward the target.
7. The communication system of claim 1, wherein the electronic processor is configured to:
time stamp each of the plurality of relative distances with a respective time corresponding to the time at which the data that allows the determination of the relative distance was captured;
time stamp each of the one or more locations with a second respective time corresponding to the time at which the location was determined;
transmit the plurality of relative distances, the one or more locations, and the respective timestamps for each of the plurality of relative distances and the one or more locations to a remote device, wherein the remote device is configured to generate the virtual boundary using the plurality of relative distances, the one or more locations, and the respective timestamps for each of the plurality of relative distances and the one or more locations; and
receive the virtual boundary from the remote device.
8. The communication system of claim 1, wherein the electronic processor is configured to:
receive a plurality of images captured by a camera as the target moves in the work area, wherein each image of the plurality of images includes the target; and
determine each of the plurality of relative distances based on a position and orientation of the target in a respective image of the plurality of images.
9. The communication system of claim 8, wherein the camera is integrated into a housing of the robotic garden tool.
10. The communication system of claim 8, wherein the camera is integrated into an external device, and wherein the robotic garden tool comprises a securing device for securing the external device to the robotic garden tool; and
wherein the robotic garden tool is configured to receive the plurality of images from the external device.
11. The communication system of claim 8, wherein the target comprises a human user, and wherein the electronic processor is configured to identify the target within each of the plurality of images using an image analysis technique based on an expected shape of the human user.
12. The communication system of claim 8, wherein the target comprises a fiducial marker, and wherein the electronic processor is configured to identify the target within each of the plurality of images using an image analysis technique based on an expected design of the fiducial marker.
13. The communication system of claim 1, further comprising a millimeter wave radar device, wherein the electronic processor is configured to:
receive a plurality of data samples captured by the millimeter wave radar device as the target moves in the work area, wherein each of the plurality of data samples includes data indicative of a respective position of the target; and
determine each of the plurality of relative distances based on the respective position of the target in each of the plurality of data samples.
14. The communication system of claim 13, wherein the target comprises a human user, and wherein the electronic processor, the millimeter wave radar device, or both the electronic processor and the millimeter wave radar device are configured to identify the target within each of the plurality of data samples based on an expected shape of the human user.
15. The communication system of claim 1, further comprising a server device configured to receive the plurality of relative distances and respective locations of the robotic garden tool at respective times that allow data determining each of the plurality of relative distances to be captured;
wherein the server device is configured to generate the virtual boundary using each of the plurality of relative distances in combination with a respective location of the robotic garden tool at a respective time that allows data determining each of the plurality of relative distances to be captured.
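Claims 1 and 15 combine each relative distance with the robot's own location at the matching capture time to build the boundary. A minimal sketch of that combination, assuming each measurement also carries a bearing angle in a shared world frame (an assumption for illustration; the claims recite only distances and locations):

```python
import math

def boundary_points(samples):
    """Each sample: (robot_x, robot_y, distance_m, bearing_rad).

    Projects the target's measured position into the world frame,
    yielding one vertex of the virtual boundary per sample."""
    points = []
    for rx, ry, dist, bearing in samples:
        points.append((rx + dist * math.cos(bearing),
                       ry + dist * math.sin(bearing)))
    return points

# Robot at the origin, target 3 m due east -> boundary vertex at (3, 0).
print(boundary_points([(0.0, 0.0, 3.0, 0.0)]))
```

Stringing these vertices together in capture order gives a closed polygon once the target returns to its starting point, which is one plausible reading of how the walked path becomes a boundary.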
16. A method of creating a virtual boundary, the method comprising:
determining, with an electronic processor of a robotic garden tool, a plurality of relative distances between the robotic garden tool and a target as the target moves in a work area, the robotic garden tool comprising:
a housing;
a set of wheels coupled to the housing and configured to rotate to propel the robotic garden tool over a work surface in the work area; and
at least one wheel motor coupled to one or more wheels of the set of wheels, the at least one wheel motor configured to drive rotation of the one or more wheels;
determining, with the electronic processor, one or more positions of the robotic garden tool as the target moves in the work area;
determining, with the electronic processor, a respective location of the one or more locations of the robotic garden tool at a respective time that allows data determining each of the plurality of relative distances to be captured;
generating the virtual boundary using each of the plurality of relative distances in combination with a respective location of the robotic garden tool at a respective time that allows data determining each of the plurality of relative distances to be captured; and
controlling, with the electronic processor, the robotic garden tool to remain within the work area during operation of the robotic garden tool, as constrained by the virtual boundary.
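The final step of claim 16 constrains the tool to stay inside the generated boundary. One common way to implement such a containment check (a hedged sketch, not the claimed control method) is a ray-casting point-in-polygon test against the boundary vertices:

```python
def inside_boundary(point, polygon):
    """Ray-casting test: count crossings of a ray from `point` toward +x
    against each polygon edge; an odd crossing count means inside."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the horizontal line at y
            # x-coordinate where the edge crosses that horizontal line
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
print(inside_boundary((2, 2), square))   # True: within the work area
print(inside_boundary((5, 2), square))   # False: outside the boundary
```

A mower's control loop could run this test on its projected next position and steer away whenever the result flips to outside.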
17. The method of claim 16, further comprising:
receiving, with the electronic processor, a plurality of images captured by a camera as the target moves in the work area, wherein each image of the plurality of images includes the target; and
determining, with the electronic processor, each of the plurality of relative distances based on: (i) the position of the target in a respective image of the plurality of images; and (ii) a respective location of the robotic garden tool at a respective time of capturing the respective image.
18. The method of claim 17, wherein the target comprises a fiducial marker, and the method further comprises identifying, with the electronic processor, the target within each of the plurality of images using an image analysis technique based on an expected design of the fiducial marker.
19. The method of claim 16, further comprising:
receiving, with the electronic processor, a plurality of data samples captured by a millimeter wave radar device as the target moves in the work area, wherein each of the plurality of data samples includes data indicative of a respective location of the target; and
determining, with the electronic processor, each of the plurality of relative distances based on: (i) a respective location of the target in each of the plurality of data samples; and (ii) a respective location of the robotic garden tool at a respective time of capturing each data sample, wherein the target comprises a human user; and
identifying, with the electronic processor, the millimeter wave radar device, or both the electronic processor and the millimeter wave radar device, the target within each of the plurality of data samples based on an expected shape of the human user.
20. The method of claim 16, wherein generating the virtual boundary comprises generating the virtual boundary with a server device remote from the robotic garden tool.
CN202310444612.9A 2022-04-28 2023-04-23 Creating virtual boundaries for robotic garden tools Pending CN116974275A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US63/335,944 2022-04-28
US202263370628P 2022-08-05 2022-08-05
US63/370,628 2022-08-05

Publications (1)

Publication Number Publication Date
CN116974275A true CN116974275A (en) 2023-10-31

Family

ID=88482105

Family Applications (3)

Application Number Title Priority Date Filing Date
CN202310444612.9A Pending CN116974275A (en) 2022-04-28 2023-04-23 Creating virtual boundaries for robotic garden tools
CN202310789641.9A Pending CN117311340A (en) 2022-06-29 2023-06-29 Controlling movement of robotic garden tool relative to one or more detected objects
CN202310798029.8A Pending CN117311341A (en) 2022-06-29 2023-06-29 Controlling movement of robotic garden tool relative to one or more detected targets

Family Applications After (2)

Application Number Title Priority Date Filing Date
CN202310789641.9A Pending CN117311340A (en) 2022-06-29 2023-06-29 Controlling movement of robotic garden tool relative to one or more detected objects
CN202310798029.8A Pending CN117311341A (en) 2022-06-29 2023-06-29 Controlling movement of robotic garden tool relative to one or more detected targets

Country Status (1)

Country Link
CN (3) CN116974275A (en)

Also Published As

Publication number Publication date
CN117311341A (en) 2023-12-29
CN117311340A (en) 2023-12-29

Similar Documents

Publication Publication Date Title
US11845189B2 (en) Domestic robotic system and method
CN106662452B (en) Map construction for mowing robot
US9603300B2 (en) Autonomous gardening vehicle with camera
EP3234717B1 (en) Robot vehicle parcel navigation following a minimum workload path.
CN109874487A (en) A kind of autonomous type grass trimmer and its navigation system
US20210364632A1 (en) Methods and Systems for Map Creation and Calibration of Localization Equipment in an Outdoor Environment
Einecke et al. Boundary wire mapping on autonomous lawn mowers
EP4270138A1 (en) Creation of a virtual boundary for a robotic garden tool
CN116974275A (en) Creating virtual boundaries for robotic garden tools
EP3761136B1 (en) Control device, mobile body, and program
EP4375710A1 (en) Determining a location to place a base station device used by a robotic garden tool
EP4332716A2 (en) Mapping objects encountered by a robotic garden tool
EP4270137A1 (en) Creation of a virtual boundary for a robotic garden tool
US20240000018A1 (en) Controlling movement of a robotic garden tool with respect to one or more detected objects
EP4332711A1 (en) Creation of a virtual boundary for a robotic garden tool
EP4356709A1 (en) Liquid detection sensor for a robotic garden tool
EP4295660A1 (en) Controlling movement of a robotic garden tool for docking purposes
CN116508478A (en) Robot gardening tool

Legal Events

Date Code Title Description
PB01 Publication