WO2016097897A1 - Robotic patrol vehicle - Google Patents

Robotic patrol vehicle

Info

Publication number
WO2016097897A1
Authority
WO
WIPO (PCT)
Prior art keywords
robotic vehicle
monitoring
trigger event
instructions
robotic
Prior art date
Application number
PCT/IB2015/058669
Other languages
French (fr)
Inventor
Magnus ÖHRLUND
Stefan GRUFMAN
Björn MANNEFRED
Tom SÖBERG
Original Assignee
Husqvarna Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Husqvarna Ab filed Critical Husqvarna Ab
Publication of WO2016097897A1 publication Critical patent/WO2016097897A1/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/30: Monitoring
    • G06F 11/3058: Monitoring arrangements for monitoring environmental properties or parameters of the computing system or of the computing system component, e.g. monitoring of power, currents, temperature, humidity, position, vibrations
    • A: HUMAN NECESSITIES
    • A01: AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01D: HARVESTING; MOWING
    • A01D 34/00: Mowers; Mowing apparatus of harvesters
    • A01D 34/006: Control or measuring arrangements
    • A01D 34/008: Control or measuring arrangements for automated or remotely controlled operation
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/02: Control of position or course in two dimensions
    • G05D 1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0259: Control of position or course in two dimensions specially adapted to land vehicles using magnetic or electromagnetic means
    • G05D 1/0265: Control of position or course in two dimensions specially adapted to land vehicles using magnetic or electromagnetic means using buried wires

Definitions

  • Example embodiments generally relate to robotic vehicles and, more particularly, relate to a robotic vehicle that is configurable to patrol an area and monitor activity within the area.
  • Yard maintenance tasks are commonly performed using various tools and/or machines that are configured for the performance of corresponding specific tasks. Certain tasks, like grass cutting, are typically performed by lawn mowers. Lawn mowers themselves may have many different configurations to support the needs and budgets of consumers. Walk-behind lawn mowers are typically compact, have comparatively small engines and are relatively inexpensive. Meanwhile, at the other end of the spectrum, riding lawn mowers, such as lawn tractors, can be quite large. More recently, robotic mowers and/or remote controlled mowers have also become options for consumers to consider.
  • Robotic mowers are typically confined to operating on a parcel of land that is bounded by some form of boundary wire.
  • the robotic mower is capable of detecting the boundary wire and operating relatively autonomously within the area defined by the boundary wire.
  • the laying of the boundary wire can be a time consuming and difficult task, which operators would prefer to avoid, if possible. That said, to date it has been difficult to try to provide a robotic mower that can truly operate without any need for a boundary wire. Limitations on the accuracy of positioning equipment have played a large role in making this problem difficult to solve.
  • Some example embodiments may therefore provide a robotic vehicle that is configured to incorporate multiple sensors to monitor its environment while on patrol on a particular parcel.
  • the robotic vehicle may be further capable of communicating alarm conditions and/or sharing or storing content generated while on patrol.
  • Some example embodiments may allow the robotic vehicle to perform surveillance functions in addition to typical yard maintenance functions.
  • FIG. 1 illustrates an example operating environment for a robotic mower in accordance with an example embodiment
  • FIG. 2 illustrates a block diagram of various components of control circuitry to illustrate some of the components that enable or enhance the functional performance of the robotic mower and to facilitate description of an example embodiment
  • FIG. 3 illustrates a block diagram of some network components that may be employed as part of a communication network for patrolling a parcel in accordance with an example embodiment
  • FIG. 4 illustrates a control flow diagram showing various operations that may be executed in accordance with an example embodiment
  • FIG. 5 illustrates a block diagram of one example of a method of monitoring or patrolling a parcel in accordance with an example embodiment
  • FIG. 6 illustrates an example operating environment for a robotic mower
  • FIG. 7 illustrates a block diagram of various components of control circuitry to illustrate some of the components that enable or enhance the functional performance of the robotic mower and to facilitate description of an example embodiment
  • FIG. 8 illustrates a block diagram of some network components that may be employed as part of a communication network for broadcasting messages in accordance with an example embodiment
  • FIG. 9 illustrates a control flow diagram showing various operations that may be executed in accordance with an example embodiment
  • FIG. 10 illustrates a block diagram of one example of a method of preparing broadcast messages in accordance with an example embodiment.
  • Robotic mowers, which are one example of a robotic vehicle of an example embodiment, typically mow an area that is defined by a boundary wire that bounds the area to be mowed. The robotic mower then roams within the bounded area to ensure that the entire area is mowed, but the robotic mower does not go outside of the bounded area. The robotic mower typically mows the area in a relatively continuous fashion, pausing only to recharge batteries or when otherwise directed to pause by the operator. Thus, in some ways, the robotic mower may have a near continuous presence in the yard or garden in which it operates.
  • By placing a number of sensors on the robotic vehicle, the robotic vehicle becomes uniquely capable of performing both its typical work function and surveillance functions. In essence, the robotic vehicle can become a yard or garden sentinel that is ever-alert to situations and events transpiring in and around the yard or garden. Moreover, when fitted with a camera, the robotic vehicle is capable of generating content regarding the yard or garden it is operating within, and such content may be useful for detecting certain activities or aiding investigations into the events surrounding certain occurrences in and around the yard or garden.
  • Example embodiments are therefore described herein to provide various structural and control-related design features that can be employed to improve the capabilities of robotic vehicles (e.g., robotic mowers, mobile sensing devices, watering devices and/or the like) with respect to monitoring and/or recording content related to the encounters and experiences associated with the performance of yard maintenance activities.
  • FIG. 1 illustrates an example operating environment for a robotic mower 10 that may be employed in connection with an example embodiment.
  • the robotic mower 10 may operate to cut grass on a parcel 20 (i.e., a land lot, yard, or garden), the boundary 30 of which may be defined using one or more physical boundaries (e.g., a fence, wall, curb and/or the like), or programmed location based boundaries or combinations thereof.
  • When the boundary 30 is detected, by any suitable means, the robotic mower 10 may be informed so that it can operate in a manner that prevents the robotic mower 10 from leaving or moving outside the boundary 30.
  • the boundary 30 could be provided by a wire that is detectable by the robotic mower 10.
  • the robotic mower 10 may be controlled, at least in part, via control circuitry 12 located onboard.
  • the control circuitry 12 may include, among other things, a positioning module and a sensor suite, which will be described in greater detail below. Accordingly, the robotic mower 10 may utilize the control circuitry 12 to define a path for coverage of the parcel 20 in terms of performing a task over specified portions or the entire parcel 20.
  • the positioning module may be used to guide the robotic mower 10 over the parcel 20 and to ensure that full coverage (of at least predetermined portions of the parcel 20) is obtained, while the sensor suite may detect objects and/or gather data regarding the surroundings of the robotic mower 10 while the parcel 20 is traversed.
  • the sensor suite may include sensors related to positional determination (e.g., a GPS receiver, an accelerometer, a camera, a radar transmitter/detector, an ultrasonic sensor, a laser scanner and/or the like).
  • positional determinations may be made using GPS, inertial navigation, optical flow, radio navigation, visual location (e.g., VSLAM) and/or other positioning techniques or combinations thereof.
  • the sensors may be used, at least in part, for determining the location of the robotic mower 10 relative to boundaries or other points of interest (e.g., a starting point or other key features) of the parcel 20, or determining a position history or track of the robotic mower 10 over time.
  • the sensors may also detect collision, tipping over, or various fault conditions.
  • the sensors may also or alternatively collect data regarding various measurable parameters (e.g., moisture, temperature, soil conditions, etc.) associated with particular locations on the parcel 20.
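The location-tagged measurements described above can be pictured as a simple data structure. This is only an illustrative sketch: the field names, units, and coordinate convention are assumptions, not anything specified by the patent.

```python
from dataclasses import dataclass, field
import time

@dataclass
class SensorReading:
    """One measurement taken at a specific spot on the parcel (illustrative fields)."""
    x: float              # position relative to an assumed reference point, in metres
    y: float
    moisture: float       # relative soil moisture, 0.0-1.0
    temperature_c: float
    timestamp: float = field(default_factory=time.time)

# A simple in-memory log of location-tagged measurements
readings = [SensorReading(x=12.5, y=4.0, moisture=0.31, temperature_c=18.2)]

# Stored readings can later be queried by area, e.g. everything within
# 2 metres of a point of interest on the parcel
near_bed = [r for r in readings
            if abs(r.x - 12.0) <= 2.0 and abs(r.y - 4.0) <= 2.0]
```

Associating each reading with a position in this way is what allows per-location maps of moisture, temperature, or soil condition to be built up over repeated traversals.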
  • the robotic mower 10 may be battery powered via one or more rechargeable batteries. Accordingly, the robotic mower 10 may be configured to return to a charge station 40 that may be located at some position on the parcel 20 in order to recharge the batteries.
  • the batteries may power a drive system and a blade control system of the robotic mower 10.
  • the control circuitry 12 of the robotic mower 10 may selectively control the application of power or other control signals to the drive system and/or the blade control system to direct the operation of the drive system and/or blade control system. Accordingly, movement of the robotic mower 10 over the parcel 20 may be controlled by the control circuitry 12 in a manner that enables the robotic mower 10 to systematically traverse the parcel while operating a cutting blade to cut the grass on the parcel 20.
  • the control circuitry 12 may be configured to control another functional or working assembly that may replace the blade control system and blades.
  • control circuitry 12 and/or a communication node at the charge station 40 may be configured to communicate wirelessly with an electronic device 42 (e.g., a personal computer, a cloud based computer, server, mobile telephone, PDA, tablet, smart phone, and/or the like) of a remote operator 44 (or user) via wireless links 46 associated with a wireless communication network 48.
  • the wireless communication network 48 may provide operable coupling between the remote operator 44 and the robotic mower 10 via the electronic device 42, which may act as a remote control device for the robotic mower 10 or may receive data indicative or related to the operation of the robotic mower 10.
  • the wireless communication network 48 may include additional or internal components that facilitate the communication links and protocols employed.
  • some portions of the wireless communication network 48 may employ additional components and connections that may be wired and/or wireless.
  • the charge station 40 may have a wired connection to a computer or server that is connected to the wireless communication network 48, which may then wirelessly connect to the electronic device 42.
  • the robotic mower 10 may wirelessly connect to the wireless communication network 48 (directly or indirectly) and a wired connection may be established between one or more servers of the wireless communication network 48 and a PC of the remote operator 44.
  • the wireless communication network 48 may be a data network, such as a local area network (LAN), a metropolitan area network (MAN), a wide area network (WAN) (e.g., the Internet), and/or the like, which may couple the robotic mower 10 to devices such as processing elements (e.g., personal computers, server computers or the like) or databases. Accordingly, communication between the wireless communication network 48 and the devices or databases (e.g., servers, electronic device 42, control circuitry 12) may be accomplished by either wireline or wireless communication mechanisms and corresponding protocols.
  • FIG. 2 illustrates a block diagram of various components of the control circuitry 12 to illustrate some of the components that enable or enhance the functional performance of the robotic mower 10 and to facilitate description of an example embodiment.
  • the control circuitry 12 may include or otherwise be in communication with a vehicle positioning module 60, a camera 70, and a monitoring module 80.
  • the vehicle positioning module 60, the camera 70, and the monitoring module 80 may work together to give the robotic mower 10 a comprehensive understanding of its environment, and enable it to be operated autonomously without boundary wires (or within such wires where they are employed).
  • any or all of the vehicle positioning module 60, the camera 70, and the monitoring module 80 may be part of a sensor network 90 of the robotic mower 10. However, in some cases, any or all of the vehicle positioning module 60, the camera 70, and the monitoring module 80 may be separate from but otherwise in communication with the sensor network 90 to facilitate operation of each respective module.
  • the camera 70 may include an electronic image sensor configured to store captured image data (e.g., in memory 114). Image data recorded by the camera 70 may be in the visible light spectrum or in other portions of the electromagnetic spectrum (e.g., IR camera). In some cases, the camera 70 may actually include multiple sensors configured to capture data in different types of images (e.g., RGB and IR sensors). The camera 70 may be configured to capture still images and/or video data.
  • the robotic mower 10 may also include one or more functional components 100 that may be controlled by the control circuitry 12 or otherwise be operated in connection with the operation of the robotic mower 10.
  • the functional components 100 may include a wheel assembly (or other mobility assembly components), one or more cutting blades and corresponding blade control components, and/or other such devices.
  • the functional components 100 may include equipment for performing various lawn care functions such as, for example, taking soil samples, operating valves, distributing water, seed, powder, pellets or chemicals, and/or other functional devices and/or components.
  • the control circuitry 12 may include processing circuitry 110 that may be configured to perform data processing or control function execution and/or other processing and management services according to an example embodiment of the present invention.
  • the processing circuitry 110 may be embodied as a chip or chip set.
  • the processing circuitry 110 may comprise one or more physical packages (e.g., chips) including materials, components and/or wires on a structural assembly (e.g., a baseboard).
  • the structural assembly may provide physical strength, conservation of size, and/or limitation of electrical interaction for component circuitry included thereon.
  • the processing circuitry 110 may therefore, in some cases, be configured to implement an embodiment of the present invention on a single chip or as a single "system on a chip." As such, in some cases, a chip or chipset may constitute means for performing one or more operations for providing the functionalities described herein.
  • the processing circuitry 110 may include one or more instances of a processor 112 and memory 114 that may be in communication with or otherwise control a device interface 120 and, in some cases, a user interface 130.
  • the processing circuitry 110 may be embodied as a circuit chip (e.g., an integrated circuit chip) configured (e.g., with hardware, software or a combination of hardware and software) to perform operations described herein.
  • the processing circuitry 110 may be embodied as a portion of an on-board computer.
  • the processing circuitry 110 may communicate with electronic components and/or sensors of the robotic mower 10 via a single data bus. As such, the data bus may connect to a plurality or all of the switching components, sensory components and/or other electrically controlled components of the robotic mower 10.
  • the processor 112 may be embodied in a number of different ways.
  • the processor 112 may be embodied as various processing means such as one or more of a microprocessor or other processing element, a coprocessor, a controller or various other computing or processing devices including integrated circuits such as, for example, an ASIC (application specific integrated circuit), an FPGA (field programmable gate array), or the like.
  • the processor 112 may be configured to execute instructions stored in the memory 114 or otherwise accessible to the processor 112.
  • the processor 112 may represent an entity (e.g., physically embodied in circuitry - in the form of processing circuitry 110) capable of performing operations according to embodiments of the present invention while configured accordingly.
  • the processor 112 when the processor 112 is embodied as an ASIC, FPGA or the like, the processor 112 may be specifically configured hardware for conducting the operations described herein.
  • the processor 112 when the processor 112 is embodied as an executor of software instructions, the instructions may specifically configure the processor 112 to perform the operations described herein.
  • the processor 112 may be embodied as, include or otherwise control the vehicle positioning module 60, the camera 70, and the monitoring module 80.
  • the processor 112 may be said to cause each of the operations described in connection with the vehicle positioning module 60, the camera 70, and the monitoring module 80 by directing the vehicle positioning module 60, the camera 70, and the monitoring module 80, respectively, to undertake the corresponding functionalities responsive to execution of instructions or algorithms configuring the processor 112 (or processing circuitry 110) accordingly.
  • These instructions or algorithms may configure the processing circuitry 110, and thereby also the robotic mower 10, into a tool for driving the corresponding physical components for performing corresponding functions in the physical world in accordance with the instructions provided.
  • the memory 114 may include one or more non-transitory memory devices such as, for example, volatile and/or non-volatile memory that may be either fixed or removable.
  • the memory 114 may be configured to store information, data, applications, instructions or the like for enabling the vehicle positioning module 60 and the camera 70 to carry out various functions in accordance with exemplary embodiments of the present invention.
  • the memory 114 could be configured to buffer input data for processing by the processor 112.
  • the memory 114 could be configured to store instructions for execution by the processor 112.
  • the memory 114 may include one or more databases that may store a variety of data sets responsive to input from various sensors or components of the robotic mower 10.
  • applications may be stored for execution by the processor 112 in order to carry out the functionality associated with each respective application.
  • the applications may include applications for controlling the robotic mower 10 relative to various operations including determining an accurate position of the robotic mower 10 (e.g., using one or more sensors of the vehicle positioning module 60). Alternatively or additionally, the applications may include applications for controlling the robotic mower 10 relative to various operations including determining the existence and/or position of obstacles (e.g., static or dynamic) and borders relative to which the robotic mower 10 must navigate. Alternatively or additionally, the applications may include applications for controlling the robotic mower 10 relative to various operations to be executed on the parcel 20. Alternatively or additionally, the applications may include applications for controlling the camera 70 and/or processing image data gathered by the camera 70 to execute or facilitate execution of other applications that drive or enhance operation of the robotic mower 10 relative to various activities described herein.
  • the applications may include instructions for gathering images/video or other visual content gathered by the sensor network 90 (and/or the camera 70) for analysis (e.g., by the monitoring module 80) in connection with the performance of monitoring functions described herein.
  • the applications may include instructions for analyzing visual content for the generation of alarms, messages, and/or the like (e.g., via a network such as the wireless communication network 48).
  • the user interface 130 may be in communication with the processing circuitry 110 to receive an indication of a user input at the user interface 130 and/or to provide an audible, visual, mechanical or other output to the user.
  • the user interface 130 may include, for example, a display, one or more buttons or keys (e.g., function buttons), and/or other input/output mechanisms (e.g., microphone, speakers, cursor, joystick, lights and/or the like).
  • the device interface 120 may include one or more interface mechanisms for enabling communication with other devices either locally or remotely.
  • the device interface 120 may be any means such as a device or circuitry embodied in either hardware, or a combination of hardware and software that is configured to receive and/or transmit data from/to sensors or other components in communication with the processing circuitry 110.
  • the device interface 120 may provide interfaces for communication of data to/from the control circuitry 12, the vehicle positioning module 60, the camera 70, the monitoring module 80, the sensor network 90, and/or other functional components 100 via wired or wireless communication interfaces, whether in a real-time manner, as a data package downloaded after data gathering, or in one or more burst transmissions of any kind.
  • Each of the vehicle positioning module 60, the camera 70, and the monitoring module 80 may be any means such as a device or circuitry embodied in either hardware, or a combination of hardware and software, that is configured to perform the corresponding functions described herein.
  • the modules may include hardware and/or instructions for execution on hardware (e.g., embedded processing circuitry) that is part of the control circuitry 12 of the robotic mower 10.
  • the modules may share some parts of the hardware and/or instructions that form each module, or they may be distinctly formed. As such, the modules and components thereof are not necessarily intended to be mutually exclusive relative to each other from a compositional perspective.
  • the vehicle positioning module 60 may be configured to utilize one or more sensors (e.g., of the sensor network 90) to determine a location of the robotic mower 10 and direct continued motion of the robotic mower 10 to achieve appropriate coverage of the parcel 20.
  • the robotic mower 10 (or more specifically, the control circuitry 12) may use the location information to determine a mower track and/or provide full coverage of the parcel 20 to ensure the entire parcel is mowed (or otherwise serviced).
  • the vehicle positioning module 60 may therefore be configured to direct movement of the robotic mower 10, including the speed and direction of the robotic mower 10.
  • the vehicle positioning module 60 may also employ such sensors to attempt to determine an accurate current location of the robotic mower 10 on the parcel 20 (or generally).
  • Various sensors of sensor network 90 of the robotic mower 10 may be included as a portion of, or otherwise communicate with, the vehicle positioning module 60 to, for example, determine vehicle speed/direction, vehicle location, vehicle orientation and/or the like. Sensors may also be used to determine motor run time, machine work time, and other operational parameters.
  • the sensor network 90 may provide data to the modules described above to facilitate execution of the functions described above, and/or any other functions that the modules may be configurable to perform.
  • the sensor network 90 may include (perhaps among other things) an inertial measurement unit (IMU) 150 and a GPS receiver 152.
  • Other possible sensors may include (but are not limited to) a temperature sensor, an object detector (e.g., time-of-flight ranging devices), a humidity sensor, a barometer, a rain gauge, a grass sensor, and/or the like.
  • the sensor network 90 may include independent devices with on-board processing that communicate with the processing circuitry 110 of the control circuitry 12 via a single data bus, or via individual communication ports.
  • one or more of the devices of the sensor network 90 may rely on the processing power of the processing circuitry 110 of the control circuitry 12 for the performance of their respective functions.
  • one or more of the sensors of the sensor network 90 (or portions thereof) may be embodied as portions of the positioning module 60, the camera 70 and/or the monitoring module 80.
  • the IMU 150 may include any combination of one or more accelerometers, odometers, gyroscopes, magnetometers, compasses, and/or the like. As such, the IMU 150 may be configured to determine velocity, direction, orientation and/or the like so that dead reckoning and/or other inertial navigation determinations can be made by the control circuitry 12. The IMU 150 may be enabled to determine changes in pitch, roll and yaw to further facilitate determining terrain features and/or the like.
  • inertial navigation systems may suffer from integration drift over time. Accordingly, inertial navigation systems may require a periodic position correction, which may be accomplished by getting a position fix from another more accurate method or by fixing a position of the robotic mower 10 relative to a known location. For example, navigation conducted via the IMU 150 may be used for robotic mower 10 operation for a period of time, and then a correction may be inserted when a GPS fix is obtained on robotic mower position. As an example alternative, the IMU 150 determined position may be updated every time the robotic mower 10 returns to the charge station 40 (which may be assumed to be at a fixed location). In still other examples, known reference points may be disposed at one or more locations on the parcel 20 and the robotic mower 10 may get a fix relative to any of such known reference points when the opportunity presents itself. The IMU 150 determined position may then be updated with the more accurate fix information.
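The drift-then-correct cycle described above can be sketched minimally, assuming a flat two-dimensional parcel and idealized speed/heading inputs. The class name, numbers, and the decision to replace the estimate wholesale on correction (rather than blend it, as a real filter would) are all illustrative assumptions.

```python
import math

class DeadReckoner:
    """Tracks position by integrating speed/heading estimates, with an
    explicit correction step for when an accurate fix becomes available."""

    def __init__(self, x=0.0, y=0.0):
        self.x, self.y = x, y

    def step(self, speed, heading_rad, dt):
        # Integrate motion; small errors here accumulate as drift over time.
        self.x += speed * math.cos(heading_rad) * dt
        self.y += speed * math.sin(heading_rad) * dt

    def correct(self, fix_x, fix_y):
        # Replace the drifted estimate with an accurate fix, e.g. an RTK-GPS
        # position, a known reference point, or the charge station's location.
        self.x, self.y = fix_x, fix_y

nav = DeadReckoner()
for _ in range(100):                      # drive "north" for 100 short steps
    nav.step(speed=0.5, heading_rad=math.pi / 2, dt=0.1)
# the estimate has drifted slightly; docking at the charge station resets it
nav.correct(0.0, 5.0)
```

In this scheme the IMU-driven `step` runs continuously, while `correct` runs only at the opportunistic moments the text describes: a GPS fix, a pass over a known reference point, or a return to the charge station.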
  • the GPS receiver 152 may be embodied as a real time kinematic (RTK) - GPS receiver.
  • the GPS receiver 152 may employ satellite based positioning in conjunction with GPS, GLONASS, Galileo, GNSS, and/or the like to enhance accuracy of the GPS receiver 152.
  • carrier-phase enhancement may be employed such that, for example, in addition to the information content of signals received, the phase of the carrier wave may be examined to provide real-time corrections that can enhance accuracy.
  • the monitoring module 80 may be configured to receive position information from the positioning module 60 and image (or video) data from the camera 70. In some cases, the monitoring module 80 may also receive data and/or information from other sensors of the sensor network 90. The monitoring module 80 may be further configured to examine and/or analyze the data/information received, which may generally be referred to as monitoring data for the occurrence of trigger events. The monitoring module 80 may also or alternatively provide for storage of the monitoring data so that the monitoring data can be analyzed and/or retrieved for analysis at a later time, if needed or desired. In some embodiments, the storage of monitoring data may be provided by memory 114. However, remote storage is also possible after communication of the monitoring data to the wireless communication network 48.
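One way to picture the monitoring module's receive-store-analyze cycle is as a small dispatcher that logs every sample and runs each one past a set of trigger checks. The sample format and check function below are assumptions for illustration only.

```python
class MonitoringModule:
    """Collects monitoring data and checks each sample against trigger checks."""

    def __init__(self, checks):
        self.checks = checks      # list of callables: sample -> bool
        self.log = []             # stored monitoring data (stand-in for memory 114)

    def ingest(self, sample):
        self.log.append(sample)   # always store, so data can be analyzed later
        # Return the names of any checks that fired for this sample
        return [c.__name__ for c in self.checks if c(sample)]

def unknown_object(sample):
    # Example trigger check: an object the classifier could not match
    return sample.get("object") == "unknown"

monitor = MonitoringModule(checks=[unknown_object])
fired = monitor.ingest({"pos": (3.0, 7.5), "object": "unknown"})
```

Storing every sample regardless of whether a check fires mirrors the text's point that monitoring data may also be retrieved for analysis at a later time, locally or after upload to remote storage.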
  • the monitoring module 80 may be configured in accordance with monitoring instructions.
  • the monitoring instructions may define specific activities, patterns, events and/or the like that qualify as trigger events.
  • the monitoring instructions may be factory pre-set.
  • the operator may change, add or otherwise modify the monitoring instructions either locally at the robotic mower 10 or via a remote interface (e.g., using the electronic device 42).
  • the monitoring module 80 may store images of previously encountered objects or other objects that have been learned or identified as known objects.
  • When the camera 70 obtains a new image of an object, the new image can be compared to the stored images to see if a match can be located. If a match is located, the new image may be classified as the known object. However, the detection of an unknown object may qualify as a trigger event. Thus, for example, if no match can be obtained for an object in an image (i.e., an unknown object), the object may be classified as unknown and the failure to locate a match may be a trigger event.
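The match-or-trigger logic can be sketched with a deliberately naive comparison. A real system would use robust image features rather than raw pixels; the tiny grids, tolerance, and labels below are illustrative assumptions.

```python
def matches(new_img, known_img, tol=10):
    # Naive pixel-by-pixel comparison on same-size grayscale grids.
    if len(new_img) != len(known_img):
        return False
    return all(abs(a - b) <= tol
               for row_a, row_b in zip(new_img, known_img)
               for a, b in zip(row_a, row_b))

def classify(new_img, known_objects):
    # Return the label of the first stored image that matches; None means
    # an unknown object, which may itself qualify as a trigger event.
    for label, stored in known_objects.items():
        if matches(new_img, stored):
            return label
    return None

known = {"birdbath": [[100, 102], [98, 101]]}       # tiny stand-in "images"
label = classify([[101, 103], [99, 100]], known)    # close enough: known object
unknown = classify([[10, 10], [10, 10]], known)     # no match: unknown object
```

The important shape is the asymmetry: a match quietly classifies the object, while a `None` result is itself the event worth reporting.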
  • detection of dynamic (rather than fixed or static) objects may be a trigger event.
  • in such cases, the detection may be classified as a trigger event.
  • detection of objects at certain times or via certain sensors may be trigger events.
  • dynamic objects may be classified as or otherwise correlate to living objects, and the detection of living objects may be a trigger event.
  • detection of static objects (e.g., non-living objects, permanent objects, and/or the like) may be treated differently from detection of dynamic objects.
  • detection and classification of living and non-living objects could also be accomplished using the camera 70 to find matching images.
  • detections associated with certain times and/or certain locations may be trigger events. For example, detecting an object proximate to the house or windows of the house, or by a back door, at night or other specified times may be a trigger event. Still other combinations of detections relating to specific object locations, times, sensors, or other factors may be separately designated as trigger events by the monitoring instructions.
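Such combined time-and-location rules can be expressed as simple predicate functions over a detection. The coordinates, radius, and night window below are illustrative assumptions, not values from the patent.

```python
from datetime import time

BACK_DOOR = (1.0, 2.0)        # hypothetical back-door position on the parcel

def near(pos, target, radius=3.0):
    # Euclidean distance check against a point of interest
    return ((pos[0] - target[0]) ** 2 + (pos[1] - target[1]) ** 2) ** 0.5 <= radius

def back_door_at_night(detection):
    # Trigger only when an object is detected near the back door at night
    t = detection["time"]
    is_night = t >= time(22, 0) or t <= time(5, 0)
    return is_night and near(detection["pos"], BACK_DOOR)

night_hit = back_door_at_night({"time": time(23, 30), "pos": (1.5, 2.5)})
day_miss = back_door_at_night({"time": time(14, 0), "pos": (1.5, 2.5)})
```

Further combinations of object class, sensor, time, and location would simply be additional predicates of this kind, each separately designated as a trigger event by the monitoring instructions.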
  • a thermal imaging camera may detect living things, fire or overheated objects based on temperature. However, fire and some other significant events could be detected without a thermal camera as well. In any case, if such events are detected, they may be designated as trigger events by the monitoring instructions.
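The combinations of object class, detection time, and location that the monitoring instructions may designate as trigger events could be represented as simple rules. This is a hedged sketch; the rule fields, example locations, and hour ranges are illustrative assumptions, not taken from the patent.

```python
# Each rule lists the conditions that must all hold for a detection to
# qualify as a trigger event; absent fields match any detection.
TRIGGER_RULES = [
    {"object": "living", "location": "back door", "hours": range(22, 24)},
    {"object": "fire"},                                  # fire always triggers
    {"object": "living", "location": "window", "hours": range(0, 6)},
]

def is_trigger_event(detection, rules=TRIGGER_RULES):
    """A detection triggers if every field of some rule matches it."""
    for rule in rules:
        if rule.get("object") not in (None, detection["object"]):
            continue
        if rule.get("location") not in (None, detection["location"]):
            continue
        if "hours" in rule and detection["hour"] not in rule["hours"]:
            continue
        return True
    return False

night_visitor = {"object": "living", "location": "back door", "hour": 23}
```

Because the rules are plain data, an operator could change, add, or remove them locally or via a remote interface, consistent with the operator-adjustable monitoring instructions described above.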
  • the monitoring instructions may not only define trigger events, but may also define the response or responses that are to be initiated in response to a trigger event. Accordingly, the monitoring instructions may define trigger responses to correspond to trigger events, either generally or in specific terms. Thus, for example, a general response such as "initiate alarm” or any other suitable response may be defined for all trigger events. However, in other cases, more targeted responses may be defined for individual trigger events. For example, if an animal is detected that is to be scared away (e.g., a garden pest), then a local audible alarm at the robotic mower 10 may be initiated. Similarly, if someone or something is interfering with robotic mower operation, a local alarm may be generated.
  • if an intruder or trespasser is detected, it may be desirable to initiate a local alarm, or to initiate a remote alarm or notification (e.g., to the owner, to the police, or to other interested parties) with or without a local audible alarm.
  • the monitoring module 80 may include an alarm 82.
  • the alarm 82 may be an audible local alarm such as a whistle, beep, siren and/or the like.
  • the alarm 82 may include lights, vibration or other augmentation. If the alarm activation or initiation is instead to be provided remotely, the response to the trigger event may be a notification to trigger a remote alarm at the operator's house or business, at the police or fire station, at an alarm monitoring service, and/or the like.
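The mapping of individual trigger events to either general or targeted responses could be sketched as a lookup with a general fallback. The event names and response identifiers below are hypothetical stand-ins for whatever the monitoring instructions define.

```python
# Targeted responses for individual trigger events, with a general
# "initiate alarm" style fallback for all other trigger events.
RESPONSES = {
    "garden pest": ["local_alarm"],                      # scare the animal away
    "intruder": ["remote_notification", "local_alarm"],  # notify owner/police too
}
DEFAULT_RESPONSE = ["local_alarm"]

def respond_to_trigger(event_type):
    """Return the list of responses defined for this trigger event."""
    return RESPONSES.get(event_type, DEFAULT_RESPONSE)
```

A remote response would correspond to the notification described above (e.g., to trigger a remote alarm at the operator's house, a police or fire station, or an alarm monitoring service), while a local response drives the alarm 82 at the robotic mower itself.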
  • the monitoring module 80 may be configured to identify activity based on monitoring data (e.g., image data, events or activities, and/or the like) about which to generate notifications, messages and/or alarms either locally or for transmission to corresponding defined entities or locations.
  • the monitoring module 80 may be instantiated at the robotic mower 10 in some cases, alternative embodiments may instantiate the monitoring module 80 in the "cloud.”
  • the robotic mower 10 may be configured with a transceiver or other communication equipment to enable the robotic mower 10 to communicate data related to the environment being monitored (e.g., the parcel 20) either directly or indirectly to the wireless communication network 48.
  • a server 82 of or in communication with the wireless communication network 48 may host the monitoring module 80, which may operate as described above except that the monitoring module 80 is not locally operating at the robotic mower 10. Notifications may then be transmitted from the monitoring module 80 to the electronic device 42 (which may be associated with the operator or with any of a number of first responders or other entities) via the wireless communication network 48.
  • Notifications may be formatted as SMS messages, emails, or other proprietary formatted messages.
  • the notifications may be alarm signals, phone calls, or any other message capable of communicating the corresponding information regarding an event occurrence and, in some cases, content (e.g., image and/or video data) corresponding to the event occurrence (i.e., the event that acted as the trigger event).
  • the monitoring module 80 may therefore monitor activity at a plurality of parcels and provide security or monitoring services for each such parcel.
  • Each parcel may have a known address, and therefore, if an alarm or notification is generated, such alarm or notification may be provided in reference to the corresponding parcel (e.g., by address), and/or may be provided to one or more individuals or entities associated (e.g., by registration to a service) with the parcel.
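A cloud-hosted monitoring module serving a plurality of parcels might route notifications by parcel as sketched below. The registry contents, message fields, and addresses are illustrative assumptions only.

```python
# Hypothetical registry mapping each monitored parcel to its known
# address and the recipients registered to its monitoring service.
PARCEL_REGISTRY = {
    "parcel-17": {"address": "1 Garden Lane",
                  "recipients": ["owner@example.com"]},
}

def build_notifications(parcel_id, event, registry=PARCEL_REGISTRY):
    """Build one notification per registered recipient, referencing the
    corresponding parcel by address."""
    entry = registry[parcel_id]
    return [
        {"to": recipient,
         "subject": f"Alarm at {entry['address']}",
         "body": f"Trigger event '{event}' detected at {entry['address']}."}
        for recipient in entry["recipients"]
    ]
```

Each built message could then be delivered as an SMS message, email, or other formatted message via the wireless communication network, as described above.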
  • the robotic mower 10 may thereby be enabled to facilitate monitoring of its environment while engaged in mowing or other yard maintenance activities.
  • Embodiments of the present invention may therefore be practiced using an apparatus such as the one described in reference to FIGS. 1-3.
  • some embodiments (or aspects thereof) may be practiced in connection with a computer program product for performing embodiments of the present invention.
  • each block or step of the flowcharts of FIGS. 4 and 5, and combinations of blocks in the flowcharts, may be implemented by various means, such as hardware, firmware, processor, circuitry and/or another device associated with execution of software including one or more computer program instructions.
  • one or more of the procedures described above may be embodied by computer program instructions, which may embody the procedures described above and may be stored by a storage device (e.g., memory 114) and executed by processing circuitry (e.g., processor 112).
  • any such stored computer program instructions may be loaded onto a computer or other programmable apparatus (i.e., hardware) to produce a machine, such that the instructions which execute on the computer or other programmable apparatus implement the functions specified in the flowchart block(s) or step(s).
  • These computer program instructions may also be stored in a computer-readable medium comprising memory that may direct a computer or other programmable apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instructions to implement the function specified in the flowchart block(s) or step(s).
  • the computer program instructions may also be loaded onto a computer or other programmable apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block(s) or step(s).
  • FIG. 4 illustrates a control flow diagram of one example of how the robotic mower 10 can be operated in relation to using the sensors thereon to generate content for potential broadcast in accordance with an example embodiment.
  • operation may begin with traversal of the parcel at operation 400.
  • the sensor network may gather data related to the traversal (or mowing) experiences at operation 402.
  • a determination may be made based on the data gathered as to whether any trigger event has occurred at operation 404. If there are no trigger events, then flow may cycle back to the traversal and data gathering operations 400 and 402, which may continue. If a trigger event has occurred at operation 404, then a determination may be made as to whether the trigger event corresponds to an alarm condition at operation 406.
  • a local or remote alarm may be sounded at operation 408, and flow may return to operation 400. If no alarm condition has occurred, then a determination may be made as to whether a notification is required for the trigger event at operation 410. If a notification is required, then a message or other notification may be sent at operation 412 and flow may return to operation 400. If notification is not required, the information associated with the monitoring may be stored at operation 414 and flow may return to operation 400. In some cases, notifications or alarm conditions may not be triggered by a single occurrence. Thus, the storage of information at operation 414 may enable later trigger event determinations and/or alarm/notification decisions to be made in light of both current and past events.
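One cycle of the decision chain of FIG. 4 (operations 402-414) might be sketched as below. The predicate functions are hypothetical stand-ins for the determinations the flowchart describes; the returned strings merely name which branch was taken.

```python
def handle_cycle(gathered_data, is_trigger, is_alarm, needs_notice, log):
    """One traversal/data-gathering cycle through operations 404-414."""
    if not is_trigger(gathered_data):          # operation 404: any trigger event?
        return "continue"                      # cycle back to 400/402
    if is_alarm(gathered_data):                # operation 406: alarm condition?
        return "sound_alarm"                   # operation 408
    if needs_notice(gathered_data):            # operation 410: notification needed?
        return "send_notification"             # operation 412
    log.append(gathered_data)                  # operation 414: store for later
    return "stored"
```

The `log` passed in models the stored information of operation 414, which later cycles could consult when deciding whether repeated occurrences amount to an alarm or notification condition.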
  • the processes above may incorporate all of position determining, data gathering and message generation/communication, which can be accomplished based on the inclusion of the sensor network 90 and the modules described above.
  • the robotic mower 10 may generally operate in accordance with a control method that combines the modules described above to provide a functionally robust robotic vehicle.
  • a method according to example embodiments of the invention may include any or all of the operations shown in FIG. 5.
  • other methods derived from the descriptions provided herein may also be performed responsive to execution of steps associated with such methods by a computer programmed to be transformed into a machine specifically configured to perform such methods.
  • a method for monitoring a parcel based on operation of a robotic vehicle may include monitoring data gathered responsive to a robotic vehicle traversing a parcel at operation 500, determining whether the monitored data is indicative of a qualifying event at operation 510, and selectively initiating an alarm or notification function in response to the data indicating occurrence of a trigger event at operation 520.
  • the method may include additional optional operations, an example of which is shown in dashed lines in FIG. 5.
  • the method may further include receiving monitoring instructions defining one or more trigger events at operation 530.
  • the operations 500-530 may also be modified, augmented or amplified in some cases.
  • the qualifying event may be defined based on configuration settings that are operator adjustable.
  • receiving the monitoring instructions may include receiving the monitoring instructions responsive to operator adjustment either locally at the robotic vehicle or via a remote interface.
  • receiving the monitoring instructions may include receiving instructions defining specific activities, patterns, or events that qualify as trigger events.
  • one of the sensors of the sensor network may be a camera.
  • the monitoring module may be configured, responsive to detection of an object in an image captured via the camera, to compare the image to a plurality of stored images to determine if a match for the object can be located.
  • the trigger event may occur based at least in part on whether the match is located.
  • the trigger event may occur responsive to detection of an object at a predefined time or location, or via a predetermined sensor.
  • receiving the monitoring instructions may include receiving instructions defining at least one response to be initiated in response to the trigger event.
  • the monitoring instructions may initiate a local or remote alarm in response to the trigger event.
  • data gathered may also be stored to determine if a repeated number or pattern of events has occurred, which corresponds to the trigger event.
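The stored-event check described above, where only a repeated number of occurrences becomes a trigger event, could be sketched as follows. The repeat threshold is an assumed, operator-adjustable setting.

```python
from collections import Counter

def check_repeated_event(history, new_event, repeat_threshold=3):
    """Record the event and report whether its occurrence count in the
    stored history has reached the threshold for a trigger event."""
    history.append(new_event)
    return Counter(history)[new_event] >= repeat_threshold
```

A single sighting is merely stored; the same event observed across several traversal cycles eventually qualifies as the trigger event.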
  • an apparatus for performing the method of FIGS. 4 and 5 above may comprise a processor (e.g., the processor 112) configured to perform some or each of the operations (400-530) described above.
  • the processor 112 may, for example, be configured to perform the operations (400-530) by performing hardware implemented logical functions, executing stored instructions, or executing algorithms for performing each of the operations.
  • the apparatus may comprise means for performing each of the operations described above.
  • examples of means for performing operations 400-530 may comprise, for example, the control circuitry 12.
  • while the processor 112 may be configured to control, or even be embodied as, the control circuitry 12, the processor 112 and/or a device or circuitry for executing instructions or executing an algorithm for processing information as described above may also form example means for performing operations 400-530.
  • Some example embodiments may therefore provide a robotic vehicle that is configured to incorporate multiple sensors to experience various aspects of its environment while performing a task on a particular parcel.
  • the robotic vehicle may be further capable of broadcasting information about its experiences.
  • Some example embodiments may improve the ability of operators to interact with the robotic vehicle in a unique and interesting way.
  • Robotic mowers, which are one example of a robotic vehicle of an example embodiment, typically mow an area that is defined by a boundary wire that bounds the area to be mowed. The robotic mower then roams within the bounded area to ensure that the entire area is mowed, but the robotic mower does not go outside of the bounded area.
  • Example embodiments are therefore described herein to provide various structural and control-related design features that can be employed to improve the capabilities of robotic vehicles (e.g., robotic mowers, mobile sensing devices, watering devices and/or the like) with respect to broadcasting content related to the encounters and experiences associated with yard maintenance activities.
  • FIG. 6 illustrates an example operating environment for a robotic mower 1010 that may be employed in connection with an example embodiment.
  • the robotic mower 1010 may operate to cut grass on a parcel 1020 (i.e., a land lot, yard, or garden), the boundary 1030 of which may be defined using one or more physical boundaries (e.g., a fence, wall, curb and/or the like), or programmed location based boundaries or combinations thereof.
  • the robotic mower 1010 may be informed so that it can operate in a manner that prevents the robotic mower 1010 from leaving or moving outside the boundary 1030.
  • the boundary 1030 could be provided by a wire that is detectable by the robotic mower 1010.
  • the robotic mower 1010 may be controlled, at least in part, via control circuitry 1012 located onboard.
  • the control circuitry 1012 may include, among other things, a positioning module and a sensor suite, which will be described in greater detail below. Accordingly, the robotic mower 1010 may utilize the control circuitry 1012 to define a path for coverage of the parcel 1020 in terms of performing a task over specified portions or the entire parcel 1020.
  • the positioning module may be used to guide the robotic mower 1010 over the parcel 1020 and to ensure that full coverage (of at least predetermined portions of the parcel 1020) is obtained, while the sensor suite may detect objects and/or gather data regarding the surroundings of the robotic mower 1010 while the parcel 1020 is traversed.
  • the sensor suite may include sensors related to positional determination (e.g., a GPS receiver, an accelerometer, a camera, a radar transmitter/detector, an ultrasonic sensor, a laser scanner and/or the like).
  • positional determinations may be made using GPS, inertial navigation, optical flow, radio navigation, visual location (e.g., VSLAM) and/or other positioning techniques or combinations thereof.
  • the sensors may be used, at least in part, for determining the location of the robotic mower 1010 relative to boundaries or other points of interest (e.g., a starting point or other key features) of the parcel 1020, or determining a position history or track of the robotic mower 1010 over time.
  • the sensors may also detect collision, tipping over, or various fault conditions.
  • the sensors may also or alternatively collect data regarding various measurable parameters (e.g., moisture, temperature, soil conditions, etc.) associated with particular locations on the parcel 1020.
  • the robotic mower 1010 may be battery powered via one or more rechargeable batteries. Accordingly, the robotic mower 1010 may be configured to return to a charge station 1040 that may be located at some position on the parcel 1020 in order to recharge the batteries.
  • the batteries may power a drive system and a blade control system of the robotic mower 1010.
  • the control circuitry 1012 of the robotic mower 1010 may selectively control the application of power or other control signals to the drive system and/or the blade control system to direct the operation of the drive system and/or blade control system.
  • movement of the robotic mower 1010 over the parcel 1020 may be controlled by the control circuitry 1012 in a manner that enables the robotic mower 1010 to systematically traverse the parcel while operating a cutting blade to cut the grass on the parcel 1020.
  • the control circuitry 1012 may be configured to control another functional or working assembly that may replace the blade control system and blades.
  • control circuitry 1012 and/or a communication node at the charge station 1040 may be configured to communicate wirelessly with an electronic device 1042 (e.g., a personal computer, a cloud based computer, server, mobile telephone, PDA, tablet, smart phone, and/or the like) of a remote operator 1044 (or user) via wireless links 1046 associated with a wireless communication network 1048.
  • the wireless communication network 1048 may provide operable coupling between the remote operator 1044 and the robotic mower 1010 via the electronic device 1042, which may act as a remote control device for the robotic mower 1010 or may receive data indicative of or related to the operation of the robotic mower 1010.
  • the wireless communication network 1048 may include additional or internal components that facilitate the communication links and protocols employed.
  • some portions of the wireless communication network 1048 may employ additional components and connections that may be wired and/or wireless.
  • the charge station 1040 may have a wired connection to a computer or server that is connected to the wireless communication network 1048, which may then wirelessly connect to the electronic device 1042.
  • the robotic mower 1010 may wirelessly connect to the wireless communication network 1048 (directly or indirectly) and a wired connection may be established between one or more servers of the wireless communication network 1048 and a PC of the remote operator 1044.
  • the wireless communication network 1048 may be a data network, such as a local area network (LAN), a metropolitan area network (MAN), a wide area network (WAN) (e.g., the Internet), and/or the like, which may couple the robotic mower 1010 to devices such as processing elements (e.g., personal computers, server computers or the like) or databases. Accordingly, communication between the wireless communication network 1048 and the devices or databases (e.g., servers, electronic device 1042, control circuitry 1012) may be accomplished by either wireline or wireless communication mechanisms and corresponding protocols.
  • FIG. 7 illustrates a block diagram of various components of the control circuitry 1012 to illustrate some of the components that enable or enhance the functional performance of the robotic mower 1010 and to facilitate description of an example embodiment.
  • the control circuitry 1012 may include or otherwise be in communication with a vehicle positioning module 1060, a camera 1070, and a broadcast module 1080.
  • the vehicle positioning module 1060, the camera 1070, and the broadcast module 1080 may work together to give the robotic mower 1010 a comprehensive understanding of its environment, and enable it to be operated autonomously without boundary wires (or within such wires where they are employed).
  • any or all of the vehicle positioning module 1060, the camera 1070, and the broadcast module 1080 may be part of a sensor network 1090 of the robotic mower 1010. However, in some cases, any or all of the vehicle positioning module 1060, the camera 1070, and the broadcast module 1080 may be separate from but otherwise in communication with the sensor network 1090 to facilitate operation of each respective module.
  • the camera 1070 may include an electronic image sensor configured to store captured image data (e.g., in memory 1114). Image data recorded by the camera 1070 may be in the visible light spectrum or in other portions of the electromagnetic spectrum (e.g., IR camera). In some cases, the camera 1070 may actually include multiple sensors configured to capture data in different types of images (e.g., RGB and IR sensors). The camera 1070 may be configured to capture still images and/or video data.
  • the robotic mower 1010 may also include one or more functional components 1100 that may be controlled by the control circuitry 1012 or otherwise be operated in connection with the operation of the robotic mower 1010.
  • the functional components 1100 may include a wheel assembly (or other mobility assembly components), one or more cutting blades and corresponding blade control components, and/or other such devices.
  • the functional components 1100 may include equipment for performing various lawn care functions such as, for example, taking soil samples, operating valves, distributing water, seed, powder, pellets or chemicals, and/or other functional devices and/or components.
  • the control circuitry 1012 may include processing circuitry 1110 that may be configured to perform data processing or control function execution and/or other processing and management services according to an example embodiment of the present invention.
  • the processing circuitry 1110 may be embodied as a chip or chip set.
  • the processing circuitry 1110 may comprise one or more physical packages (e.g., chips) including materials, components and/or wires on a structural assembly (e.g., a baseboard).
  • the structural assembly may provide physical strength, conservation of size, and/or limitation of electrical interaction for component circuitry included thereon.
  • the processing circuitry 1110 may therefore, in some cases, be configured to implement an embodiment of the present invention on a single chip or as a single "system on a chip.” As such, in some cases, a chip or chipset may constitute means for performing one or more operations for providing the functionalities described herein.
  • the processing circuitry 1110 may include one or more instances of a processor 1112 and memory 1114 that may be in communication with or otherwise control a device interface 1120 and, in some cases, a user interface 1130.
  • the processing circuitry 1110 may be embodied as a circuit chip (e.g., an integrated circuit chip) configured (e.g., with hardware, software or a combination of hardware and software) to perform operations described herein.
  • the processing circuitry 1110 may be embodied as a portion of an on-board computer.
  • the processing circuitry 1110 may communicate with electronic components and/or sensors of the robotic mower 1010 via a single data bus.
  • the data bus may connect to a plurality or all of the switching components, sensory components and/or other electrically controlled components of the robotic mower 1010.
  • the processor 1112 may be embodied in a number of different ways.
  • the processor 1112 may be embodied as various processing means such as one or more of a microprocessor or other processing element, a coprocessor, a controller or various other computing or processing devices including integrated circuits such as, for example, an ASIC (application specific integrated circuit), an FPGA (field programmable gate array), or the like.
  • the processor 1112 may be configured to execute instructions stored in the memory 1114 or otherwise accessible to the processor 1112.
  • the processor 1112 may represent an entity (e.g., physically embodied in circuitry - in the form of processing circuitry 1110) capable of performing operations according to embodiments of the present invention while configured accordingly.
  • when the processor 1112 is embodied as an ASIC, FPGA or the like, the processor 1112 may be specifically configured hardware for conducting the operations described herein.
  • when the processor 1112 is embodied as an executor of software instructions, the instructions may specifically configure the processor 1112 to perform the operations described herein.
  • the processor 1112 may be embodied as, include or otherwise control the vehicle positioning module 1060, the camera 1070, and the broadcast module 1080.
  • the processor 1112 may be said to cause each of the operations described in connection with the vehicle positioning module 1060, the camera 1070, and the broadcast module 1080 by directing the vehicle positioning module 1060, the camera 1070, and the broadcast module 1080, respectively, to undertake the corresponding functionalities responsive to execution of instructions or algorithms configuring the processor 1112 (or processing circuitry 1110) accordingly.
  • These instructions or algorithms may configure the processing circuitry 1110, and thereby also the robotic mower 1010, into a tool for driving the corresponding physical components for performing corresponding functions in the physical world in accordance with the instructions provided.
  • the memory 1114 may include one or more non- transitory memory devices such as, for example, volatile and/or non-volatile memory that may be either fixed or removable.
  • the memory 1114 may be configured to store information, data, applications, instructions or the like for enabling the vehicle positioning module 1060 and the camera 1070 to carry out various functions in accordance with exemplary embodiments of the present invention.
  • the memory 1114 could be configured to buffer input data for processing by the processor 1112.
  • the memory 1114 could be configured to store instructions for execution by the processor 1112.
  • the memory 1114 may include one or more databases that may store a variety of data sets responsive to input from various sensors or components of the robotic mower 1010.
  • applications may be stored for execution by the processor 1112 in order to carry out the functionality associated with each respective application.
  • the applications may include applications for controlling the robotic mower 1010 relative to various operations including determining an accurate position of the robotic mower 1010 (e.g., using one or more sensors of the vehicle positioning module 1060). Alternatively or additionally, the applications may include applications for controlling the robotic mower 1010 relative to various operations including determining the existence and/or position of obstacles (e.g., static or dynamic) and borders relative to which the robotic mower 1010 must navigate. Alternatively or additionally, the applications may include applications for controlling the robotic mower 1010 relative to various operations to be executed on the parcel 1020. Alternatively or additionally, the applications may include applications for controlling the camera 1070 and/or processing image data gathered by the camera 1070 to execute or facilitate execution of other applications that drive or enhance operation of the robotic mower 1010 relative to various activities described herein. In still other examples, the applications may include instructions for selecting content gathered by the sensor network 1090 (and/or the camera 1070) for publication or broadcast via a network (e.g., wireless communication network 1048).
  • the user interface 1130 may be in communication with the processing circuitry 1110 to receive an indication of a user input at the user interface 1130 and/or to provide an audible, visual, mechanical or other output to the user.
  • the user interface 1130 may include, for example, a display, one or more buttons or keys (e.g., function buttons), and/or other input/output mechanisms (e.g., microphone, speakers, cursor, joystick, lights and/or the like).
  • the device interface 1120 may include one or more interface mechanisms for enabling communication with other devices either locally or remotely.
  • the device interface 1120 may be any means such as a device or circuitry embodied in either hardware, or a combination of hardware and software that is configured to receive and/or transmit data from/to sensors or other components in communication with the processing circuitry 1110.
  • the device interface 1120 may provide interfaces for communication of data to/from the control circuitry 1012, the vehicle positioning module 1060, the camera 1070, the broadcast module 1080, the sensor network 1090, and/or other functional components 1100 via wired or wireless communication interfaces, in a real-time manner, as a data package downloaded after data gathering, or in one or more burst transmissions of any kind.
  • Each of the vehicle positioning module 1060, the camera 1070, and the broadcast module 1080 may be any means such as a device or circuitry embodied in either hardware, or a combination of hardware and software that is configured to perform the corresponding functions described herein.
  • the modules may include hardware and/or instructions for execution on hardware (e.g., embedded processing circuitry) that is part of the control circuitry 1012 of the robotic mower 1010.
  • the modules may share some parts of the hardware and/or instructions that form each module, or they may be distinctly formed. As such, the modules and components thereof are not necessarily intended to be mutually exclusive relative to each other from a compositional perspective.
  • the vehicle positioning module 1060 may be configured to utilize one or more sensors (e.g., of the sensor network 1090) to determine a location of the robotic mower 1010 and direct continued motion of the robotic mower 1010 to achieve appropriate coverage of the parcel 1020.
  • the robotic mower 1010 (or more specifically, the control circuitry 1012) may use the location information to determine a mower track and/or provide full coverage of the parcel 1020 to ensure the entire parcel is mowed (or otherwise serviced).
  • the vehicle positioning module 1060 may therefore be configured to direct movement of the robotic mower 1010, including the speed and direction of the robotic mower 1010.
  • the vehicle positioning module 1060 may also employ such sensors to attempt to determine an accurate current location of the robotic mower 1010 on the parcel 1020 (or generally).
  • Various sensors of sensor network 1090 of the robotic mower 1010 may be included as a portion of, or otherwise communicate with, the vehicle positioning module 1060 to, for example, determine vehicle speed/direction, vehicle location, vehicle orientation and/or the like. Sensors may also be used to determine motor run time, machine work time, and other operational parameters.
  • positioning and/or orientation sensors (e.g., a global positioning system (GPS) receiver and/or an accelerometer) may be among the sensors used in connection with the vehicle positioning module 1060.
  • the sensor network 1090 may provide data to the modules described above to facilitate execution of the functions described above, and/or any other functions that the modules may be configurable to perform.
  • the sensor network 1090 may include (perhaps among other things) an inertial measurement unit (IMU) 1150 and a GPS receiver 1152.
  • Other possible sensors may include (but are not limited to) a temperature sensor, humidity sensor, barometer, rain gauge, grass sensor, and/or the like.
  • the sensor network 1090 may include independent devices with on-board processing that communicate with the processing circuitry 1110 of the control circuitry 1012 via a single data bus, or via individual communication ports.
  • one or more of the devices of the sensor network 1090 may rely on the processing power of the processing circuitry 1110 of the control circuitry 1012 for the performance of their respective functions.
  • one or more of the sensors of the sensor network 1090 (or portions thereof) may be embodied as portions of the positioning module 1060, the camera 1070 and/or the broadcast module 1080.
  • the IMU 1150 may include any one or more of accelerometers, odometers, gyroscopes, magnetometers, compasses, and/or the like, in any combination. As such, the IMU 1150 may be configured to determine velocity, direction, orientation and/or the like so that dead reckoning and/or other inertial navigation determinations can be made by the control circuitry 1012. The IMU 1150 may be enabled to determine changes in pitch, roll and yaw to further facilitate determining terrain features and/or the like.
  • Inertial navigation systems may suffer from integration drift over time. Accordingly, inertial navigation systems may require a periodic position correction, which may be accomplished by getting a position fix from another more accurate method or by fixing a position of the robotic mower 1010 relative to a known location. For example, navigation conducted via the IMU 1150 may be used for robotic mower 1010 operation for a period of time, and then a correction may be inserted when a GPS fix is obtained on robotic mower position. As an example alternative, the IMU 1150 determined position may be updated every time the robotic mower 1010 returns to the charge station 1040 (which may be assumed to be at a fixed location).
  • known reference points may be disposed at one or more locations on the parcel 1020 and the robotic mower 1010 may get a fix relative to any of such known reference points when the opportunity presents itself.
  • the IMU 1150 determined position may then be updated with the more accurate fix information.
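The drift-correction scheme described above can be sketched as follows. This is a simplified illustration rather than the patent's implementation: `DeadReckoner` and its method names are hypothetical, and a production system would typically blend the fix with a Kalman-style filter rather than simply overwriting the estimate.

```python
import math

class DeadReckoner:
    """Tracks position by integrating IMU-style speed/heading samples,
    then accepts occasional absolute fixes (e.g., a GPS fix or the known
    charge station location) to cancel accumulated integration drift."""

    def __init__(self, x=0.0, y=0.0):
        self.x = x
        self.y = y

    def step(self, speed, heading_rad, dt):
        # Integrate velocity over the time step (drift accumulates here).
        self.x += speed * math.cos(heading_rad) * dt
        self.y += speed * math.sin(heading_rad) * dt

    def apply_fix(self, fix_x, fix_y):
        # Replace the drifted estimate with the more accurate fix.
        self.x, self.y = fix_x, fix_y

    def position(self):
        return (self.x, self.y)

# Drive east at 0.5 m/s for ten one-second steps, then correct with a fix.
dr = DeadReckoner()
for _ in range(10):
    dr.step(0.5, 0.0, 1.0)
print(dr.position())    # dead-reckoned estimate: (5.0, 0.0)
dr.apply_fix(4.8, 0.1)  # e.g., a fix obtained at the charge station
print(dr.position())    # (4.8, 0.1)
```

The same `apply_fix` call would serve for any of the correction sources mentioned above: a GPS fix, the charge station, or a known reference point on the parcel.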
  • the GPS receiver 1152 may be embodied as a real time kinematic (RTK) - GPS receiver.
  • the GPS receiver 1152 may employ satellite based positioning in conjunction with GPS, GLONASS, Galileo, GNSS, and/or the like to enhance accuracy of the GPS receiver 1152.
  • carrier-phase enhancement may be employed such that, for example, in addition to the information content of signals received, the phase of the carrier wave may be examined to provide real-time corrections that can enhance accuracy.
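The real-time correction idea can be illustrated with a simplified differential (DGPS-style) sketch: the error observed at a base station with a surveyed location is subtracted from the rover's raw reading. Full RTK additionally resolves carrier-phase ambiguities, which is omitted here; the function name and coordinates are illustrative.

```python
def differential_correction(base_known, base_measured, rover_measured):
    """Remove the positioning bias seen at a base station of known
    location from the rover's raw measurement (simplified DGPS idea;
    real RTK also exploits the carrier-wave phase for cm-level accuracy)."""
    err = (base_measured[0] - base_known[0], base_measured[1] - base_known[1])
    return (rover_measured[0] - err[0], rover_measured[1] - err[1])

# The base station reads 1.5 m east / 0.5 m north of its surveyed position,
# so the same bias is removed from the rover's raw reading.
corrected = differential_correction(
    base_known=(100.0, 200.0),
    base_measured=(101.5, 200.5),
    rover_measured=(151.5, 250.5),
)
print(corrected)  # (150.0, 250.0)
```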
  • the robotic mower 1010 may store images of previously encountered objects or other objects that have been learned or identified as known objects.
  • when the camera 1070 is able to obtain a new image of the object, the new image can be compared to the stored images to see if a match can be located. If a match is located, the new image may be classified as the known object.
  • the identity and image of the known object may be provided to the broadcast module 1080 for processing as described herein.
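The compare-and-classify flow for known objects might look like the following sketch. A crude mean pixel difference stands in for the feature matching a real vision system would use; `classify_image`, the image encoding (flattened grayscale lists), and the threshold are all illustrative assumptions.

```python
def classify_image(new_image, known_objects, max_diff=10):
    """Compare a newly captured image against stored images of known
    objects; return the label of the first stored image that is 'close
    enough', or None for an unknown object."""
    for label, stored in known_objects.items():
        if len(stored) != len(new_image):
            continue  # incompatible sizes cannot be compared directly
        # Mean absolute pixel difference as a crude similarity score.
        diff = sum(abs(a - b) for a, b in zip(new_image, stored)) / len(stored)
        if diff <= max_diff:
            return label  # match located: classify as the known object
    return None  # no match: treat as an unknown object

known = {"mailbox": [120, 130, 125, 128], "dog": [40, 45, 42, 50]}
print(classify_image([122, 129, 126, 127], known))  # mailbox
print(classify_image([200, 10, 250, 0], known))     # None -> unknown
```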
  • the position information gathered by the positioning module 1060 and images obtained by the camera 1070, along with any other data or information obtained via any sensor of the sensor network 1090 may form content that can form the basis of broadcast transmissions that may be arranged and/or managed by the broadcast module 1080.
  • data associated with any aspect of the performance of the robotic mower 1010 or the modules/sensors thereof may form the basis for content that can be formed and utilized for broadcast transmissions arranged and/or managed by the broadcast module 1080.
  • position information such as information indicative of terrain features (e.g., bumps, hills, edges, etc.) that are detectable by the IMU 1150, may form information that can be the basis of broadcast transmissions.
  • Images of, or records relating to encounters with, objects (known or unknown) may also form the basis for broadcast transmissions.
  • Event records and/or activities performed may also form the basis for broadcast transmissions. Other things might also form the basis for such transmissions.
  • the broadcast module 1080 may be configured to identify content (e.g., images, events, activities, data, and/or the like) about which to generate messages and the messages may then be selectively transmitted in a broadcast fashion.
  • the broadcast module 1080 may be configured to generate a message in response to each qualifying event that occurs.
  • the broadcast module 1080 may also be configured to either send the message as a broadcast message or present the message to the operator for the operator to modify and/or release as the broadcast message.
  • the broadcast module 1080 may receive configuration settings from the operator to direct the broadcast module 1080 as to, for example, what events are qualifying events, how and when to prepare messages, and how and when to send such messages or present them to the operator for sending.
  • the configuration settings may provide information identifying qualifying events.
  • qualifying events may be specific images, events, activities, data, and/or the like, that have been designated to trigger message generation.
  • the capture of an image with subsequent identification of a known object in the image may be designated as a qualifying event.
  • the capture of any image, or an image with an unknown object may be designated as a qualifying event.
  • images of specific objects or individuals may be designated as a qualifying event.
  • capture of an image in association with another event or activity may be a qualifying event.
  • other detected events or gathered data may also act as qualifying events.
  • qualifying events could be data indicative of certain activities (e.g., energy consumption above or below certain levels, elapsed time above or below certain levels relative to specific events or activities, distance traveled, and/or the like).
  • Qualifying events may be pre-programmed from the factory, or may be set by the operator.
  • the operator may interact with the robotic mower 1010 from the electronic device 1042 to provide configuration settings to identify qualifying events.
  • the configuration settings may define a list of potential qualifying events and the operator may check a box for each item that is to be designated as a qualifying event.
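The checkbox-style designation of qualifying events could be represented as a simple mapping in the configuration settings. The key names below are illustrative, not taken from the patent.

```python
# Hypothetical configuration settings: the operator "checks a box" for
# each potential event type that should trigger message generation.
config = {
    "qualifying_events": {
        "known_object_image": True,
        "unknown_object_image": False,
        "high_energy_consumption": True,
        "tipped_over": True,
    }
}

def is_qualifying(event_type, settings):
    """Return True if the operator designated this event type as
    qualifying; unlisted types default to non-qualifying."""
    return settings["qualifying_events"].get(event_type, False)

print(is_qualifying("tipped_over", config))           # True
print(is_qualifying("unknown_object_image", config))  # False
print(is_qualifying("rain_detected", config))         # False (not listed)
```

Factory defaults would simply pre-populate this mapping before the operator ever edits it.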
  • the broadcast module 1080 may be configured to format a message.
  • the message may be formatted to include content associated with the corresponding qualifying event.
  • the format of the message may be determined by the qualifying event triggering the message. For example, some qualifying events (e.g., image captures) may trigger message formatting to include the image. Other qualifying events (e.g., data captures) may trigger message formatting to include only text content related to, descriptive of, or otherwise associated with the qualifying event.
  • the operator may define message formats to be used, generally, or for specific qualifying events.
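Event-dependent formatting might be sketched as follows, with image captures embedding the image and data captures producing text-only messages; the field names are assumptions for illustration.

```python
def format_message(event, settings):
    """Build a broadcast message whose format depends on the qualifying
    event type: image captures embed the captured image, while
    data-style events become text only."""
    if event["type"] == "image_capture":
        return {"text": f"Spotted: {event.get('label', 'unknown object')}",
                "attachment": event["image"]}
    # Data-style events produce text-only messages.
    return {"text": f"{event['type']}: {event['value']}", "attachment": None}

msg = format_message(
    {"type": "image_capture", "label": "hedgehog", "image": b"\x89PNG..."},
    settings={},
)
print(msg["text"])  # Spotted: hedgehog
data_msg = format_message({"type": "distance_traveled", "value": "1.2 km"}, {})
print(data_msg)     # {'text': 'distance_traveled: 1.2 km', 'attachment': None}
```

Operator-defined formats, as described above, would replace these hard-coded templates with entries looked up from the configuration settings.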
  • the message may be broadcast automatically by the broadcast module 1080 or sent to the operator for screening and/or modification prior to broadcasting.
  • the protocol for broadcasting may be determined based on the configuration settings.
  • the operator may prescribe that all messages are to be broadcast once they are formatted, or that all messages are to be screened by the operator and released prior to being broadcast.
  • the operator may define certain qualifying events (e.g., by type, class or content) that are to be broadcast automatically, and other qualifying events that are to be screened.
  • configuration settings may also define specific mechanisms or mediums for broadcasting messages.
  • the mechanism to be used may also determine, at least in part, the formatting of some messages.
  • the configuration settings may define the broadcast medium to be employed for message broadcasting.
  • Some example broadcast mediums may include SMS, email, Twitter, Facebook, Pinterest, Instagram, Vine, Tumblr, and/or the like.
  • the configuration settings may define the broadcast medium, message formats, content to be added for specific message formats, automatic or operator-prompted release, qualifying event definitions and/or the like as described herein. However, in some cases, the configuration settings may further allow the operator to define specific messages that can be sent when corresponding specific stimuli are encountered. For example, if the IMU 1150 indicates that the robotic mower 1010 was stuck or tipped over, messages such as "I got stuck again today" and "I am upside down" may be prescribed and broadcast responsive to such indications. If the terrain is rough, a message such as "This terrain is bumpy" may be generated. If high power consumption is experienced while cutting, a message such as "My blades are dull" or "You let the grass get too long again" may be generated. Other messages such as "Today, I was harassed by a dog" or "Why am I cutting grass when it is 2 degrees outside" may also be humorously provided when corresponding applicable situations are detected.
  • the broadcast module 1080 may first send an email, SMS or other private message to the operator to obtain clearance and/or release of the message.
  • the operator may review the unreleased message and delete, store or release the message. If the message is released, the message may then be broadcast using the broadcast medium currently prescribed in the configuration settings. However, the broadcast medium could be directly selected (or modified) by the operator.
  • the operator may be enabled to modify any desirable aspects of the message prior to release.
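The automatic-versus-screened routing and the operator's delete/modify/release step could be sketched as below; the event set, queues, and function names are illustrative stand-ins for the broadcast module behavior described above.

```python
# Hypothetical operator-defined set: these event types broadcast
# automatically; everything else is screened by the operator first.
AUTO_RELEASE = {"daily_summary", "mowing_complete"}

pending_review = []  # stands in for the private email/SMS to the operator
broadcast_log = []   # stands in for the selected broadcast medium

def process_message(event_type, message):
    """Route a formatted message: auto-broadcast if the event type is so
    designated, otherwise queue it for operator screening."""
    if event_type in AUTO_RELEASE:
        broadcast_log.append(message)
        return "broadcast"
    pending_review.append(message)
    return "screening"

def operator_release(message, approve, modified_text=None):
    """Operator deletes or releases a screened message, optionally
    modifying it before release."""
    pending_review.remove(message)
    if approve:
        broadcast_log.append(modified_text or message)
        return "broadcast"
    return "deleted"

print(process_message("mowing_complete", "Done for today!"))  # broadcast
print(process_message("image_capture", "Saw something..."))   # screening
print(operator_release("Saw something...", approve=True,
                       modified_text="Look who visited the garden!"))
print(broadcast_log)
```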
  • although the broadcast module 1080 may be instantiated at the robotic mower 1010 in some cases, alternative embodiments may instantiate the broadcast module 1080 in the "cloud."
  • the robotic mower 1010 may be configured with a transceiver or other communication equipment to enable the robotic mower 1010 to communicate data related to the mowing experience (or parcel transit experience more generally) either directly or indirectly to the wireless communication network 1048.
  • a server 1082 of or in communication with the wireless communication network 1048 may host the broadcast module 1080, which may operate as described above except that the broadcast module 1080 is not locally operating at the robotic mower 1010.
  • Content may then be broadcast from the broadcast module 1080 via the selected broadcast medium to the electronic device 1042 and many other electronic devices of other public users connected to the wireless communication network 1048.
  • experiences from a plurality of robotic mowers or over a period of time may be aggregated at the broadcast module 1080.
  • the aggregated content may then be shared so that experiences in a particular neighborhood or other area can be provided to provide useful information.
  • the useful information may relate to identification of patterns, existence or behavior of animals or vandals, location or activities of missing animals or persons, weather information, soil information, precipitation levels, terrain, owner behavior, product behavior and/or the like.
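Aggregating experiences from a plurality of mowers might be as simple as averaging a reading per area, as in this sketch (the area names and the precipitation example are illustrative):

```python
from collections import defaultdict

def aggregate_reports(reports):
    """Aggregate experience reports from many robotic mowers by area,
    averaging a numeric reading per area (here, precipitation in mm) so
    that neighborhood-wide patterns can be shared."""
    totals = defaultdict(lambda: [0.0, 0])
    for area, mm in reports:
        totals[area][0] += mm  # running sum per area
        totals[area][1] += 1   # report count per area
    return {area: total / count for area, (total, count) in totals.items()}

reports = [("elm_street", 4.0), ("elm_street", 6.0), ("oak_hill", 1.0)]
print(aggregate_reports(reports))  # {'elm_street': 5.0, 'oak_hill': 1.0}
```

The same grouping shape would carry over to non-numeric aggregates (e.g., counting animal sightings per area) by swapping the reduction step.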
  • the robotic mower 1010 may be enabled to facilitate the broadcasting of content related to the mowing experience or other yard maintenance activities.
  • the broadcast module 1080 may enable operators to express their personality through configuration of their robotic mower's broadcast activities.
  • the robotic mower 1010 may take on a persona of an electronic pet.
  • useful or fun information relating to mowing or other yard maintenance activities may be shared to increase the interest level, satisfaction or value proposition associated with owning a robotic vehicle such as the robotic mower 1010.
  • Embodiments of the present invention may therefore be practiced using an apparatus such as the one described in reference to FIGS. 6-8.
  • some embodiments (or aspects thereof) may be practiced in connection with a computer program product for performing embodiments of the present invention.
  • each block or step of the flowchart of FIGS. 9 and 10, and combinations of blocks in the flowcharts may be implemented by various means, such as hardware, firmware, processor, circuitry and/or another device associated with execution of software including one or more computer program instructions.
  • one or more of the procedures described above may be embodied by computer program instructions, which may embody the procedures described above and may be stored by a storage device (e.g., memory 1114) and executed by processing circuitry (e.g., processor 1112).
  • any such stored computer program instructions may be loaded onto a computer or other programmable apparatus (i.e., hardware) to produce a machine, such that the instructions which execute on the computer or other programmable apparatus implement the functions specified in the flowchart block(s) or step(s).
  • These computer program instructions may also be stored in a computer-readable medium comprising memory that may direct a computer or other programmable apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instructions to implement the function specified in the flowchart block(s) or step(s).
  • FIG. 9 illustrates a control flow diagram of one example of how the robotic mower 1010 can be operated in relation to using the sensors thereon to generate content for potential broadcast in accordance with an example embodiment.
  • operation may begin with traversal of the parcel at operation 1400.
  • the sensor network may gather data related to the traversal (or mowing) experiences at operation 1402.
  • the operator may have the option to modify the message at operation 1418. If the message is to be modified, the corresponding modifications may then be inserted (e.g., to form, content, and/or broadcast medium) at operation 1420. If no modifications are desired, or after the modifications are inserted, then the operator may make a determination regarding whether to release the message at operation 1422. If the message is selected for release, then flow may return to operation 1414, as described above. However, if the message is not released, it may be stored or deleted at operation 1424 and flow may return to operations 1400 and 1402.
  • the processes above may incorporate all of position determining, data gathering and message generation/communication, which can be accomplished based on the inclusion of the sensor network 1090 and the modules described above.
  • the robotic mower 1010 may generally operate in accordance with a control method that combines the modules described above to provide a functionally robust robotic vehicle.
  • a method according to example embodiments of the invention may include any or all of the operations shown in FIG. 10.
  • other methods derived from the descriptions provided herein may also be performed responsive to execution of steps associated with such methods by a computer programmed to be transformed into a machine specifically configured to perform such methods.
  • a method for providing broadcast messages from a robotic vehicle may include monitoring data gathered responsive to a robotic vehicle traversing a parcel at operation 1500, determining whether the monitored data is indicative of a qualifying event at operation 1510, generating a message in response to the qualifying event at operation 1520, and processing the message for selective broadcasting at operation 1530.
  • the method may include additional optional operations.
  • the operations 1500-1535 may also be modified, augmented or amplified in some cases.
  • the qualifying event may be defined based on configuration settings that are operator adjustable.
  • a list of potential qualifying events may be provided to the operator to enable the operator to individually select each item of the list that is to be designated as a qualifying event.
  • generating the message in response to the qualifying event further includes receiving configuration settings defining a format for the message.
  • the format of the message may be determined based on the qualifying event.
  • the configuration settings may define associations between message formats and qualifying events.
  • the method may further include receiving operator input defining the associations.
  • processing the message for selective broadcasting may include automatically broadcasting the message based on configuration settings.
  • processing the message for selective broadcasting may include releasing the message for broadcast responsive to screening by the operator.
  • processing the message for selective broadcasting may include releasing the message for broadcast based on operator defined associations identifying a first set of qualifying events for messages to be screened prior to release and a second set of qualifying events for messages to be automatically released.
  • processing the message for selective broadcasting may include sending the message via a broadcast medium defined in configuration settings.
  • processing the message for selective broadcasting may include sending the message via a broadcast medium selected by the operator.
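Operations 1500-1530 can be summarized as a pipeline skeleton. The three callables stand in for the module behaviors described above; their names and signatures are assumptions for illustration, not part of the disclosure.

```python
def broadcast_pipeline(samples, is_qualifying, generate, process):
    """Skeleton of the method of FIG. 10: monitor gathered data (1500),
    determine whether it indicates a qualifying event (1510), generate a
    message in response (1520), and process the message for selective
    broadcasting (1530)."""
    results = []
    for sample in samples:                    # operation 1500: monitoring
        if is_qualifying(sample):             # operation 1510
            message = generate(sample)        # operation 1520
            results.append(process(message))  # operation 1530
    return results

out = broadcast_pipeline(
    samples=[{"type": "bump"}, {"type": "tipped_over"}],
    is_qualifying=lambda s: s["type"] == "tipped_over",
    generate=lambda s: "I am upside down",
    process=lambda m: ("broadcast", m),
)
print(out)  # [('broadcast', 'I am upside down')]
```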
  • an apparatus for performing the method of FIGS. 9 and 10 above may comprise a processor (e.g., the processor 1112) configured to perform some or each of the operations (1400-1530) described above.
  • the processor 1112 may, for example, be configured to perform the operations (1400-1530) by performing hardware implemented logical functions, executing stored instructions, or executing algorithms for performing each of the operations.
  • the apparatus may comprise means for performing each of the operations described above.
  • examples of means for performing operations 1400-1530 may comprise, for example, the control circuitry 1012.
  • the processor 1112 may be configured to control or even be embodied as the control circuitry 1012. The processor 1112 and/or a device or circuitry for executing instructions or executing an algorithm for processing information as described above may also form example means for performing operations 1400-1530.

Abstract

A robotic vehicle may include one or more functional components configured to execute a lawn care function, a sensor network comprising one or more sensors configured to detect conditions proximate to the robotic vehicle, and a monitoring module configured to monitor data gathered via the sensor network responsive to the robotic vehicle traversing a parcel. The monitoring module may also selectively initiate an alarm or notification function in response to the data indicating the occurrence of a trigger event.

Description

ROBOTIC PATROL VEHICLE
CROSS REFERENCE TO RELATED APPLICATIONS
The present application claims priority to U.S. patent application numbers 62/093,655 filed December 18, 2014 and 62/095,831 filed December 23, 2014, both of which are expressly incorporated by reference in their entirety.
TECHNICAL FIELD
Example embodiments generally relate to robotic vehicles and, more particularly, relate to a robotic vehicle that is configurable to patrol an area and monitor activity within the area.
BACKGROUND
Yard maintenance tasks are commonly performed using various tools and/or machines that are configured for the performance of corresponding specific tasks. Certain tasks, like grass cutting, are typically performed by lawn mowers. Lawn mowers themselves may have many different configurations to support the needs and budgets of consumers. Walk-behind lawn mowers are typically compact, have comparatively small engines and are relatively inexpensive. Meanwhile, at the other end of the spectrum, riding lawn mowers, such as lawn tractors, can be quite large. More recently, robotic mowers and/or remote controlled mowers have also become options for consumers to consider.
Robotic mowers are typically confined to operating on a parcel of land that is bounded by some form of boundary wire. The robotic mower is capable of detecting the boundary wire and operating relatively autonomously within the area defined by the boundary wire. However, the laying of the boundary wire can be a time consuming and difficult task, which operators would prefer to avoid, if possible. That said, to date it has been difficult to try to provide a robotic mower that can truly operate without any need for a boundary wire. Limitations on the accuracy of positioning equipment have played a large role in making this problem difficult to solve.
Additionally, even if it were possible to accurately determine vehicle position, there is currently no comprehensive way to ensure that the robotic vehicle only services the specific areas of a garden or yard that are actually desired for servicing. Given that computing devices are becoming more ubiquitous, it is to be expected that they may be employed to assist in operation of lawn mowers. As such, many additional functionalities may be provided or supported by the employment of computing devices on lawn mowers.
BRIEF SUMMARY OF SOME EXAMPLES
Some example embodiments may therefore provide a robotic vehicle that is configured to incorporate multiple sensors to monitor its environment while on patrol on a particular parcel. The robotic vehicle may be further capable of communicating alarm conditions and/or sharing or storing content generated while on patrol.
Some example embodiments may allow the robotic vehicle to perform surveillance functions in addition to typical yard maintenance functions.
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING(S)
Having thus described the invention in general terms, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:
FIG. 1 illustrates an example operating environment for a robotic mower in accordance with an example embodiment;
FIG. 2 illustrates a block diagram of various components of control circuitry to illustrate some of the components that enable or enhance the functional performance of the robotic mower and to facilitate description of an example embodiment;
FIG. 3 illustrates a block diagram of some network components that may be employed as part of a communication network for patrolling a parcel in accordance with an example embodiment;
FIG. 4 illustrates a control flow diagram showing various operations that may be executed in accordance with an example embodiment;
FIG. 5 illustrates a block diagram of one example of a method of monitoring or patrolling a parcel in accordance with an example embodiment;
FIG. 6 illustrates an example operating environment for a robotic mower;
FIG. 7 illustrates a block diagram of various components of control circuitry to illustrate some of the components that enable or enhance the functional performance of the robotic mower and to facilitate description of an example embodiment;
FIG. 8 illustrates a block diagram of some network components that may be employed as part of a communication network for broadcasting messages in accordance with an example embodiment;
FIG. 9 illustrates a control flow diagram showing various operations that may be executed in accordance with an example embodiment; and FIG. 10 illustrates a block diagram of one example of a method of preparing broadcast messages in accordance with an example embodiment.
DETAILED DESCRIPTION
Some example embodiments now will be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all example embodiments are shown. Indeed, the examples described and pictured herein should not be construed as being limiting as to the scope, applicability or configuration of the present disclosure. Rather, these example embodiments are provided so that this disclosure will satisfy applicable legal requirements. Like reference numerals refer to like elements throughout. Furthermore, as used herein, the term "or" is to be interpreted as a logical operator that results in true whenever one or more of its operands are true. Additionally, the term "yard maintenance" is meant to relate to any outdoor grounds improvement or maintenance related activity and need not specifically apply to activities directly tied to grass, turf or sod care. As used herein, operable coupling should be understood to relate to direct or indirect connection that, in either case, enables functional interconnection of components that are operably coupled to each other.
Robotic mowers, which are one example of a robotic vehicle of an example embodiment, typically mow an area that is defined by a boundary wire that bounds the area to be mowed. The robotic mower then roams within the bounded area to ensure that the entire area is mowed, but the robotic mower does not go outside of the bounded area. The robotic mower typically mows the area in a relatively continuous fashion, pausing only to recharge batteries or when otherwise directed to pause by the operator. Thus, in some ways, the robotic mower may have a near continuous presence in the yard or garden in which it operates.
By placing a number of sensors on the robotic vehicle, the robotic vehicle becomes uniquely capable of performing both its typical work function and surveillance functions. In essence, the robotic vehicle can become a yard or garden sentinel that is ever-alert to situations and events transpiring in and around the yard or garden. Moreover, when fitted with a camera, the robotic vehicle is capable of generating content regarding the yard or garden it is operating within, and such content may be useful for detecting certain activities or aiding investigations into the events surrounding certain occurrences in and around the yard or garden. Example embodiments are therefore described herein to provide various structural and control-related design features that can be employed to improve the capabilities of robotic vehicles (e.g., robotic mowers, mobile sensing devices, watering devices and/or the like) with respect to monitoring and/or recording content related to the encounters and experiences associated with the performance of yard maintenance activities.
FIG. 1 illustrates an example operating environment for a robotic mower 10 that may be employed in connection with an example embodiment. However, it should be appreciated that example embodiments may be employed on numerous other robotic vehicles, so the robotic mower 10 should be recognized as merely one example of such a vehicle. The robotic mower 10 may operate to cut grass on a parcel 20 (i.e., a land lot, yard, or garden), the boundary 30 of which may be defined using one or more physical boundaries (e.g., a fence, wall, curb and/or the like), or programmed location based boundaries or combinations thereof. When the boundary 30 is detected, by any suitable means, the robotic mower 10 may be informed so that it can operate in a manner that prevents the robotic mower 10 from leaving or moving outside the boundary 30. In some cases, the boundary 30 could be provided by a wire that is detectable by the robotic mower 10.
The robotic mower 10 may be controlled, at least in part, via control circuitry 12 located onboard. The control circuitry 12 may include, among other things, a positioning module and a sensor suite, which will be described in greater detail below. Accordingly, the robotic mower 10 may utilize the control circuitry 12 to define a path for coverage of the parcel 20 in terms of performing a task over specified portions or the entire parcel 20. In this regard, the positioning module may be used to guide the robotic mower 10 over the parcel 20 and to ensure that full coverage (of at least predetermined portions of the parcel 20) is obtained, while the sensor suite may detect objects and/or gather data regarding the surroundings of the robotic mower 10 while the parcel 20 is traversed.
If a sensor suite is employed, the sensor suite may include sensors related to positional determination (e.g., a GPS receiver, an accelerometer, a camera, a radar transmitter/detector, an ultrasonic sensor, a laser scanner and/or the like). Thus, for example, positional determinations may be made using GPS, inertial navigation, optical flow, radio navigation, visual location (e.g., VSLAM) and/or other positioning techniques or combinations thereof. Accordingly, the sensors may be used, at least in part, for determining the location of the robotic mower 10 relative to boundaries or other points of interest (e.g., a starting point or other key features) of the parcel 20, or determining a position history or track of the robotic mower 10 over time. The sensors may also detect collision, tipping over, or various fault conditions. In some cases, the sensors may also or alternatively collect data regarding various measurable parameters (e.g., moisture, temperature, soil conditions, etc.) associated with particular locations on the parcel 20.
In an example embodiment, the robotic mower 10 may be battery powered via one or more rechargeable batteries. Accordingly, the robotic mower 10 may be configured to return to a charge station 40 that may be located at some position on the parcel 20 in order to recharge the batteries. The batteries may power a drive system and a blade control system of the robotic mower 10. However, the control circuitry 12 of the robotic mower 10 may selectively control the application of power or other control signals to the drive system and/or the blade control system to direct the operation of the drive system and/or blade control system. Accordingly, movement of the robotic mower 10 over the parcel 20 may be controlled by the control circuitry 12 in a manner that enables the robotic mower 10 to systematically traverse the parcel while operating a cutting blade to cut the grass on the parcel 20. In cases where the robotic vehicle is not a mower, the control circuitry 12 may be configured to control another functional or working assembly that may replace the blade control system and blades.
In some embodiments, the control circuitry 12 and/or a communication node at the charge station 40 may be configured to communicate wirelessly with an electronic device 42 (e.g., a personal computer, a cloud based computer, server, mobile telephone, PDA, tablet, smart phone, and/or the like) of a remote operator 44 (or user) via wireless links 46 associated with a wireless communication network 48. The wireless communication network 48 may provide operable coupling between the remote operator 44 and the robotic mower 10 via the electronic device 42, which may act as a remote control device for the robotic mower 10 or may receive data indicative or related to the operation of the robotic mower 10. However, it should be appreciated that the wireless communication network 48 may include additional or internal components that facilitate the communication links and protocols employed. Thus, some portions of the wireless communication network 48 may employ additional components and connections that may be wired and/or wireless. For example, the charge station 40 may have a wired connection to a computer or server that is connected to the wireless communication network 48, which may then wirelessly connect to the electronic device 42. As another example, the robotic mower 10 may wirelessly connect to the wireless communication network 48 (directly or indirectly) and a wired connection may be established between one or more servers of the wireless communication network 48 and a PC of the remote operator 44. In some embodiments, the wireless communication network 48 may be a data network, such as a local area network (LAN), a metropolitan area network (MAN), a wide area network (WAN) (e.g., the Internet), and/or the like, which may couple the robotic mower 10 to devices such as processing elements (e.g., personal computers, server computers or the like) or databases. 
Accordingly, communication between the wireless communication network 48 and the devices or databases (e.g., servers, electronic device 42, control circuitry 12) may be accomplished by either wireline or wireless communication mechanisms and corresponding protocols.
FIG. 2 illustrates a block diagram of various components of the control circuitry 12 to illustrate some of the components that enable or enhance the functional performance of the robotic mower 10 and to facilitate description of an example embodiment. In some example embodiments, the control circuitry 12 may include or otherwise be in communication with a vehicle positioning module 60, a camera 70, and a monitoring module 80. The vehicle positioning module 60, the camera 70, and the monitoring module 80 may work together to give the robotic mower 10 a comprehensive understanding of its environment, and enable it to be operated autonomously without boundary wires (or within such wires where they are employed).
Any or all of the vehicle positioning module 60, the camera 70, and the monitoring module 80 may be part of a sensor network 90 of the robotic mower 10. However, in some cases, any or all of the vehicle positioning module 60, the camera 70, and the monitoring module 80 may be separate from but otherwise in communication with the sensor network 90 to facilitate operation of each respective module. The camera 70 may include an electronic image sensor configured to store captured image data (e.g., in memory 114). Image data recorded by the camera 70 may be in the visible light spectrum or in other portions of the electromagnetic spectrum (e.g., IR camera). In some cases, the camera 70 may actually include multiple sensors configured to capture data in different types of images (e.g., RGB and IR sensors). The camera 70 may be configured to capture still images and/or video data.
The robotic mower 10 may also include one or more functional components 100 that may be controlled by the control circuitry 12 or otherwise be operated in connection with the operation of the robotic mower 10. The functional components 100 may include a wheel assembly (or other mobility assembly components), one or more cutting blades and corresponding blade control components, and/or other such devices. In embodiments where the robotic vehicle is not a mower, the functional components 100 may include equipment for performing various lawn care functions such as, for example, taking soil samples, operating valves, distributing water, seed, powder, pellets or chemicals, and/or other functional devices and/or components.
The control circuitry 12 may include processing circuitry 110 that may be configured to perform data processing or control function execution and/or other processing and management services according to an example embodiment of the present invention. In some embodiments, the processing circuitry 110 may be embodied as a chip or chip set. In other words, the processing circuitry 110 may comprise one or more physical packages (e.g., chips) including materials, components and/or wires on a structural assembly (e.g., a baseboard). The structural assembly may provide physical strength, conservation of size, and/or limitation of electrical interaction for component circuitry included thereon. The processing circuitry 110 may therefore, in some cases, be configured to implement an embodiment of the present invention on a single chip or as a single "system on a chip." As such, in some cases, a chip or chipset may constitute means for performing one or more operations for providing the functionalities described herein.
In an example embodiment, the processing circuitry 110 may include one or more instances of a processor 112 and memory 114 that may be in communication with or otherwise control a device interface 120 and, in some cases, a user interface 130. As such, the processing circuitry 110 may be embodied as a circuit chip (e.g., an integrated circuit chip) configured (e.g., with hardware, software or a combination of hardware and software) to perform operations described herein. However, in some embodiments, the processing circuitry 110 may be embodied as a portion of an on-board computer. In some embodiments, the processing circuitry 110 may communicate with electronic components and/or sensors of the robotic mower 10 via a single data bus. As such, the data bus may connect to a plurality or all of the switching components, sensory components and/or other electrically controlled components of the robotic mower 10.
The processor 112 may be embodied in a number of different ways. For example, the processor 112 may be embodied as various processing means such as one or more of a microprocessor or other processing element, a coprocessor, a controller or various other computing or processing devices including integrated circuits such as, for example, an ASIC (application specific integrated circuit), an FPGA (field programmable gate array), or the like. In an example embodiment, the processor 112 may be configured to execute instructions stored in the memory 114 or otherwise accessible to the processor 112. As such, whether configured by hardware or by a combination of hardware and software, the processor 112 may represent an entity (e.g., physically embodied in circuitry - in the form of processing circuitry 110) capable of performing operations according to embodiments of the present invention while configured accordingly. Thus, for example, when the processor 112 is embodied as an ASIC, FPGA or the like, the processor 112 may be specifically configured hardware for conducting the operations described herein. Alternatively, as another example, when the processor 112 is embodied as an executor of software instructions, the instructions may specifically configure the processor 112 to perform the operations described herein.
In an example embodiment, the processor 112 (or the processing circuitry 110) may be embodied as, include or otherwise control the vehicle positioning module 60, the camera 70, and the monitoring module 80. As such, in some embodiments, the processor 112 (or the processing circuitry 110) may be said to cause each of the operations described in connection with the vehicle positioning module 60, the camera 70, and the monitoring module 80 by directing the vehicle positioning module 60, the camera 70, and the monitoring module 80, respectively, to undertake the corresponding functionalities responsive to execution of instructions or algorithms configuring the processor 112 (or processing circuitry 110) accordingly. These instructions or algorithms may configure the processing circuitry 110, and thereby also the robotic mower 10, into a tool for driving the corresponding physical components for performing corresponding functions in the physical world in accordance with the instructions provided.
In an exemplary embodiment, the memory 114 may include one or more non-transitory memory devices such as, for example, volatile and/or non-volatile memory that may be either fixed or removable. The memory 114 may be configured to store information, data, applications, instructions or the like for enabling the vehicle positioning module 60 and the camera 70 to carry out various functions in accordance with exemplary embodiments of the present invention. For example, the memory 114 could be configured to buffer input data for processing by the processor 112. Additionally or alternatively, the memory 114 could be configured to store instructions for execution by the processor 112. As yet another alternative, the memory 114 may include one or more databases that may store a variety of data sets responsive to input from various sensors or components of the robotic mower 10. Among the contents of the memory 114, applications may be stored for execution by the processor 112 in order to carry out the functionality associated with each respective application.
The applications may include applications for controlling the robotic mower 10 relative to various operations including determining an accurate position of the robotic mower 10 (e.g., using one or more sensors of the vehicle positioning module 60). Alternatively or additionally, the applications may include applications for controlling the robotic mower 10 relative to various operations including determining the existence and/or position of obstacles (e.g., static or dynamic) and borders relative to which the robotic mower 10 must navigate. Alternatively or additionally, the applications may include applications for controlling the robotic mower 10 relative to various operations to be executed on the parcel 20. Alternatively or additionally, the applications may include applications for controlling the camera 70 and/or processing image data gathered by the camera 70 to execute or facilitate execution of other applications that drive or enhance operation of the robotic mower 10 relative to various activities described herein. In still other examples, the applications may include instructions for gathering images/video or other visual content gathered by the sensor network 90 (and/or the camera 70) for analysis (e.g., by the monitoring module 80) in connection with the performance of monitoring functions described herein. In some cases, the applications may include instructions for analyzing visual content for the generation of alarms, messages, and/or the like (e.g., via a network such as the wireless communication network 48).
The user interface 130 (if implemented) may be in communication with the processing circuitry 110 to receive an indication of a user input at the user interface 130 and/or to provide an audible, visual, mechanical or other output to the user. As such, the user interface 130 may include, for example, a display, one or more buttons or keys (e.g., function buttons), and/or other input/output mechanisms (e.g., microphone, speakers, cursor, joystick, lights and/or the like).
The device interface 120 may include one or more interface mechanisms for enabling communication with other devices either locally or remotely. In some cases, the device interface 120 may be any means such as a device or circuitry embodied in either hardware, or a combination of hardware and software that is configured to receive and/or transmit data from/to sensors or other components in communication with the processing circuitry 110. In some example embodiments, the device interface 120 may provide interfaces for communication of data to/from the control circuitry 12, the vehicle positioning module 60, the camera 70, the monitoring module 80, the sensor network 90, and/or other functional components 100 via wired or wireless communication interfaces in a real-time manner, as a data package downloaded after data gathering or in one or more burst transmissions of any kind.
Each of the vehicle positioning module 60, the camera 70, and the monitoring module 80 may be any means such as a device or circuitry embodied in either hardware, or a combination of hardware and software that is configured to perform the corresponding functions described herein. Thus, the modules may include hardware and/or instructions for execution on hardware (e.g., embedded processing circuitry) that is part of the control circuitry 12 of the robotic mower 10. The modules may share some parts of the hardware and/or instructions that form each module, or they may be distinctly formed. As such, the modules and components thereof are not necessarily intended to be mutually exclusive relative to each other from a compositional perspective.
The vehicle positioning module 60 (or "positioning module") may be configured to utilize one or more sensors (e.g., of the sensor network 90) to determine a location of the robotic mower 10 and direct continued motion of the robotic mower 10 to achieve appropriate coverage of the parcel 20. As such, the robotic mower 10 (or more specifically, the control circuitry 12) may use the location information to determine a mower track and/or provide full coverage of the parcel 20 to ensure the entire parcel is mowed (or otherwise serviced). The vehicle positioning module 60 may therefore be configured to direct movement of the robotic mower 10, including the speed and direction of the robotic mower 10. The vehicle positioning module 60 may also employ such sensors to attempt to determine an accurate current location of the robotic mower 10 on the parcel 20 (or generally).
Various sensors of sensor network 90 of the robotic mower 10 may be included as a portion of, or otherwise communicate with, the vehicle positioning module 60 to, for example, determine vehicle speed/direction, vehicle location, vehicle orientation and/or the like. Sensors may also be used to determine motor run time, machine work time, and other operational parameters. In some embodiments, positioning and/or orientation sensors (e.g., global positioning system (GPS) receiver and/or accelerometer) may be included to monitor, display and/or record data regarding vehicle position and/or orientation as part of the vehicle positioning module 60.
In an example embodiment, the sensor network 90 may provide data to the modules described above to facilitate execution of the functions described above, and/or any other functions that the modules may be configurable to perform. In some cases, the sensor network 90 may include (perhaps among other things) an inertial measurement unit (IMU) 150 and a GPS receiver 152. Other possible sensors may include (but are not limited to) a temperature sensor, an object detector (e.g., time-of-flight ranging devices), a humidity sensor, a barometer, a rain gauge, a grass sensor, and/or the like. Generally speaking, the sensor network 90 may include independent devices with on-board processing that communicate with the processing circuitry 110 of the control circuitry 12 via a single data bus, or via individual communication ports. However, in some cases, one or more of the devices of the sensor network 90 may rely on the processing power of the processing circuitry 110 of the control circuitry 12 for the performance of their respective functions. As such, in some cases, one or more of the sensors of the sensor network 90 (or portions thereof) may be embodied as portions of the positioning module 60, the camera 70 and/or the monitoring module 80.
The IMU 150 may include any combination of one or more of accelerometers, odometers, gyroscopes, magnetometers, compasses, and/or the like. As such, the IMU 150 may be configured to determine velocity, direction, orientation and/or the like so that dead reckoning and/or other inertial navigation determinations can be made by the control circuitry 12. The IMU 150 may be enabled to determine changes in pitch, roll and yaw to further facilitate determining terrain features and/or the like.
Inertial navigation systems may suffer from integration drift over time. Accordingly, inertial navigation systems may require a periodic position correction, which may be accomplished by getting a position fix from another more accurate method or by fixing a position of the robotic mower 10 relative to a known location. For example, navigation conducted via the IMU 150 may be used for robotic mower 10 operation for a period of time, and then a correction may be inserted when a GPS fix is obtained on robotic mower position. As an example alternative, the IMU 150 determined position may be updated every time the robotic mower 10 returns to the charge station 40 (which may be assumed to be at a fixed location). In still other examples, known reference points may be disposed at one or more locations on the parcel 20 and the robotic mower 10 may get a fix relative to any of such known reference points when the opportunity presents itself. The IMU 150 determined position may then be updated with the more accurate fix information.
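The periodic-correction scheme described above can be sketched as follows. The class, its fields, and the simple integration step are illustrative assumptions for the sketch, not elements of any particular embodiment:

```python
import math

class DeadReckoningEstimator:
    """Sketch of IMU dead reckoning with periodic absolute-position
    correction. Drift accumulates in advance(); correct() resets the
    estimate from a more accurate source."""

    def __init__(self, x=0.0, y=0.0, heading=0.0):
        self.x, self.y, self.heading = x, y, heading

    def advance(self, speed, yaw_rate, dt):
        # Integrate IMU-derived speed and yaw rate over one time step;
        # any sensor error compounds here (integration drift).
        self.heading += yaw_rate * dt
        self.x += speed * math.cos(self.heading) * dt
        self.y += speed * math.sin(self.heading) * dt

    def correct(self, fix_x, fix_y):
        # Overwrite the drifted estimate with a fix from a more accurate
        # method, e.g., a GPS fix, the charge station location, or a
        # known reference point on the parcel.
        self.x, self.y = fix_x, fix_y
```

In use, `advance()` would run continuously during traversal while `correct()` runs opportunistically, such as each time the robotic vehicle docks at the charge station.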
In some embodiments, the GPS receiver 152 may be embodied as a real time kinematic (RTK) - GPS receiver. As such, the GPS receiver 152 may employ satellite based positioning in conjunction with GPS, GLONASS, Galileo, GNSS, and/or the like to enhance accuracy of the GPS receiver 152. In some cases, carrier-phase enhancement may be employed such that, for example, in addition to the information content of signals received, the phase of the carrier wave may be examined to provide real-time corrections that can enhance accuracy.
The monitoring module 80 may be configured to receive position information from the positioning module 60 and image (or video) data from the camera 70. In some cases, the monitoring module 80 may also receive data and/or information from other sensors of the sensor network 90. The monitoring module 80 may be further configured to examine and/or analyze the data/information received, which may generally be referred to as monitoring data, for the occurrence of trigger events. The monitoring module 80 may also or alternatively provide for storage of the monitoring data so that the monitoring data can be analyzed and/or retrieved for analysis at a later time, if needed or desired. In some embodiments, the storage of monitoring data may be provided by memory 114. However, remote storage is also possible after communication of the monitoring data to the wireless communication network 48.
In an example embodiment, the monitoring module 80 may be configured in accordance with monitoring instructions. The monitoring instructions may define specific activities, patterns, events and/or the like that qualify as trigger events. In some cases, the monitoring instructions may be factory pre-set. However, as an alternative, the operator may change, add or otherwise modify the monitoring instructions either locally at the robotic mower 10 or via a remote interface (e.g., using the electronic device 42).
In some cases, the monitoring module 80 may store images of previously encountered objects or other objects that have been learned or identified as known objects. When an object is encountered during operation of the robotic mower 10, if the camera 70 is able to obtain a new image of the object, the new image can be compared to the stored images to see if a match can be located. If a match is located, the new image may be classified as the known object. However, the detection of an unknown object may qualify as a trigger event. Thus, for example, if no match can be obtained for an object in an image (i.e., an unknown object), the object may be classified as unknown and the failure to locate a match may be a trigger event.
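The matching step described above might be sketched as follows, assuming each image has already been reduced to a numeric feature vector. The cosine-similarity measure and the match threshold are illustrative assumptions; an actual implementation would use whatever image-matching technique the monitoring module employs:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two feature vectors (1.0 = identical direction).
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def classify_detection(new_features, known_objects, threshold=0.9):
    # Compare the new image's features against each stored known object;
    # returns (label, is_trigger_event). Failure to locate a match is
    # itself the trigger event.
    best_label, best_score = None, 0.0
    for label, features in known_objects.items():
        score = cosine_similarity(new_features, features)
        if score > best_score:
            best_label, best_score = label, score
    if best_score >= threshold:
        return best_label, False   # known object: no trigger event
    return "unknown", True         # unknown object: trigger event
```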
In some cases, detection of dynamic (rather than fixed or static) objects may be a trigger event. Thus, for example, if the camera 70 or any other sensor is able to detect an object that appears to be moving, the detection may be classified as a trigger event. In still other examples, detection of objects at certain times or via certain sensors may be trigger events. For example, since IR cameras may enable surveillance to be conducted at night, it may be possible to have any detections of objects (static or dynamic) at night or via the IR camera to be classified as trigger events. In some examples, dynamic objects may be classified as or otherwise correlate to living objects, and the detection of living objects may be a trigger event. Meanwhile, detection of static objects (non-living objects, permanent objects, and/or the like) may not be a trigger event. However, it should be appreciated that detection and classification of living and non-living objects could also be accomplished using the camera 70 to find matching images.
In some embodiments, detections associated with certain times and/or certain locations may be trigger events. For example, detecting an object proximate to the house or windows of the house, or by a back door, at night or other specified times may be a trigger event. Still other combinations of detections relating to specific object locations, times, sensors, or other factors may be separately designated as trigger events by the monitoring instructions. As but one more example, a thermal imaging camera may detect living things, fire or overheated objects based on temperature. However, fire and some other significant events could be detected without a thermal camera as well. In any case, if such events are detected, they may be designated as trigger events by the monitoring instructions.
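One possible encoding of the combined time-and-location criteria described above is sketched below. The zone names and hour ranges are illustrative assumptions; the monitoring instructions would supply the actual rules:

```python
def in_window(hour, start, end):
    # True if hour falls in [start, end), handling overnight windows
    # such as 22:00-06:00.
    if start <= end:
        return start <= hour < end
    return hour >= start or hour < end

def detection_is_trigger(zone, hour, rules):
    # Each rule pairs a sensitive zone with the hours during which a
    # detection there qualifies as a trigger event.
    return any(zone == r_zone and in_window(hour, r_start, r_end)
               for r_zone, r_start, r_end in rules)
```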
The monitoring instructions may not only define trigger events, but may also define the response or responses that are to be initiated in response to a trigger event. Accordingly, the monitoring instructions may define trigger responses to correspond to trigger events, either generally or in specific terms. Thus, for example, a general response such as "initiate alarm" or any other suitable response may be defined for all trigger events. However, in other cases, more targeted responses may be defined for individual trigger events. For example, if an animal is detected that is to be scared away (e.g., a garden pest), then a local audible alarm at the robotic mower 10 may be initiated. Similarly, if someone or something is interfering with robotic mower operation, a local alarm may be generated. However, if an intruder or trespasser is detected, it may be desirable to initiate a local alarm, or to initiate a remote alarm or notification (e.g., to the owner, to the police, or to other interested parties) with or without a local audible alarm.
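A minimal sketch of such a mapping from trigger events to defined responses, with a general fallback response, might look as follows. The event names and response identifiers are assumptions for illustration only:

```python
# Targeted responses for specific trigger events (illustrative values).
TRIGGER_RESPONSES = {
    "garden_pest": ["local_alarm"],
    "mower_interference": ["local_alarm"],
    "intruder": ["local_alarm", "remote_notification"],
}

# General response applied when no targeted response is defined.
GENERAL_RESPONSE = ["initiate_alarm"]

def responses_for(trigger_event):
    # Look up the targeted responses, falling back to the general one.
    return TRIGGER_RESPONSES.get(trigger_event, GENERAL_RESPONSE)
```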
If an alarm activation or initiation on a local level is a desirable response, the monitoring module 80 may include an alarm 82. The alarm 82 may be an audible local alarm such as a whistle, beep, siren and/or the like. Moreover, in some cases, the alarm 82 may include lights, vibration or other augmentation. If the alarm activation or initiation is instead to be provided remotely, the response to the trigger event may be a notification to trigger a remote alarm at the operator's house or business, at the police or fire station, at an alarm monitoring service, and/or the like.
Accordingly, in an example embodiment, the monitoring module 80 may be configured to identify activity based on monitoring data (e.g., image data, events or activities, and/or the like) about which to generate notifications, messages and/or alarms either locally or for transmission to corresponding defined entities or locations.
Although the monitoring module 80 may be instantiated at the robotic mower 10 in some cases, alternative embodiments may instantiate the monitoring module 80 in the "cloud." Thus, for example, as shown in FIG. 3, the robotic mower 10 may be configured with a transceiver or other communication equipment to enable the robotic mower 10 to communicate data related to the environment being monitored (e.g., the parcel 20) either directly or indirectly to the wireless communication network 48. A server 82 of or in communication with the wireless communication network 48 may host the monitoring module 80, which may operate as described above except that the monitoring module 80 is not locally operating at the robotic mower 10. Notifications may then be transmitted from the monitoring module 80 to the electronic device 42 (which may be associated with the operator or with any of a number of first responders or other entities) via the wireless communication network 48. Notifications may be formatted as SMS messages, emails, or other proprietary formatted messages. In some cases, the notifications may be alarm signals, phone calls, or any other message capable of communicating the corresponding information regarding an event occurrence and, in some cases, content (e.g., image and/or video data) corresponding to the event occurrence (i.e., the event that acted as the trigger event).
In some embodiments, when an instance of the monitoring module 80 is embodied at the server 82, experiences from a plurality of robotic mowers in various different locations may be monitored via the monitoring module 80. The monitoring module 80 may therefore monitor activity at a plurality of parcels and provide security or monitoring services for each such parcel. Each parcel may have a known address, and therefore, if an alarm or notification is generated, such alarm or notification may be provided in reference to the corresponding parcel (e.g., by address), and/or may be provided to one or more individuals or entities associated (e.g., by registration to a service) with the parcel. By incorporating the sensor network 90 and the modules described above, the robotic mower 10 may be enabled to facilitate the monitoring of the environment of a robotic mower 10 while engaged in the mowing experience or other yard maintenance activities.
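A registry keyed by parcel address, as described above, might be sketched as follows. The address and recipient values are illustrative placeholders, not part of any embodiment:

```python
# Illustrative registry: each monitored parcel maps to the recipients
# registered (e.g., by subscription to a service) for that parcel.
PARCEL_REGISTRY = {
    "12 Garden Lane": ["owner@example.com", "alarm-service"],
}

def notifications_for(address, event):
    # Build one notification per registered recipient, referencing the
    # parcel by its address so the alarm identifies the right property.
    recipients = PARCEL_REGISTRY.get(address, [])
    return [(r, f"{address}: {event}") for r in recipients]
```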
Embodiments of the present invention may therefore be practiced using an apparatus such as the one described in reference to FIGS. 1-3. However, it should also be appreciated that some embodiments (or aspects thereof) may be practiced in connection with a computer program product for performing embodiments of the present invention. As such, for example, each block or step of the flowchart of FIGS. 4 and 5, and combinations of blocks in the flowcharts, may be implemented by various means, such as hardware, firmware, processor, circuitry and/or another device associated with execution of software including one or more computer program instructions. Thus, for example, one or more of the procedures described above may be embodied by computer program instructions, which may embody the procedures described above and may be stored by a storage device (e.g., memory 114) and executed by processing circuitry (e.g., processor 112). As will be appreciated, any such stored computer program instructions may be loaded onto a computer or other programmable apparatus (i.e., hardware) to produce a machine, such that the instructions which execute on the computer or other programmable apparatus implement the functions specified in the flowchart block(s) or step(s). These computer program instructions may also be stored in a computer-readable medium comprising memory that may direct a computer or other programmable apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instructions to implement the function specified in the flowchart block(s) or step(s).
The computer program instructions may also be loaded onto a computer or other programmable apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block(s) or step(s).
FIG. 4 illustrates a control flow diagram of one example of how the robotic mower 10 can be operated in relation to using the sensors thereon to generate content for potential broadcast in accordance with an example embodiment. As shown in FIG. 4, operation may begin with traversal of the parcel at operation 400. During the traversal, the sensor network may gather data related to the traversal (or mowing) experiences at operation 402. A determination may be made based on the data gathered as to whether any trigger event has occurred at operation 404. If there are no trigger events, then flow may cycle back to the traversal and data gathering operations 400 and 402, which may continue. If a trigger event has occurred at operation 404, then a determination may be made as to whether the trigger event corresponds to an alarm condition at operation 406. If an alarm condition has occurred, then a local or remote alarm may be sounded at operation 408, and flow may return to operation 400. If no alarm condition has occurred, then a determination may be made as to whether a notification is required for the trigger event at operation 410. If a notification is required, then a message or other notification may be sent at operation 412 and flow may return to operation 400. If notification is not required, the information associated with the monitoring may be stored at operation 414 and flow may return to operation 400. In some cases, notifications or alarm conditions may not be triggered by a single occurrence. Thus, the storage of information at operation 414 may enable later trigger event determinations and/or alarm/notification decisions to be made in light of both current and past events.
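One pass through the control flow of FIG. 4 can be sketched as a single cycle. The callables are injected as parameters because the actual sensor, alarm, and messaging interfaces are not specified here; the operation numbers in the comments refer to FIG. 4:

```python
def patrol_cycle(gather_data, is_trigger, is_alarm, needs_notification,
                 sound_alarm, send_notification, store):
    # One pass through the FIG. 4 control flow (operations 400-414).
    data = gather_data()                 # traverse and gather data (400, 402)
    if not is_trigger(data):             # trigger event occurred? (404)
        return "continue"                # cycle back to 400/402
    if is_alarm(data):                   # alarm condition? (406)
        sound_alarm(data)                # sound local or remote alarm (408)
        return "alarm"
    if needs_notification(data):         # notification required? (410)
        send_notification(data)          # send message or notification (412)
        return "notified"
    store(data)                          # store monitoring information (414)
    return "stored"
```

The caller would invoke `patrol_cycle` repeatedly during traversal, mirroring the loop back to operation 400 in the figure.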
Of note, the processes above may incorporate all of position determining, data gathering and message generation/communication, which can be accomplished based on the inclusion of the sensor network 90 and the modules described above. As such, in some cases, the robotic mower 10 may generally operate in accordance with a control method that combines the modules described above to provide a functionally robust robotic vehicle. In this regard, a method according to example embodiments of the invention may include any or all of the operations shown in FIG. 5. Moreover, other methods derived from the descriptions provided herein may also be performed responsive to execution of steps associated with such methods by a computer programmed to be transformed into a machine specifically configured to perform such methods.
In an example embodiment, a method for monitoring a parcel based on operation of a robotic vehicle (e.g., a mower or watering device), as shown in FIG. 5, may include monitoring data gathered responsive to a robotic vehicle traversing a parcel at operation 500, determining whether the monitored data is indicative of a qualifying event at operation 510, and selectively initiating an alarm or notification function in response to the data indicating occurrence of a trigger event at operation 520. In some cases, the method may include additional optional operations, an example of which is shown in dashed lines in FIG. 5. In this regard, the method may further include receiving monitoring instructions defining one or more trigger events at operation 530.
Furthermore, the operations 500-530 may also be modified, augmented or amplified in some cases. For example, in some embodiments, the qualifying event may be defined based on configuration settings that are operator adjustable. In an example embodiment, receiving the monitoring instructions may include receiving the monitoring instructions responsive to operator adjustment either locally at the robotic vehicle or via a remote interface. In an example embodiment, receiving the monitoring instructions may include receiving instructions defining specific activities, patterns, or events that qualify as trigger events. In some cases, one of the sensors of the sensor network may be a camera. In such an example, the monitoring module may be configured, responsive to detection of an object in an image captured via the camera, to compare the image to a plurality of stored images to determine if a match for the object can be located. The trigger event may occur based at least in part on whether the match is located. In some embodiments, the trigger event may occur responsive to detection of an object at a predefined time or location, or via a predetermined sensor. In an example embodiment, receiving the monitoring instructions may include receiving instructions defining at least one response to be initiated in response to the trigger event. In some cases, the monitoring instructions may initiate a local or remote alarm in response to the trigger event. In an example embodiment, data gathered may also be stored to determine if a repeated number or pattern of events has occurred, which corresponds to the trigger event.
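The repeated-occurrence check mentioned above might be sketched as follows; the threshold value is an illustrative assumption:

```python
from collections import Counter

def repeated_event_trigger(event_log, event_type, threshold=3):
    # A single detection may not warrant a response, but a repeated
    # number of the same stored event may itself constitute the trigger
    # event. Counter returns 0 for event types never seen in the log.
    return Counter(event_log)[event_type] >= threshold
```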
In an example embodiment, an apparatus for performing the method of FIGS. 4 and 5 above may comprise a processor (e.g., the processor 112) configured to perform some or each of the operations (400-530) described above. The processor 112 may, for example, be configured to perform the operations (400-530) by performing hardware implemented logical functions, executing stored instructions, or executing algorithms for performing each of the operations. Alternatively, the apparatus may comprise means for performing each of the operations described above. In this regard, according to an example embodiment, examples of means for performing operations 400-530 may comprise, for example, the control circuitry 12. Additionally or alternatively, at least by virtue of the fact that the processor 112 may be configured to control or even be embodied as the control circuitry 12, the processor 112 and/or a device or circuitry for executing instructions or executing an algorithm for processing information as described above may also form example means for performing operations 400-530.
Some example embodiments may therefore provide a robotic vehicle that is configured to incorporate multiple sensors to experience various aspects of its environment while performing a task on a particular parcel. The robotic vehicle may be further capable of broadcasting information about its experiences.
Some example embodiments may improve the ability of operators to interact with the robotic vehicle in a unique and interesting way.
Robotic mowers, which are one example of a robotic vehicle of an example embodiment, typically mow an area that is defined by a boundary wire that bounds the area to be mowed. The robotic mower then roams within the bounded area to ensure that the entire area is mowed, but the robotic mower does not go outside of the bounded area.
By placing a number of sensors on the robotic vehicle, the robotic vehicle becomes uniquely capable of generating interesting content regarding the yard or garden it is operating within. Example embodiments are therefore described herein to provide various structural and control-related design features that can be employed to improve the capabilities of robotic vehicles (e.g., robotic mowers, mobile sensing devices, watering devices and/or the like) with respect to broadcasting content related to the encounters and experiences associated with yard maintenance activities.
FIG. 6 illustrates an example operating environment for a robotic mower 1010 that may be employed in connection with an example embodiment. However, it should be appreciated that example embodiments may be employed on numerous other robotic vehicles, so the robotic mower 1010 should be recognized as merely one example of such a vehicle. The robotic mower 1010 may operate to cut grass on a parcel 1020 (i.e., a land lot, yard, or garden), the boundary 1030 of which may be defined using one or more physical boundaries (e.g., a fence, wall, curb and/or the like), or programmed location based boundaries or combinations thereof. When the boundary 1030 is detected, by any suitable means, the robotic mower 1010 may be informed so that it can operate in a manner that prevents the robotic mower 1010 from leaving or moving outside the boundary 1030. In some cases, the boundary 1030 could be provided by a wire that is detectable by the robotic mower 1010.
The robotic mower 1010 may be controlled, at least in part, via control circuitry 1012 located onboard. The control circuitry 1012 may include, among other things, a positioning module and a sensor suite, which will be described in greater detail below. Accordingly, the robotic mower 1010 may utilize the control circuitry 1012 to define a path for coverage of the parcel 1020 in terms of performing a task over specified portions or the entire parcel 1020. In this regard, the positioning module may be used to guide the robotic mower 1010 over the parcel 1020 and to ensure that full coverage (of at least predetermined portions of the parcel 1020) is obtained, while the sensor suite may detect objects and/or gather data regarding the surroundings of the robotic mower 1010 while the parcel 1020 is traversed.
If a sensor suite is employed, the sensor suite may include sensors related to positional determination (e.g., a GPS receiver, an accelerometer, a camera, a radar transmitter/detector, an ultrasonic sensor, a laser scanner and/or the like). Thus, for example, positional determinations may be made using GPS, inertial navigation, optical flow, radio navigation, visual location (e.g., VSLAM) and/or other positioning techniques or combinations thereof. Accordingly, the sensors may be used, at least in part, for determining the location of the robotic mower 1010 relative to boundaries or other points of interest (e.g., a starting point or other key features) of the parcel 1020, or determining a position history or track of the robotic mower 1010 over time. The sensors may also detect collision, tipping over, or various fault conditions. In some cases, the sensors may also or alternatively collect data regarding various measurable parameters (e.g., moisture, temperature, soil conditions, etc.) associated with particular locations on the parcel 1020.
In an example embodiment, the robotic mower 1010 may be battery powered via one or more rechargeable batteries. Accordingly, the robotic mower 1010 may be configured to return to a charge station 1040 that may be located at some position on the parcel 1020 in order to recharge the batteries. The batteries may power a drive system and a blade control system of the robotic mower 1010. However, the control circuitry 1012 of the robotic mower 1010 may selectively control the application of power or other control signals to the drive system and/or the blade control system to direct the operation of the drive system and/or blade control system. Accordingly, movement of the robotic mower 1010 over the parcel 1020 may be controlled by the control circuitry 1012 in a manner that enables the robotic mower 1010 to systematically traverse the parcel while operating a cutting blade to cut the grass on the parcel 1020. In cases where the robotic vehicle is not a mower, the control circuitry 1012 may be configured to control another functional or working assembly that may replace the blade control system and blades.
In some embodiments, the control circuitry 1012 and/or a communication node at the charge station 1040 may be configured to communicate wirelessly with an electronic device 1042 (e.g., a personal computer, a cloud based computer, server, mobile telephone, PDA, tablet, smart phone, and/or the like) of a remote operator 1044 (or user) via wireless links 1046 associated with a wireless communication network 1048. The wireless communication network 1048 may provide operable coupling between the remote operator 1044 and the robotic mower 1010 via the electronic device 1042, which may act as a remote control device for the robotic mower 1010 or may receive data indicative of or related to the operation of the robotic mower 1010. However, it should be appreciated that the wireless communication network 1048 may include additional or internal components that facilitate the communication links and protocols employed. Thus, some portions of the wireless communication network 1048 may employ additional components and connections that may be wired and/or wireless. For example, the charge station 1040 may have a wired connection to a computer or server that is connected to the wireless communication network 1048, which may then wirelessly connect to the electronic device 1042. As another example, the robotic mower 1010 may wirelessly connect to the wireless communication network 1048 (directly or indirectly) and a wired connection may be established between one or more servers of the wireless communication network 1048 and a PC of the remote operator 1044. In some embodiments, the wireless communication network 1048 may be a data network, such as a local area network (LAN), a metropolitan area network (MAN), a wide area network (WAN) (e.g., the Internet), and/or the like, which may couple the robotic mower 1010 to devices such as processing elements (e.g., personal computers, server computers or the like) or databases.
Accordingly, communication between the wireless communication network 1048 and the devices or databases (e.g., servers, electronic device 1042, control circuitry 1012) may be accomplished by either wireline or wireless communication mechanisms and corresponding protocols.
FIG. 7 illustrates a block diagram of various components of the control circuitry 1012 to illustrate some of the components that enable or enhance the functional performance of the robotic mower 1010 and to facilitate description of an example embodiment. In some example embodiments, the control circuitry 1012 may include or otherwise be in communication with a vehicle positioning module 1060, a camera 1070, and a broadcast module 1080. The vehicle positioning module 1060, the camera 1070, and the broadcast module 1080 may work together to give the robotic mower 1010 a comprehensive understanding of its environment, and enable it to be operated autonomously without boundary wires (or within such wires where they are employed).
Any or all of the vehicle positioning module 1060, the camera 1070, and the broadcast module 1080 may be part of a sensor network 1090 of the robotic mower 1010. However, in some cases, any or all of the vehicle positioning module 1060, the camera 1070, and the broadcast module 1080 may be separate from but otherwise in communication with the sensor network 1090 to facilitate operation of each respective module. The camera 1070 may include an electronic image sensor configured to store captured image data (e.g., in memory 1114). Image data recorded by the camera 1070 may be in the visible light spectrum or in other portions of the electromagnetic spectrum (e.g., IR camera). In some cases, the camera 1070 may actually include multiple sensors configured to capture different types of image data (e.g., RGB and IR sensors). The camera 1070 may be configured to capture still images and/or video data.
The robotic mower 1010 may also include one or more functional components 1100 that may be controlled by the control circuitry 1012 or otherwise be operated in connection with the operation of the robotic mower 1010. The functional components 1100 may include a wheel assembly (or other mobility assembly components), one or more cutting blades and corresponding blade control components, and/or other such devices. In embodiments where the robotic vehicle is not a mower, the functional components 1100 may include equipment for performing various lawn care functions such as, for example, taking soil samples, operating valves, distributing water, seed, powder, pellets or chemicals, and/or other functional devices and/or components.
The control circuitry 1012 may include processing circuitry 1110 that may be configured to perform data processing or control function execution and/or other processing and management services according to an example embodiment of the present invention. In some embodiments, the processing circuitry 1110 may be embodied as a chip or chip set. In other words, the processing circuitry 1110 may comprise one or more physical packages (e.g., chips) including materials, components and/or wires on a structural assembly (e.g., a baseboard). The structural assembly may provide physical strength, conservation of size, and/or limitation of electrical interaction for component circuitry included thereon. The processing circuitry 1110 may therefore, in some cases, be configured to implement an embodiment of the present invention on a single chip or as a single "system on a chip." As such, in some cases, a chip or chipset may constitute means for performing one or more operations for providing the functionalities described herein.
In an example embodiment, the processing circuitry 1110 may include one or more instances of a processor 1112 and memory 1114 that may be in communication with or otherwise control a device interface 1120 and, in some cases, a user interface 1130. As such, the processing circuitry 1110 may be embodied as a circuit chip (e.g., an integrated circuit chip) configured (e.g., with hardware, software or a combination of hardware and software) to perform operations described herein. However, in some embodiments, the processing circuitry 1110 may be embodied as a portion of an on-board computer. In some embodiments, the processing circuitry 1110 may communicate with electronic components and/or sensors of the robotic mower 1010 via a single data bus. As such, the data bus may connect to a plurality or all of the switching components, sensory components and/or other electrically controlled components of the robotic mower 1010.
The processor 1112 may be embodied in a number of different ways. For example, the processor 1112 may be embodied as various processing means such as one or more of a microprocessor or other processing element, a coprocessor, a controller or various other computing or processing devices including integrated circuits such as, for example, an ASIC (application specific integrated circuit), an FPGA (field programmable gate array), or the like. In an example embodiment, the processor 1112 may be configured to execute instructions stored in the memory 1114 or otherwise accessible to the processor 1112. As such, whether configured by hardware or by a combination of hardware and software, the processor 1112 may represent an entity (e.g., physically embodied in circuitry - in the form of processing circuitry 1110) capable of performing operations according to embodiments of the present invention while configured accordingly. Thus, for example, when the processor 1112 is embodied as an ASIC, FPGA or the like, the processor 1112 may be specifically configured hardware for conducting the operations described herein. Alternatively, as another example, when the processor 1112 is embodied as an executor of software instructions, the instructions may specifically configure the processor 1112 to perform the operations described herein.
In an example embodiment, the processor 1112 (or the processing circuitry 1110) may be embodied as, include or otherwise control the vehicle positioning module 1060, the camera 1070, and the broadcast module 1080. As such, in some embodiments, the processor 1112 (or the processing circuitry 1110) may be said to cause each of the operations described in connection with the vehicle positioning module 1060, the camera 1070, and the broadcast module 1080 by directing the vehicle positioning module 1060, the camera 1070, and the broadcast module 1080, respectively, to undertake the corresponding functionalities responsive to execution of instructions or algorithms configuring the processor 1112 (or processing circuitry 1110) accordingly. These instructions or algorithms may configure the processing circuitry 1110, and thereby also the robotic mower 1010, into a tool for driving the corresponding physical components for performing corresponding functions in the physical world in accordance with the instructions provided.
In an exemplary embodiment, the memory 1114 may include one or more non-transitory memory devices such as, for example, volatile and/or non-volatile memory that may be either fixed or removable. The memory 1114 may be configured to store information, data, applications, instructions or the like for enabling the vehicle positioning module 1060 and the camera 1070 to carry out various functions in accordance with exemplary embodiments of the present invention. For example, the memory 1114 could be configured to buffer input data for processing by the processor 1112. Additionally or alternatively, the memory 1114 could be configured to store instructions for execution by the processor 1112. As yet another alternative, the memory 1114 may include one or more databases that may store a variety of data sets responsive to input from various sensors or components of the robotic mower 1010. Among the contents of the memory 1114, applications may be stored for execution by the processor 1112 in order to carry out the functionality associated with each respective application.
The applications may include applications for controlling the robotic mower 1010 relative to various operations including determining an accurate position of the robotic mower 1010 (e.g., using one or more sensors of the vehicle positioning module 1060). Alternatively or additionally, the applications may include applications for controlling the robotic mower 1010 relative to various operations including determining the existence and/or position of obstacles (e.g., static or dynamic) and borders relative to which the robotic mower 1010 must navigate. Alternatively or additionally, the applications may include applications for controlling the robotic mower 1010 relative to various operations to be executed on the parcel 1020. Alternatively or additionally, the applications may include applications for controlling the camera 1070 and/or processing image data gathered by the camera 1070 to execute or facilitate execution of other applications that drive or enhance operation of the robotic mower 1010 relative to various activities described herein. In still other examples, the applications may include instructions for selecting content gathered by the sensor network 1090 (and/or the camera 1070) for publication or broadcast via a network (e.g., wireless communication network 1048).
The user interface 1130 (if implemented) may be in communication with the processing circuitry 1110 to receive an indication of a user input at the user interface 1130 and/or to provide an audible, visual, mechanical or other output to the user. As such, the user interface 1130 may include, for example, a display, one or more buttons or keys (e.g., function buttons), and/or other input/output mechanisms (e.g., microphone, speakers, cursor, joystick, lights and/or the like).
The device interface 1120 may include one or more interface mechanisms for enabling communication with other devices either locally or remotely. In some cases, the device interface 1120 may be any means such as a device or circuitry embodied in either hardware, or a combination of hardware and software that is configured to receive and/or transmit data from/to sensors or other components in communication with the processing circuitry 1110. In some example embodiments, the device interface 1120 may provide interfaces for communication of data to/from the control circuitry 1012, the vehicle positioning module 1060, the camera 1070, the broadcast module 1080, the sensor network 1090, and/or other functional components 1100 via wired or wireless communication interfaces in a real-time manner, as a data package downloaded after data gathering, or in one or more burst transmissions of any kind.
Each of the vehicle positioning module 1060, the camera 1070, and the broadcast module 1080 may be any means such as a device or circuitry embodied in either hardware, or a combination of hardware and software that is configured to perform the corresponding functions described herein. Thus, the modules may include hardware and/or instructions for execution on hardware (e.g., embedded processing circuitry) that is part of the control circuitry 1012 of the robotic mower 1010. The modules may share some parts of the hardware and/or instructions that form each module, or they may be distinctly formed. As such, the modules and components thereof are not necessarily intended to be mutually exclusive relative to each other from a compositional perspective.
The vehicle positioning module 1060 (or "positioning module") may be configured to utilize one or more sensors (e.g., of the sensor network 1090) to determine a location of the robotic mower 1010 and direct continued motion of the robotic mower 1010 to achieve appropriate coverage of the parcel 1020. As such, the robotic mower 1010 (or more specifically, the control circuitry 1012) may use the location information to determine a mower track and/or provide full coverage of the parcel 1020 to ensure the entire parcel is mowed (or otherwise serviced). The vehicle positioning module 1060 may therefore be configured to direct movement of the robotic mower 1010, including the speed and direction of the robotic mower 1010. The vehicle positioning module 1060 may also employ such sensors to attempt to determine an accurate current location of the robotic mower 1010 on the parcel 1020 (or generally).
Various sensors of sensor network 1090 of the robotic mower 1010 may be included as a portion of, or otherwise communicate with, the vehicle positioning module 1060 to, for example, determine vehicle speed/direction, vehicle location, vehicle orientation and/or the like. Sensors may also be used to determine motor run time, machine work time, and other operational parameters. In some embodiments, positioning and/or orientation sensors (e.g., global positioning system (GPS) receiver and/or accelerometer) may be included to monitor, display and/or record data regarding vehicle position and/or orientation as part of the vehicle positioning module 1060.
In an example embodiment, the sensor network 1090 may provide data to the modules described above to facilitate execution of the functions described above, and/or any other functions that the modules may be configurable to perform. In some cases, the sensor network 1090 may include (perhaps among other things) an inertial measurement unit (IMU) 1150 and a GPS receiver 1152. Other possible sensors may include (but are not limited to) a temperature sensor, humidity sensor, barometer, rain gauge, grass sensor, and/or the like. Generally speaking, the sensor network 1090 may include independent devices with on-board processing that communicate with the processing circuitry 1110 of the control circuitry 1012 via a single data bus, or via individual communication ports. However, in some cases, one or more of the devices of the sensor network 1090 may rely on the processing power of the processing circuitry 1110 of the control circuitry 1012 for the performance of their respective functions. As such, in some cases, one or more of the sensors of the sensor network 1090 (or portions thereof) may be embodied as portions of the positioning module 1060, the camera 1070 and/or the broadcast module 1080.
The IMU 1150 may include one or more of, or any combination of, accelerometers, odometers, gyroscopes, magnetometers, compasses, and/or the like. As such, the IMU 1150 may be configured to determine velocity, direction, orientation and/or the like so that dead reckoning and/or other inertial navigation determinations can be made by the control circuitry 1012. The IMU 1150 may be enabled to determine changes in pitch, roll and yaw to further facilitate determining terrain features and/or the like.
Inertial navigation systems may suffer from integration drift over time. Accordingly, inertial navigation systems may require a periodic position correction, which may be accomplished by getting a position fix from another more accurate method or by fixing a position of the robotic mower 1010 relative to a known location. For example, navigation conducted via the IMU 1150 may be used for robotic mower 1010 operation for a period of time, and then a correction may be inserted when a GPS fix is obtained on robotic mower position. As an example alternative, the IMU 1150 determined position may be updated every time the robotic mower 1010 returns to the charge station 1040 (which may be assumed to be at a fixed location). In still other examples, known reference points may be disposed at one or more locations on the parcel 1020 and the robotic mower 1010 may get a fix relative to any of such known reference points when the opportunity presents itself. The IMU 1150 determined position may then be updated with the more accurate fix information.
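The correction scheme described above can be sketched in simplified form, purely by way of illustration: an estimated position drifts as velocity is integrated, and a periodic fix (e.g., a GPS fix or arrival at the charge station 1040, assumed to be at a known location) replaces the drifted estimate. The class and method names below are hypothetical:

```python
import math

class DeadReckoner:
    """Sketch of IMU-style dead reckoning with periodic position fixes.
    Integration drift accumulates in propagate(); apply_fix() resets the
    estimate using a more accurate external position."""
    def __init__(self, x=0.0, y=0.0):
        self.x, self.y = x, y

    def propagate(self, speed, heading_rad, dt):
        # Integrate velocity over dt; integration error accumulates here.
        self.x += speed * math.cos(heading_rad) * dt
        self.y += speed * math.sin(heading_rad) * dt

    def apply_fix(self, fix_x, fix_y):
        # Replace the drifted estimate with the more accurate fix
        # (e.g., GPS, a known reference point, or the charge station).
        self.x, self.y = fix_x, fix_y
```

In practice a filter (e.g., a Kalman filter) would blend the fix with the inertial estimate rather than simply overwriting it; the overwrite above is the simplest possible correction.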
In some embodiments, the GPS receiver 1152 may be embodied as a real time kinematic (RTK) - GPS receiver. As such, the GPS receiver 1152 may employ satellite based positioning in conjunction with GPS, GLONASS, Galileo, GNSS, and/or the like to enhance accuracy of the GPS receiver 1152. In some cases, carrier-phase enhancement may be employed such that, for example, in addition to the information content of signals received, the phase of the carrier wave may be examined to provide real-time corrections that can enhance accuracy.
In some cases, the robotic mower 1010 may store images of previously encountered objects or other objects that have been learned or identified as known objects. When an object is encountered during operation of the robotic mower 1010, if the camera 1070 is able to obtain a new image of the object, the new image can be compared to the stored images to see if a match can be located. If a match is located, the new image may be classified as the known object. In some cases, the identity and image of the known object may be provided to the broadcast module 1080 for processing as described herein.
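As one hedged illustration of the comparison step above, a new image could be reduced to a fixed-length feature descriptor and compared against descriptors of the stored known objects. The descriptor extraction itself (e.g., from frames of the camera 1070) is outside this sketch, and the function and threshold names are hypothetical:

```python
def match_known_object(new_descriptor, stored, max_distance=0.25):
    """Sketch: compare a descriptor of a newly captured image against
    descriptors of stored/known objects; return the best-matching label,
    or None if no stored object is close enough."""
    best_label, best_dist = None, float("inf")
    for label, ref in stored.items():
        # Euclidean distance between fixed-length feature vectors.
        dist = sum((a - b) ** 2 for a, b in zip(new_descriptor, ref)) ** 0.5
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label if best_dist <= max_distance else None
```

A returned label would classify the new image as that known object; None would leave the object unknown (which may itself be a qualifying event, as discussed below in connection with configuration settings).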
The position information gathered by the positioning module 1060 and images obtained by the camera 1070, along with any other data or information obtained via any sensor of the sensor network 1090 may form content that can form the basis of broadcast transmissions that may be arranged and/or managed by the broadcast module 1080. Moreover, in some cases, data associated with any aspect of the performance of the robotic mower 1010 or the modules/sensors thereof may form the basis for content that can be formed and utilized for broadcast transmissions arranged and/or managed by the broadcast module 1080. For example, position information such as information indicative of terrain features (e.g., bumps, hills, edges, etc.) that are detectable by the IMU 1150, may form information that can be the basis of broadcast transmissions. Images of, or records relating to encounters with, objects (known or unknown) may also form the basis for broadcast transmissions. Event records and/or activities performed may also form the basis for broadcast transmissions. Other things might also form the basis for such transmissions.
In any case, the broadcast module 1080 may be configured to identify content (e.g., images, events, activities, data, and/or the like) about which to generate messages and the messages may then be selectively transmitted in a broadcast fashion. In an example embodiment, the broadcast module 1080 may be configured to generate a message in response to each qualifying event that occurs. The broadcast module 1080 may also be configured to either send the message as a broadcast message or present the message to the operator for the operator to modify and/or release as the broadcast message. In an example embodiment, the broadcast module 1080 may receive configuration settings from the operator to direct the broadcast module 1080 as to, for example, what events are qualifying events, how and when to prepare messages, and how and when to send such messages or present them to the operator for sending.
In an example embodiment, the configuration settings may provide information identifying qualifying events. In some cases, qualifying events may be specific images, events, activities, data, and/or the like, that have been designated to trigger message generation. For example, in some cases, the capture of an image with subsequent identification of a known object in the image may be designated as a qualifying event. In some examples, the capture of any image, or an image with an unknown object, may be designated as a qualifying event. In still other examples, images of specific objects or individuals may be designated as a qualifying event. In some embodiments, capture of an image in association with another event or activity may be a qualifying event. Specific events associated with other sensors (e.g., tipping over, bumpy terrain, getting stuck, weather events, and/or the like), alone or in combination with other stimuli, may also act as qualifying events. In some cases, qualifying events could be data indicative of certain activities (e.g., energy consumption above or below certain levels, elapsed time above or below certain levels relative to specific events or activities, distance traveled, and/or the like). Qualifying events may be pre-programmed from the factory, or may be set by the operator. In some cases, the operator may interact with the robotic mower 1010 from the electronic device 1042 to provide configuration settings to identify qualifying events. Moreover, in some embodiments, the configuration settings may define a list of potential qualifying events and the operator may check a box for each item that is to be designated as a qualifying event.
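The checkbox-style configuration described above might, as a non-limiting sketch, be represented as a mapping from event types to enabled flags; the event-type names and defaults below are hypothetical:

```python
# Hypothetical factory defaults; the operator may toggle each entry,
# e.g., from the electronic device 1042.
DEFAULT_SETTINGS = {
    "image_known_object": True,
    "image_unknown_object": False,
    "tipped_over": True,
    "stuck": True,
    "high_energy_use": False,
}

def is_qualifying_event(event_type, settings=DEFAULT_SETTINGS):
    """Sketch: an event qualifies only if its type has been enabled in
    the configuration settings (factory default or operator-checked)."""
    return settings.get(event_type, False)
```

Unlisted event types default to non-qualifying, so newly added sensors would not broadcast until the operator opts in.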
Regardless of how qualifying events are specified, when a qualifying event occurs, the broadcast module 1080 may be configured to format a message. The message may be formatted to include content associated with the corresponding qualifying event. Moreover, in some cases, the format of the message may be determined by the qualifying event triggering the message. For example, some qualifying events (e.g., image captures) may trigger message formatting to include the image. Other qualifying events (e.g., data captures) may trigger message formatting to include only text content related to, descriptive of, or otherwise associated with the qualifying event. In some cases, the operator may define message formats to be used, generally, or for specific qualifying events.
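The event-dependent formatting described above might be sketched as follows, purely for illustration; the field names ("kind", "label", "data", "description") are hypothetical placeholders rather than part of the disclosure:

```python
def format_message(event):
    """Sketch: format a broadcast message whose shape depends on the
    qualifying event. Image captures attach the image; data captures
    produce text-only content."""
    if event.get("kind") == "image":
        return {"text": f"Spotted: {event['label']}", "image": event["data"]}
    # Non-image qualifying events carry only descriptive text content.
    return {"text": f"Event: {event['description']}"}
```

An operator-defined template per qualifying event could replace the fixed strings here.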
In an example embodiment, once the message has been formatted, the message may be broadcast automatically by the broadcast module 1080 or sent to the operator for screening and/or modification prior to broadcasting. The protocol for broadcasting may be determined based on the configuration settings. Thus, for example, the operator may prescribe that all messages are to be broadcast once they are formatted, or that all messages are to be screened by the operator and released prior to being broadcast. Alternatively or additionally, the operator may define certain qualifying events (e.g., by type, class or content) that are to be broadcast automatically, and other qualifying events that are to be screened.
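One possible (purely illustrative) encoding of the per-event release protocol above is a set of event types configured for automatic broadcast, with everything else routed to the operator for screening; the setting name is hypothetical:

```python
def release_policy(event_type, settings):
    """Sketch: decide, per configuration settings, whether a formatted
    message is broadcast automatically or held for operator screening."""
    auto = settings.get("auto_broadcast_types", set())
    return "broadcast" if event_type in auto else "screen"
```

Setting "auto_broadcast_types" to all known event types would reproduce the broadcast-everything policy, while an empty set would screen every message.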
In an example embodiment, configuration settings may also define specific mechanisms or mediums for broadcasting messages. The mechanism to be used may also determine, at least in part, the formatting of some messages. Accordingly, in some examples, the configuration settings may define the broadcast medium to be employed for message broadcasting. Some example broadcast mediums may include SMS, email, Twitter, Facebook, Pinterest, Instagram, Vine, Tumblr, and/or the like.
The configuration settings may define the broadcast medium, message formats, content to be added for specific message formats, automatic or operator-prompted release, qualifying event definitions and/or the like as described herein. However, in some cases, the configuration settings may further allow the operator to define specific messages that can be sent when corresponding specific stimuli are encountered. For example, if the IMU 1150 indicates that the robotic mower 1010 was stuck or tipped over, messages such as "I got stuck again today" and "I am upside down" may be prescribed and broadcast responsive to such indications. If the terrain is rough, a message such as "This terrain is bumpy" may be generated. If high power consumption is experienced while cutting, a message such as "My blades are dull" or "You let the grass get too long again" may be generated. Other messages such as "Today, I was harassed by a dog" or "Why am I cutting grass when it is 2 degrees outside" may also be humorously provided when corresponding applicable situations are detected.
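The stimulus-to-message pairings above lend themselves to a simple lookup; as a hedged sketch (stimulus keys are hypothetical, message strings are taken from the examples in the text):

```python
# Operator-prescribed messages for specific stimuli (e.g., from the IMU 1150).
CANNED_MESSAGES = {
    "stuck": "I got stuck again today",
    "tipped_over": "I am upside down",
    "rough_terrain": "This terrain is bumpy",
    "high_cut_power": "My blades are dull",
}

def message_for_stimulus(stimulus):
    """Sketch: map a detected stimulus to its prescribed message;
    None means no canned message is configured for that stimulus."""
    return CANNED_MESSAGES.get(stimulus)
```

The operator could extend or edit this table via the configuration settings to express the mower's "personality."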
As mentioned above, in some cases, the broadcast module 1080 may first send an email, SMS or other private message to the operator to obtain clearance and/or release of the message. When the unreleased message is received by the operator, the operator may review the unreleased message and delete, store or release the message. If the message is released, the message may then be broadcast using the broadcast medium currently prescribed in the configuration settings. However, the broadcast medium could be directly selected (or modified) by the operator. Moreover, as mentioned above, the operator may be enabled to modify any desirable aspects of the message prior to release.
Although the broadcast module 1080 may be instantiated at the robotic mower 1010 in some cases, alternative embodiments may instantiate the broadcast module 1080 in the "cloud." Thus, for example, as shown in FIG. 8, the robotic mower 1010 may be configured with a transceiver or other communication equipment to enable the robotic mower 1010 to communicate data related to the mowing experience (or parcel transit experience more generally) either directly or indirectly to the wireless communication network 1048. A server 1082 of or in communication with the wireless communication network 1048 may host the broadcast module 1080, which may operate as described above except that the broadcast module 1080 is not locally operating at the robotic mower 1010. Content may then be broadcast from the broadcast module 1080 via the selected broadcast medium to the electronic device 1042 and many other electronic devices of other public users connected to the wireless communication network 1048.
In some embodiments, when an instance of the broadcast module 1080 is embodied at the server 1082, experiences from a plurality of robotic mowers or over a period of time may be aggregated at the broadcast module 1080. The aggregated content may then be shared so that experiences in a particular neighborhood or other area can be combined to provide useful information. The useful information may relate to identification of patterns, existence or behavior of animals or vandals, location or activities of missing animals or persons, weather information, soil information, precipitation levels, terrain, owner behavior, product behavior and/or the like.
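Server-side aggregation across a plurality of robotic mowers might look like the following sketch, which groups per-mower event reports by area so neighborhood-level patterns can be shared. The field names (`area`, `event`) and the function name are illustrative assumptions.

```python
# Hypothetical sketch of aggregating experiences from many robotic mowers
# at a server-hosted broadcast module.
from collections import defaultdict

def aggregate_reports(reports):
    """Group event reports by area so that patterns (animal sightings,
    precipitation, terrain issues, etc.) can be surfaced per neighborhood."""
    by_area = defaultdict(list)
    for report in reports:
        by_area[report["area"]].append(report["event"])
    return dict(by_area)
```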
By incorporating the sensor network 1090 and the modules described above, the robotic mower 1010 may be enabled to facilitate the broadcasting of content related to the mowing experience or other yard maintenance activities. Moreover, the broadcast module 1080 may enable operators to express their personality through configuration of their robotic mower's broadcast activities. In some cases, the robotic mower 1010 may take on a persona of an electronic pet. In any case, useful or fun information relating to mowing or other yard maintenance activities may be shared to increase the interest level, satisfaction or value proposition associated with owning a robotic vehicle such as the robotic mower 1010.
Embodiments of the present invention may therefore be practiced using an apparatus such as the one described in reference to FIGS. 6-8. However, it should also be appreciated that some embodiments (or aspects thereof) may be practiced in connection with a computer program product for performing embodiments of the present invention. As such, for example, each block or step of the flowcharts of FIGS. 9 and 10, and combinations of blocks in the flowcharts, may be implemented by various means, such as hardware, firmware, processor, circuitry and/or another device associated with execution of software including one or more computer program instructions. Thus, for example, one or more of the procedures described above may be embodied by computer program instructions, which may embody the procedures described above and may be stored by a storage device (e.g., memory 1114) and executed by processing circuitry (e.g., processor 1112).
As will be appreciated, any such stored computer program instructions may be loaded onto a computer or other programmable apparatus (i.e., hardware) to produce a machine, such that the instructions which execute on the computer or other programmable apparatus implement the functions specified in the flowchart block(s) or step(s). These computer program instructions may also be stored in a computer-readable medium comprising memory that may direct a computer or other programmable apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instructions to implement the function specified in the flowchart block(s) or step(s). The computer program instructions may also be loaded onto a computer or other programmable apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block(s) or step(s).

FIG. 9 illustrates a control flow diagram of one example of how the robotic mower 1010 can be operated in relation to using the sensors thereon to generate content for potential broadcast in accordance with an example embodiment. As shown in FIG. 9, operation may begin with traversal of the parcel at operation 1400. During the traversal, the sensor network may gather data related to the traversal (or mowing) experiences at operation 1402. A determination may be made based on the data gathered as to whether any qualifying event has occurred at operation 1404. If there are no qualifying events, then flow may cycle back to the traversal and data gathering operations 1400 and 1402, which may continue.
If a qualifying event has occurred at operation 1404, then a determination may be made as to whether a format has been prescribed for message generation at operation 1406. If no format is prescribed, then a default format may be used for message generation at operation 1408. If a format is prescribed, then the message may be formatted accordingly. Whether a selected format or default format is employed, a determination may then be made at operation 1412 as to whether the configuration settings provide for automatic release of the message. If automatic release is authorized, then the message may be released via the defined broadcast medium at operation 1414. If automatic release is not authorized, then the message may be presented to the operator at operation 1416.
The operator may have the option to modify the message at operation 1418. If the message is to be modified, the corresponding modifications may then be inserted (e.g., to form, content, and/or broadcast medium) at operation 1420. If no modifications are desired, or after the modifications are inserted, then the operator may make a determination regarding whether to release the message at operation 1422. If the message is selected for release, then flow may return to operation 1414, as described above. However, if the message is not released, it may be stored or deleted at operation 1424 and flow may return to operations 1400 and 1402.
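The FIG. 9 control flow described above can be condensed into a short sketch: gather data during traversal, test for a qualifying event, format the message (falling back to a default format when none is prescribed), and then either release automatically or hand the message to the operator. The function name, the callback interface, and the string-template formatting are assumptions for illustration, not the publication's implementation.

```python
# Hypothetical sketch of one cycle of the FIG. 9 control flow.
def process_cycle(data, is_qualifying, fmt=None, auto_release=False,
                  broadcast=None, present_to_operator=None):
    if not is_qualifying(data):
        return "continue"            # no qualifying event: keep traversing (1400/1402)
    # Use the prescribed format if one exists, else a default format (1406/1408).
    message = (fmt or "{event}").format(event=data["event"])
    if auto_release:                 # configuration permits automatic release (1412)
        broadcast(message)           # release via defined broadcast medium (1414)
        return "released"
    present_to_operator(message)     # otherwise present for screening (1416)
    return "pending"
```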
Of note, the processes above may incorporate all of position determining, data gathering and message generation/communication, which can be accomplished based on the inclusion of the sensor network 1090 and the modules described above. As such, in some cases, the robotic mower 1010 may generally operate in accordance with a control method that combines the modules described above to provide a functionally robust robotic vehicle. In this regard, a method according to example embodiments of the invention may include any or all of the operations shown in FIG. 10. Moreover, other methods derived from the descriptions provided herein may also be performed responsive to execution of steps associated with such methods by a computer programmed to be transformed into a machine specifically configured to perform such methods.
In an example embodiment, a method for providing broadcast messages from a robotic vehicle (e.g., a mower or watering device), as shown in FIG. 10, may include monitoring data gathered responsive to a robotic vehicle traversing a parcel at operation 1500, determining whether the monitored data is indicative of a qualifying event at operation 1510, generating a message in response to the qualifying event at operation 1520, and processing the message for selective broadcasting at operation 1530.
In some cases, the method may include additional optional operations. Thus, for example, the operations 1500-1530 may also be modified, augmented or amplified in some cases. For example, in some embodiments, the qualifying event may be defined based on configuration settings that are operator adjustable. In an example embodiment, a list of potential qualifying events may be provided to the operator to enable the operator to individually select each item of the list that is to be designated as a qualifying event. In some cases, generating the message in response to the qualifying event further includes receiving configuration settings defining a format for the message. In some embodiments, the format of the message may be determined based on the qualifying event. In an example embodiment, the configuration settings may define associations between message formats and qualifying events. In some examples, the method may further include receiving operator input defining the associations. In some cases, the method may further include adding content to the message based on the qualifying event. In some embodiments, processing the message for selective broadcasting may include automatically broadcasting the message based on configuration settings. In an example embodiment, processing the message for selective broadcasting may include releasing the message for broadcast responsive to screening by the operator. In some examples, processing the message for selective broadcasting may include releasing the message for broadcast based on operator defined associations identifying a first set of qualifying events for messages to be screened prior to release and a second set of qualifying events for messages to be automatically released. In some embodiments, processing the message for selective broadcasting may include sending the message via a broadcast medium defined in configuration settings.
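The operator-defined associations mentioned above (a first set of qualifying events whose messages are screened before release, and a second set whose messages are released automatically) amount to a small release-policy lookup. The function and parameter names below are illustrative assumptions.

```python
# Hypothetical sketch of a release policy built from operator-defined
# associations between qualifying events and release behavior.
def release_policy(event, screened_events, auto_events):
    """Decide how to process a message for selective broadcasting:
    automatic release, operator screening, or no action."""
    if event in auto_events:
        return "auto_release"       # second set: release without screening
    if event in screened_events:
        return "screen_first"       # first set: present to operator before release
    return "ignore"                 # not a qualifying event for broadcasting
```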
In an example embodiment, processing the message for selective broadcasting may include sending the message via a broadcast medium selected by the operator.
In an example embodiment, an apparatus for performing the method of FIGS. 9 and 10 above may comprise a processor (e.g., the processor 1112) configured to perform some or each of the operations (1400-1530) described above. The processor 1112 may, for example, be configured to perform the operations (1400-1530) by performing hardware implemented logical functions, executing stored instructions, or executing algorithms for performing each of the operations. Alternatively, the apparatus may comprise means for performing each of the operations described above. In this regard, according to an example embodiment, examples of means for performing operations 1400-1530 may comprise, for example, the control circuitry 1012. Additionally or alternatively, at least by virtue of the fact that the processor 1112 may be configured to control or even be embodied as the control circuitry 1012, the processor 1112 and/or a device or circuitry for executing instructions or executing an algorithm for processing information as described above may also form example means for performing operations 1400-1530.
Many modifications and other embodiments of the inventions set forth herein will come to mind to one skilled in the art to which these inventions pertain having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the inventions are not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the appended claims. Moreover, although the foregoing descriptions and the associated drawings describe exemplary embodiments in the context of certain exemplary combinations of elements and/or functions, it should be appreciated that different combinations of elements and/or functions may be provided by alternative embodiments without departing from the scope of the appended claims. In this regard, for example, different combinations of elements and/or functions than those explicitly described above are also contemplated as may be set forth in some of the appended claims. In cases where advantages, benefits or solutions to problems are described herein, it should be appreciated that such advantages, benefits and/or solutions may be applicable to some example embodiments, but not necessarily all example embodiments. Thus, any advantages, benefits or solutions described herein should not be thought of as being critical, required or essential to all embodiments or to that which is claimed herein. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.

Claims

THAT WHICH IS CLAIMED:
1. A robotic vehicle comprising:
one or more functional components configured to execute a lawn care function;
a sensor network comprising one or more sensors configured to detect conditions proximate to the robotic vehicle; and
a monitoring module configured to monitor data gathered via the sensor network responsive to the robotic vehicle traversing a parcel,
wherein the monitoring module selectively initiates an alarm or notification function in response to the data indicating the occurrence of a trigger event.
2. The robotic vehicle of claim 1, wherein the trigger event is defined based on monitoring instructions configuring the monitoring module.
3. The robotic vehicle of claim 2, wherein the monitoring instructions are operator adjustable.
4. The robotic vehicle of claim 3, wherein the monitoring instructions are operator adjustable locally at the robotic vehicle.
5. The robotic vehicle of claim 3, wherein the monitoring instructions are operator adjustable via a remote interface.
6. The robotic vehicle of claim 2, wherein the monitoring instructions define specific activities, patterns, or events that qualify as trigger events.
7. The robotic vehicle of claim 1, wherein one of the sensors of the sensor network comprises a camera,
wherein, responsive to detection of an object in an image captured via the camera, the monitoring module is configured to compare the image to a plurality of stored images to determine if a match for the object can be located, and
wherein the trigger event occurs based at least in part on whether the match is located.
8. The robotic vehicle of claim 1, wherein the trigger event occurs responsive to detection of an object at a predefined time or location, or via a predetermined sensor.
9. The robotic vehicle of any of claims 2-8, wherein the monitoring instructions define at least one response to be initiated in response to the trigger event.
10. The robotic vehicle of any preceding claim, wherein the monitoring module is configured to initiate a local alarm in response to the trigger event.
11. The robotic vehicle of any preceding claim, wherein the monitoring module is configured to initiate a remote alarm in response to the trigger event.
12. The robotic vehicle of any preceding claim, wherein the monitoring module is configured to store data to determine if a repeated number or pattern of events corresponds to the trigger event.
13. The robotic vehicle of any preceding claim, further comprising a positioning module configured to determine robotic vehicle position while the robotic vehicle traverses a parcel.
14. The robotic vehicle of any preceding claim, wherein the monitoring module is configured to define the trigger event based on distinguishing between living and non-living objects.
15. A method comprising:
monitoring data gathered responsive to a robotic vehicle traversing a parcel;
determining whether the monitored data is indicative of a trigger event; and
selectively initiating an alarm or notification function in response to the data indicating occurrence of the trigger event.
16. The method of claim 15, further comprising receiving monitoring instructions defining one or more trigger events.
17. The method of claim 16, wherein receiving the monitoring instructions comprises receiving the monitoring instructions responsive to operator adjustment.
18. The method of claim 17, wherein receiving the monitoring instructions comprises receiving the monitoring instructions locally at the robotic vehicle.
19. The method of claim 17, wherein receiving the monitoring instructions comprises receiving the monitoring instructions via a remote interface.
20. The method of claim 17, wherein receiving the monitoring instructions comprises receiving instructions defining specific activities, patterns, or events that qualify as trigger events.
21. The method of claim 15, wherein one of the sensors of the sensor network comprises a camera,
wherein, responsive to detection of an object in an image captured via the camera, the monitoring module is configured to compare the image to a plurality of stored images to determine if a match for the object can be located, and
wherein the trigger event occurs based at least in part on whether the match is located.
22. The method of claim 15, wherein the trigger event occurs responsive to detection of an object at a predefined time or location, or via a predetermined sensor.
23. The method of any of claims 16-22, wherein receiving the monitoring instructions comprises receiving instructions defining at least one response to be initiated in response to the trigger event.
24. The method of any of claims 15-23, wherein the monitoring instructions initiate a local alarm in response to the trigger event.
25. The method of any of claims 15-24, wherein the monitoring instructions initiate a remote alarm in response to the trigger event.
26. The method of any of claims 15-25, further comprising storing data to determine if a repeated number or pattern of events corresponds to the trigger event.
27. The method of any of claims 15-26, further comprising defining the trigger event based on distinguishing between living and non-living objects.
PCT/IB2015/058669 2014-12-18 2015-11-10 Robotic patrol vehicle WO2016097897A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201462093655P 2014-12-18 2014-12-18
US62/093,655 2014-12-18
US201462095831P 2014-12-23 2014-12-23
US62/095,831 2014-12-23

Publications (1)

Publication Number Publication Date
WO2016097897A1 true WO2016097897A1 (en) 2016-06-23

Family

ID=54695801

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2015/058669 WO2016097897A1 (en) 2014-12-18 2015-11-10 Robotic patrol vehicle

Country Status (1)

Country Link
WO (1) WO2016097897A1 (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013100938A1 (en) * 2011-12-28 2013-07-04 Husqvarna Consumer Outdoor Products N.A., Inc. Lawn care vehicle with modular ride information system
WO2014003644A2 (en) * 2012-06-26 2014-01-03 Husqvarna Ab Detachable user interface for a robotic vehicle
WO2014027945A1 (en) * 2012-08-14 2014-02-20 Husqvarna Ab Mower with object detection system
WO2014120893A1 (en) * 2013-02-01 2014-08-07 Husqvarna Ab Method and device for identifying a power equipment work pattern

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11172605B2 (en) 2016-06-30 2021-11-16 Tti (Macao Commercial Offshore) Limited Autonomous lawn mower and a system for navigating thereof
US11832552B2 (en) 2016-06-30 2023-12-05 Techtronic Outdoor Products Technology Limited Autonomous lawn mower and a system for navigating thereof
US11172608B2 (en) 2016-06-30 2021-11-16 Tti (Macao Commercial Offshore) Limited Autonomous lawn mower and a system for navigating thereof
JP2018109849A (en) * 2016-12-28 2018-07-12 本田技研工業株式会社 Control device, monitoring device and control program
CN110073305A (en) * 2016-12-28 2019-07-30 本田技研工业株式会社 Control device, monitoring device and control program
WO2018123632A1 (en) * 2016-12-28 2018-07-05 Honda Motor Co.,Ltd. Control device, monitoring device and control program
US10864891B2 (en) 2016-12-28 2020-12-15 Honda Motor Co., Ltd. Control device, monitoring device and control program
EP3778145A4 (en) * 2018-04-06 2022-01-26 LG Electronics Inc. Mobile robot and control method of mobile robot
US11350563B2 (en) 2018-05-25 2022-06-07 The Toro Company Autonomous grounds maintenance machines with path planning for trap and obstacle avoidance
US11832553B2 (en) 2018-05-25 2023-12-05 The Toro Company Autonomous grounds maintenance machines with path planning for trap and obstacle avoidance
US11696535B2 (en) 2018-11-28 2023-07-11 The Toro Company Autonomous ground surface treatment system and method of operation of such a system
EP3917726A4 (en) * 2019-01-28 2022-10-19 LG Electronics Inc. Artificial intelligence moving robot and method for controlling the same
WO2020159101A1 (en) 2019-01-28 2020-08-06 Lg Electronics Inc. Artificial intelligence moving robot and method for controlling the same
WO2020228262A1 (en) * 2019-05-15 2020-11-19 苏州科瓴精密机械科技有限公司 Method for controlling autonomous mobile robot, and autonomous mobile robot system
US11653593B2 (en) * 2019-08-02 2023-05-23 Embankscape Equipment LLC Handheld unit with safety features for remote-control slope mowing system
US11785883B2 (en) 2019-08-02 2023-10-17 Embankscape Equipment LLC Remote-control slope mowing system with inclinometers and safety features
US20210029872A1 (en) * 2019-08-02 2021-02-04 Embankscape Equipment LLC Handheld Unit with Safety Features for Remote-Control Slope Mowing System
DE102020003610B3 (en) 2020-06-17 2021-09-02 Christian Bleser Retrofit kit with detection system to avoid nocturnal animals colliding with robotic lawn mowers
SE2150184A1 (en) * 2021-02-22 2022-08-23 Husqvarna Ab Robotic work tool assistance in a robotic work tool system
WO2022177486A1 (en) * 2021-02-22 2022-08-25 Husqvarna Ab Robotic work tool assistance in a robotic work tool system
SE545454C2 (en) * 2021-02-22 2023-09-19 Husqvarna Ab Robotic work tool assistance in a robotic work tool system

Similar Documents

Publication Publication Date Title
WO2016097897A1 (en) Robotic patrol vehicle
US11666010B2 (en) Lawn monitoring and maintenance via a robotic vehicle
EP3237983B1 (en) Robotic vehicle grass structure detection
US9563204B2 (en) Mower with object detection system
US10806075B2 (en) Multi-sensor, autonomous robotic vehicle with lawn care function
US10643377B2 (en) Garden mapping and planning via robotic vehicle
US10777000B2 (en) Garden street view
EP3971672A1 (en) Multi-sensor, autonomous robotic vehicle with mapping capability
US10448565B2 (en) Garden visualization and mapping via robotic vehicle
WO2016097891A1 (en) Robotic vehicle for detecting gps shadow zones
US11112532B2 (en) Weather collection and aggregation via robotic vehicle
WO2016098040A1 (en) Robotic vehicle with automatic camera calibration capability
US10849267B2 (en) Remote interaction with a robotic vehicle
EP3656197A1 (en) Yard maintenance vehicle obstacle avoidance/notification system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15798242

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15798242

Country of ref document: EP

Kind code of ref document: A1