WO2016054332A1 - Assisted animal activities - Google Patents

Assisted animal activities

Info

Publication number
WO2016054332A1
WO2016054332A1 (PCT/US2015/053433)
Authority
WO
WIPO (PCT)
Prior art keywords
animal
touch
fluffy
sensitive surface
activity
Prior art date
Application number
PCT/US2015/053433
Other languages
English (en)
Inventor
Nissim Shani
Roni SHANI
Daniel SHANI
Original Assignee
Forget You Not, LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US14/504,104 (US9185885B2)
Application filed by Forget You Not, LLC
Publication of WO2016054332A1

Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K ANIMAL HUSBANDRY; CARE OF BIRDS, FISHES, INSECTS; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K15/00 Devices for taming animals, e.g. nose-rings or hobbles; Devices for overturning animals in general; Training or exercising equipment; Covering boxes
    • A01K15/02 Training or exercising equipment, e.g. mazes or labyrinths for animals; Electric shock devices; Toys specially adapted for animals
    • A01K15/025 Toys specially adapted for animals
    • A01K15/021 Electronic training devices specially adapted for dogs or cats

Definitions

  • This description relates to assisted animal activities.
  • Dogs and cats are capable of a limited amount (compared to humans) of communication with human beings, and with other animals. Dogs and cats can make sounds and engage in motions that are believed to communicate their wishes, needs, reactions, and feelings. In addition, they are believed to be capable of interpreting sounds, fragrances, odors, images, scenes, motions, and other stimuli as communications to them. This limited communication ability can form the basis of strong bonds, among other things. The love of pet owners for their pets and of the pets for their owners is well-known.
  • Animals are also capable of engaging in a wide variety of activities that could be characterized as training, games and other entertainment, therapy, and others. These activities and others can be facilitated by other animals and human owners, trainers, and handlers.
  • Electronic devices such as sound systems, televisions, and display monitors attached to computers can be used to play audio and video material that may be considered entertaining for pets.
  • In an aspect, a system includes a touch-sensitive surface and a camera mounted to be movable relative to the touch-sensitive surface to acquire an image of an animal interacting with the touch-sensitive surface.
  • the system includes a processor coupled to a memory, the processor and memory configured to cause each of multiple sections of the display screen to be associated with a corresponding activity in which the animal can engage; detect a selection of one of the multiple sections of the touch-sensitive surface; and enable the activity associated with the selected section of the touch-sensitive surface.
  • Embodiments can include one or more of the following features.
  • the touch-sensitive surface comprises a display screen.
  • the system includes a display for displaying content associated with the activity.
  • the system includes a mount disposed around at least some of the display screen, the camera mounted on the mount.
  • the activity comprises creating an item of visual art.
  • the processor and memory are configured to enable the animal to interact with the touch-sensitive surface to indicate elements of the visual art being created.
  • the processor and memory are configured to generate the visual art based on a position of the animal on the touch-sensitive surface, a type of touch of the animal on the touch- sensitive surface, a pressure of the animal's touch on the touch-sensitive surface, or a combination of any two or more of them.
  • the activity comprises creating a work of music.
  • the processor and memory are configured to combine sounds generated by the animal with a previously created audio file.
  • the activity comprises watching a video or listening to audio.
  • the activity comprises playing.
  • the activity comprises training the animal.
  • the activity comprises an activity directed to a person associated with the animal.
  • the touch-sensitive surface is sized such that the animal can select one of the multiple sections by walking on the touch-sensitive surface.
  • the system includes a bottom camera disposed below the touch-sensitive surface.
  • the processor and memory are configured to send, to a printer, data representative of an image of the animal interacting with the touch-sensitive surface, data representative of an image of an item of visual art created by the animal's interaction with the touch-sensitive surface, or both.
  • the processor and memory are configured to determine, based on the animal's interaction with the touch-sensitive surface, when to send the data to the printer. The determination is made based on an amount of time the animal interacted with the touch-sensitive surface, a degree to which the animal completed the activity, or both.
  • the data comprise information sufficient to print a two-dimensional representation of the animal or the item of visual art or a three- dimensional representation of the animal or the item of visual art.
  • the processor and memory are configured to send, to a computing device, data representative of an image of the animal interacting with the touch-sensitive surface, data representative of an image of an item of visual art created by the animal's interaction with the touch-sensitive surface, or both.
  • the processor and memory are configured to determine, based on the animal's interaction with the touch- sensitive surface, when to send the data to the computing device.
  • the processor and memory are configured to monitor a physical characteristic of the animal.
  • a method includes causing each of multiple sections of a touch-sensitive surface to be associated with a corresponding activity of an animal; detecting a selection of one of the multiple sections of the touch-sensitive surface; activating the activity associated with the selected section of the touch-sensitive surface; and acquiring multiple images of the animal interacting with the touch-sensitive surface during the activated activity.
  • Embodiments can include one or more of the following features.
  • the method includes displaying content associated with the activity on a display screen associated with the touch-sensitive surface.
  • the touch-sensitive surface comprises the display screen.
  • Detecting a selection of the one of the multiple sections comprises detecting an interaction of the animal with the touch-sensitive surface.
  • the method includes sending, to a printer, data representative of an image of the animal interacting with the touch-sensitive screen, data representative of an image of an item of visual art created by the animal during the activated activity, or both.
  • the method includes determining when to send the data to the printer based on the animal's interaction with the touch-sensitive surface.
  • the data comprise information sufficient to print a three-dimensional representation of the animal or the item of visual art.
  • FIGS. 1 through 15 are block diagrams.
  • assisted animal communication 10 can be an important aspect of assisted animal activities.
  • assisted animal communication 10 includes communication between an animal 12 and a human 14, for example, a dog or cat and its owner.
  • the communication that we describe here can be used in connection with training, games, entertainment, therapy, and other kinds of activities to be engaged in by the animals alone, with other animals, or with humans.
  • We use the term animal broadly to include, for example, non-human animals of any kind that are capable of some degree of communication, including primates, dogs, cats, other mammals, and pets, to name a few.
  • We use the term communication broadly to include, for example, any conveying or comprehension of information, emotions, thoughts, or actions between a human and an animal by any mechanism, device, or capability (which we will sometimes refer to as artifacts of communication), including any respect in which the human or animal can move, act, behave, make noise, or otherwise affect its environment (we sometimes refer to these as produced artifacts of communication 19) and any respect in which the human or animal can hear, see, taste, smell, touch, or otherwise sense its environment (we sometimes refer to these as sensed artifacts of communication 21).
  • Produced artifacts and sensed artifacts can relate to voice, sound, image, motion, activity, fragrance, odor, stimuli, and others, and combinations of them.
  • We use the term assisted broadly to include, for example, any device or method that aids, supplements, processes, accelerates, and in any other way helps or enhances the conveying of information, emotions, thoughts, or actions between a human and an animal.
  • We use the term technology broadly to include, for example, any mechanical, electrical, computer, network, wireless, and other devices and techniques, or combinations of them.
  • the source of the communication can vary the characteristics of produced artifacts of the communication (such as the volume, duration, frequency, pitch, and style of Fluffy's barking) to impart meaning that, by experience, seems to be understood by the recipient (Fluffy has learned that a certain kind of barking will communicate to George that Fluffy wants to go outside).
  • the produced artifacts can be interpreted using speech recognition or recognition of sounds made by animals.
  • the recipient of the communication can interpret the characteristics of the received artifacts as indicating the nature of the communication (George has learned, by experience, that a certain kind of barking by Fluffy means she wants to go outside).
  • the sender of the communication learns to formulate the artifacts to represent the communication so that it will be understood effectively and the recipient learns to interpret the correct meaning of the communication represented by the artifacts.
  • the technology that we describe here can assist animal communication in a wide variety of ways.
  • the technology assists animal communication by providing a store-and-forward function for produced artifacts.
  • the technology can include devices 23, 25 that receive, sense, or capture the produced artifacts of communication from the sender and reproduce and deliver versions of the artifacts to the recipient for whom they become sensed artifacts.
  • a microphone (one of the devices 23) can capture the barking of Fluffy at the place where Fluffy is located.
  • the barking can then be delivered through a speaker (one of the devices 25) at the place where George is located (even at the same place where Fluffy is located, in some examples).
  • the received artifacts can be passed through artifact processing devices 27 and the processed artifacts can then be delivered to the recipient by devices 25.
  • Artifact processing devices 27 can include any kind of device that is capable of receiving, storing, analyzing, altering, enhancing, processing, and sending information about artifacts.
  • artifacts can be retransmitted through a network to pass them to the recipient located at a different location than the sender.
  • artifacts can be stored temporarily or for an extended period of time and then retransmitted to the recipient. For example, video and text derived from or related to the video or a picture and a message derived from or related to the picture can be delivered to a recipient, either the human or the animal.
  • Fluffy's barking at 8 AM at his home in Marblehead, Massachusetts can be recorded, processed to enhance the quality of the recording, and then stored. Later, when George wakes up at 8 AM on his business trip to Palo Alto, California, the technology can deliver the sound to George's cell phone at his hotel room.
  • the artifact processing devices 27 can include interpreters 29 that convert, interpret, translate, or otherwise derive the meaning of the communication 31 represented by received artifacts, based on information available to the processing devices.
  • the information available to the processing devices can include the produced artifacts 19, previously produced artifacts of the same sender 33 (for example, recordings of Fluffy barking when she wanted to go outside), previously produced artifacts of other senders 35 (for example, recordings of other dogs barking when they want to go outside, or videos of a large number of dogs engaging in various kinds of behavior, such as barking, pacing, and running), other produced artifacts of the same sender 37 (for example, videos of Fluffy tapping her paw against the door when she wants to go outside), other produced artifacts of other senders 39 (for example, videos of other dogs tapping their paws against doors when they want to go outside), supplemental information 41 provided by human beings about the behavior of the sender (for example, information provided by George that Fluffy barks and taps her paw against the door when she wants to go outside), and other information.
  • Users of the technology 20 can provide some of the supplemental information 41 through interface features of client devices to define, explain, illustrate, or otherwise capture relationships between artifacts of communication and interpretations of the artifacts and between communications and artifacts that can be used to articulate them.
  • George could take videos of Fluffy pacing back and forth at times when Fluffy needs to be walked, submit the videos, and add the information that the videos represent Fluffy needing to be walked.
  • the artifacts could be the pacing back and forth, the speed of pacing, the duration of pacing, the extent of pacing, and other characteristics, which can be reflected directly in the video or explicitly identified by text entry by the user.
  • the corresponding interpreted communication would be that Fluffy needs to be walked.
  • George could add video and a voice overlay to the video indicating the meaning of what is shown in the video.
  • the interpreter can use the previously entered video and other information as the basis for interpreting the pacing as indicating that Fluffy needs to be walked, if the current pacing is found to match to some degree the previously provided pacing videos.
  • the interpreter can use a variety of mathematical and statistical techniques and models.
  • a wide variety of other kinds of information can be provided by a user such as information about the animal (size, age, species, favorite foods, and behavior, to name a few), information about the human, information about the environment in which the animal or the human is doing the communicating, and others.
  • the supplemental information can include information about a class of animals to which the animal belongs, habits of that class of animals, behavioral patterns, and many others.
  • the technology 20 can generate its own information useful in later interpretation, for example, by storing information acquired through client devices and interface features during the operation of the technology, and by storing its analysis and interpretations of that information.
  • the technology 30 could acquire and store videos, images, audio recordings, text, and other information obtained from animals and humans, could associate artifacts of communication that occurred in those stored items with interpretations of the communications based on explicit confirmations provided by the humans or based on inference, and could store artifacts that were used to correctly articulate communications, and combinations of those activities.
  • the technology 20 can also embed the voice of the owner, or a picture of the owner together with the voice of the owner, that reflect interpretations of the content of the artifacts of communication.
  • the interpreters 29 can include processes 43 that use the available information to derive the meaning of the communication 31.
  • the processes can include algorithms, inference engines, models, and a wide variety of other mathematical, logical, and other processes, and combinations of them.
  • the meaning can be conveyed to the recipient of the communication in a form different from the received artifacts. For example, if the interpreter determines that the meaning of certain behavior by Fluffy is that Fluffy wants to go outside, a text message or e-mail or alarm sound can be sent in text or some other form to a desktop computer at George's office to tell him that Fluffy wants to go outside. In some implementations, the meaning can be conveyed to the recipient at the same time as the artifacts (by playing back Fluffy's barking at the same time that the message is displayed to George, for example).
  • the meaning of the communication can be saved in association with the artifacts that relate to the meaning.
  • the association between the meaning and the artifact can then be used by the interpreters 29 to improve the quality and speed of their interpretation of the meaning of other artifacts received from a sender.
  • the meaning of the communication can be used for a wide variety of purposes other than the direct delivery of the meaning to the recipient. For example, suppose that the meaning of a certain kind of barking done by Fluffy is that Fluffy wants to be fed a CrunchyLunch biscuit. The relationship of that meaning to the certain kind of barking might be aided by information provided by the manufacturer of CrunchyLunch biscuits based on prior associations with certain kinds of dog barking and its product.
  • the manufacturer could make an arrangement to pay the host of the technology, each time the meaning of the communication has been determined as "feed me a CrunchyLunch biscuit", to send a message to George saying "Fluffy wants to be fed a CrunchyLunch biscuit".
  • the manufacturer could pay the host of the technology to send an online coupon to buy CrunchyLunch biscuits for one dollar off the normal price of a box.
  • artifacts can be interpreted as the meanings of communications from the source of the artifacts to a recipient.
  • the technology can receive information about the meanings of communications 51 and select artifacts 53 that will express the communications to the intended recipients.
  • we sometimes refer to these as articulation devices or, simply, articulators 29, to capture the idea that the meaning is being articulated in artifacts so as to be understandable by the recipient.
  • the articulators can use a wide variety of information, in addition to the meanings of communication, in determining which artifacts 53 to select.
  • the additional information can include, for example, similar meanings of other communications of the same sender and related artifacts 55, similar meanings of other communications of other senders 57 and related artifacts, other meanings of other communications of the senders 59 and related artifacts, other meanings of other communications of other senders 61 and related artifacts, and supplemental information 63 provided by human beings.
  • the supplemental information 63 could provide information that associates the meanings of communications with artifacts that could be useful in articulating those meanings. George could provide data to the technology explaining that when he wants to send a
  • the communication can then be articulated in the form of a pre-recorded video of Fluffy that George would recognize as conveying the message, together with a caption on the video that says "I miss you, want to come home?"
  • the video message can then be sent to George on his mobile phone.
  • the articulators 29 can include processes 65 that use the available information to infer, derive, or determine the appropriate artifacts.
  • the processes can include algorithms, inference engines, models, and a wide variety of other mathematical, logical, and other processes, and combinations of them.
  • the dog door has an electronic switch that unlocks the door and a light that can be illuminated to tell Fluffy the door is now open and he can go outside.
  • George hears Fluffy barking.
  • George picks up his cell phone, launches the app provided by the technology, and taps the option indicating that he wants to communicate to Fluffy by unlocking the dog door and turning on the light.
  • the articulators in this simple example, determine the meaning of the communication from George and then they select the best artifacts to articulate or express this communication to Fluffy.
  • the technology sends commands to the electronic switch on the dog door to open the switch and, after the switch is open, turns on the light to tell Fluffy that the dog door is open.
  • the meaning of the communication is "try to interest Fluffy in asking for and eating a CrunchyLunch biscuit," a communication that may have been received from the manufacturer of CrunchyLunch biscuits.
  • the articulators consider the time of day, the room where Fluffy is located, information about whether Fluffy has eaten recently, and historical information about the susceptibility of Fluffy to the communication in that context.
  • the articulators decide that the best way to motivate Fluffy is to show Fluffy a picture of a box of CrunchyLunch biscuits on a monitor in the living room where Fluffy is located, and to play the CrunchyLunch jingle on George's sound system in the living room.
  • non-custom movies of the kind that are sometimes shown to pets could include advertising illustrating CrunchyLunch biscuits, playing the CrunchyLunch jingle, and in that way motivating Fluffy to become interested in eating CrunchyLunch biscuits.
  • a wide variety of games could be played with animals using the technology and with or without a human being present at the location where the game is being played.
  • a command (that can be set to be triggered at a particular time or after a series of other events or presentation of digital content) can indicate to the animal that the animal is to find a treat or toy that had been previously hidden by the owner somewhere near the television (or other source of artifacts) or at any location in the house. The repetition of this command (and "game") will train the animal to search for the treat or toy and return to the TV.
  • the ability of the technology through cameras or other sensors to recognize when the animal has found the treat or toy can trigger the capture and delivery of an image of the dog to the owner and a message congratulating the animal on the accomplishment ("good dog, good dog").
  • the technology could be used in a wide variety of ways to train animals remotely.
  • the technology 20 can be implemented in devices and networks that assist communications between two or more communicators 100, 102 located anywhere in the world (for example, Fluffy and George).
  • the technology 20 can be organized in a client-server model in which a host party 120 operates servers 116 that communicate through networks 106 with client devices 108, 110.
  • the networks can be any kind of local or wide area networks, public networks, dial-up telephone networks, wireless or wired networks, cellular telephone networks, the Internet, Wi-Fi, or any other kind of communication network that can carry information related to assisting animal communication.
  • the client devices 108, 110 can internally include or externally support (or both) interface features 112, 114 (for example, the devices 23, 25 of figure 1) that enable interactions 113, 115, with animals and humans or other communicators 102, 104.
  • Each of the client devices can be associated with one or more than one of the interface features.
  • the interactions 113, 115 can include a wide range of interactions such as artifacts of communication produced by one of the communicators and sensed, detected, or received through the interface features; a wide range of noises, sounds, images, video, odors, tactile sensations, flavors, and other stimuli that serve as artifacts of communication to be received and used by one of the communicators; and information provided by the communicators through user interfaces that are part of the interface features.
  • a communicator a human or an animal
  • the entered information may relate to communications between two communicators or can relate to the setting up, management, and operation of the technology (for example, a user creating a user account with the technology).
  • the servers 116 interact with the client devices to aid communications between communicators by having the servers send and receive information associated with the communications to and from the client devices.
  • the client devices in turn aid communications between communicators by sending and receiving information to and from the interface features.
  • processing of the information may then need to occur within the technology 20.
  • the processing can be divided in a wide variety of ways between the servers, the client devices, and the interface features.
  • the client devices may do very little other than pass information back and forth, while most of the processing effort is performed at the servers.
  • most of the processing could be done in the client devices with the servers simply passing processed information back and forth through the networks to and from the client devices. Other arrangements would also be possible.
  • the servers can be associated with databases 118 that contain a wide variety of information concerning the animals, the humans, the client devices, the interface features, behavioral information about animals and humans, information about communications, user files, account information, and others.
  • the information for the databases can be provided 124 from the communicators in the form of any sort of artifacts of communications or information associated with communications or with users of the technology.
  • Information for the databases can also be provided 119 from the servers based on processing of information that flows through the technology (for example, interpreted communications that are based on received artifacts can be stored for future use).
  • Information can also be provided from external sources 122. For example, information about the behavior, actions, history, interests, and communications of specific animals or humans or of animals or humans in general or with respect to groups, types, species, or categories of them, to name a few, can be provided.
  • the client devices could be, for example, any kind of device that is capable of providing or controlling or using the interactive features to conduct any of the interactions with any of the communicators and capable either directly or indirectly of communicating information with the servers.
  • the client devices can include, for example, computers, laptops, pad computers, mobile devices, mobile telephones, telephones, televisions, music systems, appropriately wired refrigerators, storage containers, doors or gates, pet houses, houses, automobiles, boats, kennels, and veterinary facilities to name a few.
  • the interface features can be provided through elements that are part of the client devices or by interface elements that are connected to, driven by, or controlled by the client devices.
  • the interface features can include, for example, loud speakers, headphones, or other sound producing features, microphones or other sound detectors, GPS features, vibrational or other tactile features, displays, screens, projectors, or other image or video displaying features, fragrance generators, odor generators, fragrance detectors, odor detectors, cameras, video cameras, image detectors, and other audiovisual features, switches, latches, locks, lights, fans and other wind creating devices, sunscreens, shades, and a wide variety of other input and output elements that can provide or receive stimuli, information, and other artifacts of communication to and from the animals or humans.
  • the interface features could be represented by an Internet browser running on a computer or handheld device or by a user interface provided by an app on a mobile phone or other handheld device.
  • the technology can be used to present highly customized and therefore much more interesting presentations to humans and animals. For example, George could select the video of himself from the technology, add a selected picture of himself, and add selected icons or symbols that Fluffy would understand, and have an aggregated presentation using those content elements delivered to Fluffy at any time of the day or night on any day of the year. George could also create a multimedia presentation made up of artifacts received from Fluffy and other elements for presentation to himself or to others.
  • the communications 22 between humans and animals that are assisted by the technology 20 can be simple and direct and require no interpretation.
  • Fluffy's barking in a certain way can be a produced artifact 19 of a communication that would be clearly understood by George without assistance or interpretation by the technology.
  • the interface features and client devices need only detect the artifact of Fluffy's communication and pass it through the networks to the server, which can then pass it back through the networks to other interface features and client devices associated with George.
  • a camera on a laptop computer in George's living room can capture video of Fluffy barking and the video can be passed to the server and then from the server to George's mobile phone. George may be able to understand what Fluffy is trying to communicate simply by watching the barking.
  • the artifacts 19 of a communication that a human or animal is trying to provide to the other may require interpretation by the interpreters 29 before the content of the communication can be understood.
  • the fact that Fluffy paces back and forth at a given time of the day may indicate that Fluffy needs to be walked.
  • a video that captures Fluffy pacing could then be interpreted by the processes 43 as meaning that Fluffy needs to be walked.
  • the interpreted communication 31 could then be provided to George through another client device.
  • the technology 20 can assist communication between a human and animal by (1) providing a simple conduit for passing, storing, and delivering artifacts of communication from one to the other, (2) interpreting artifacts of communication produced by one of them and passing, storing, and delivering the interpreted communication to the other, (3) receiving a communication from one of them and articulating artifacts that represent the communication, and delivering the artifacts to the other, or (4) any combination of two or more of those.
  • artifacts as a communication, the articulation of artifacts based on a communication, or a combination of them, can be done by a wide variety of hardware, firmware, or software running on a wide variety of devices of the technology 20.
  • the artifacts or communications are passed to the server and the interpretation and articulation are done at the server.
  • the artifacts are interpreted or the communications articulated at the client devices and the interpreted communications or the articulated artifacts are passed to the server.
  • some of the processing can be done at client devices and some of the processing can be done at the server.
  • the databases associated with the server can store a wide variety of different kinds of information useful in the operation of the technology 20.
  • users of the technology can register account information about themselves and about animals, such as their pets.
  • the information can be updated from time to time and stored in user accounts.
  • a wide variety of information can be registered with the technology.
  • George could identify on an online mapping system, such as Google maps, the locations of all of the playgrounds that are near George's house and to which Fluffy enjoys going to visit with his dog buddies.
  • Other dog owners could do the same with respect to their dogs.
  • the technology could tell George that Muffy, a dog buddy of Fluffy, will be at the Lincoln playground at the corner of Main and Willoughby at three o'clock in the afternoon.
  • This message could be sent to George on his mobile phone at the office. George could then reply with a message to cause the technology to tell his daughter, who is at home, to take Fluffy to the Lincoln playground to visit with Muffy at 3 PM.
  • the technology could cause an artifact to be played on the sound system in George's house to attract Fluffy to watch the TV monitor. Then a picture of Muffy romping at the Lincoln playground, and previously captured, could be displayed to Fluffy. Fluffy could then step on a switch to indicate that she wants to go to the Lincoln playground to play with Muffy. This artifact would trigger an alert to George's daughter, who could then take Fluffy to the Lincoln playground.
  • a dog or cat could be attracted to a location to initiate interaction by using any kind of device or mechanism that can provide an artifact that is attractive to the animal.
  • the artifact could be an odor as animals are often highly sensitive to odors and attracted by them.
  • a device or mechanism capable of emitting an odor in response to a command could be provided. (See for example the discussion at http://en.wikipedia.org/wiki/Digital_scent_technology.) By coordinating the emitting of the odor with, for example, the beginning of a video presentation, the animal could be attracted to watch the video presentation.
  • the device or mechanism could be one that provides vibrations or other tactile artifacts in response to a command.
  • Animals are often sensitive to and attracted by sounds and physical sensations.
  • the initiation of the associated video (or any other artifact or communication to be presented to the animal) could be delayed until a sensor determines that the animal is in the vicinity of the television or other device that is presenting the artifacts. That is, the sequence could be first to command the release of the odor or vibration, next to monitor for the presence of the animal, and finally to initiate the presentation of the artifacts.
  • the databases can store information about and examples of behavior, size, age, species, ownership, location, favorite foods, relationships and friendships among different animals, and associations of animals with humans, among other things.
  • the databases can include information about communications and artifacts of communication associated with individual animals or groups of animals and individual humans or groups of humans.
  • the artifacts can be associated in the database with interpreted communications that relate to the artifacts.
  • the database can store communications that might occur or that the user might wish to occur and could associate those communications with artifacts that can be used to convey them.
  • a standard communication that George might want to convey to Fluffy would be "You may now go out of the house by stepping on the switch that opens the gate at the front door.”
  • the related artifact might be turning on a signal light on the gate.
  • the database could store the communication and associate it with an action to be performed by a client device, namely turning on the light.
  • George might choose, through an interface on a telephone, an entry that said "Fluffy, it's okay to go out of the house.”
  • the server could use the database to associate it with the action of turning on the light.
  • the server would then send an instruction to a client device at the house causing the light to be turned on.
  • Fluffy may miss George and want to see live streaming video of George at work. Fluffy could step on the switch three times as an artifact to signal this message. George, at work, could turn on the camera of his mobile phone and allow it to capture video which would then be streamed to the television and George's living room for Fluffy to watch.
  • the technology 30 can be used in a wide variety of situations to assist humans to communicate with animals. We describe several additional examples below, but these are merely a few examples of thousands of possible cases.
  • a cat owner is at work. The owner has left her cat, Buffy, at home alone for the day. At lunchtime, the cat owner is wondering how Buffy is faring.
  • the cat owner uses an app on her mobile phone to connect to the server of the technology 20.
  • the app displays a user interface screen on which the cat owner can select "Check in on Buffy.”
  • the server then sends instructions to a client device in the form of a laptop sitting on the dresser in the room where Buffy is spending the day.
  • the instructions cause the laptop to turn on the camera of the laptop and begin to stream video of Buffy to the server, which then streams it to the owner's phone.
  • the video shows that Buffy is running around in a circle licking her lips.
  • the interpreter at the server determines that Buffy is hungry.
  • the server causes the owner's mobile phone to display a message underneath the streaming video that says "Buffy is hungry.” This is the interpreted message from the artifact of Buffy running around in a circle and licking her lips. The message confirms to the owner the owner's guess that Buffy is hungry based on watching the streaming video.
  • the server can send an instruction to the owner's mobile telephone to cause it to provide a dialogue that asks "Do you want Buffy to be fed?" If the owner's replies that she does, that communication is articulated at the server, using the database, into artifacts to be executed to enable Buffy to be fed. For example, the server can send a video back to the laptop to be displayed to Buffy that displays a symbol or a video of the owner or some other artifact that Buffy has come to know as the signal that she can feed herself from a storage container in the room. In this example, Buffy has been trained or preconditioned to behave in a certain way when certain artifacts are presented to her from client devices.
  • Buffy then goes to the storage container on the floor of the room which has now been conditioned by an electronic switch to be accessible. Buffy opens the storage container and eats the food.
  • the laptop can stream video to the server and the server can stream the video to the owner's mobile phone showing that Buffy is eating.
  • the technology that we have described can provide a universal system or platform 202 that enables the teaching of an animal 204 through the animal's senses 206 (vision, hearing, or smell, for example) to enable the animal to learn vocabulary 208, to learn to respond to commands 210, and to learn about "life" 212, among a wide variety of other lessons.
  • the animal can be taught sign language 214, and deaf or blind animals can be taught to use other senses.
  • training or teaching 182 can include presenting information 184 to the animal 186, determining a reaction or behavior 188 of the animal that suggests that what is being trained or taught has been learned or partially learned, and providing feedback (positive or negative) or rewards or punishment (or both) 190 when the determined reaction or behavior meets a threshold.
  • the presentation of the information and the providing of feedback to the animal, including rewards or punishment, are done using artifacts presented through interface features of client devices 218.
  • the determining of the reaction or behavior of the animal is done by sending and receiving artifacts from the animal through interface features of client devices 218.
  • the word or phrase can be presented to the animal in the form of speech of an owner or handler or other human who is associated with the animal, or speech of a human who is not associated with the animal.
  • the speech can be presented through an audio or video device, including a microphone, a television, or a display screen of a computing device, for example, or a wide variety of other devices 218.
  • the speech may have been pre-recorded or may be delivered in real-time from a remote location where the human is located.
  • the word or phrase can be presented in other than speech form, for example, in writing, or using a token or icon or other graphical representation.
  • the non-speech presentation could occur on a display, a light matrix, or any of a variety of other text and icon presentation devices.
  • the process of determining the reaction or behavior of the animal in response to the presentation of the information includes collecting information through the client devices about the behavior or actions 207 of the animal.
  • the behavior and actions can include motions, fragrances, noises, and bodily fluids, for example, and a wide variety of other artifacts. These can be detected using cameras, microphones, fragrance detectors, and substance detectors that are part of the client devices.
  • once this information has been collected, the platform can analyze it using sophisticated mathematical techniques including classifiers, predictive modeling, statistical analysis, signal processing, and others. The analysis can take advantage of stored information that associates animal behavior and actions with meanings, as discussed earlier.
  • the feedback presented to the animal can take a variety of forms, for example, sounds, images, video, food, access to play things, lights, access to locations, and a wide variety of other feedback.
  • the platform 202 controls and executes the delivery of the training, which can be done on a predetermined schedule, or at times triggered by behavior of the animal, or at times triggered by a human or another animal.
  • Training can proceed automatically and repetitively under the control of the platform 202.
  • Automatic training can take advantage of recorded audio, video, text, graphics, images, and other kinds of artifacts that are stored on the platform 202. These artifacts can be generated and stored from material provided by a human who is associated with a particular animal, such as the animal's owner, or they can be generated using professional speakers, actors, graphic designers, writers, and others and stored for use with several or a large number of animals.
  • a human 221 can play a role in the training by providing artifacts such as speech, actions, typed input, commands, and images, for example, that represent training information, determinations made by the human about the reactions and behavior of the animal, and feedback 207, and the control of the platform 202 to do any of those things.
  • the human also can receive information associated with the training of the animal. For example, the human can receive information indicating that training is proceeding, information about the specific training being given, information about feedback being given to the animal, information about the success of the training, and audio, video, and other artifacts that represent the training (for example, a pet owner can receive video on his cell phone showing his pet being trained to understand the word "ball"). All of this information can be delivered through client devices 219.
  • the platform 202 can also be used to teach games to an animal, to enable the animal to play the games either alone or with other animals and humans either at the same location or at other locations, or to play a game with one or more animals directly without the involvement of a human.
  • We use the term games broadly to include, for example, any activity of an animal that involves active participation of the animal and is entertaining to the animal and, in some cases, has a goal or objective that the animal is to achieve.
  • the game is one performed alone by the animal.
  • the game can involve interaction with one or more humans or with one or more other animals.
  • games 226 can involve back and forth interaction 228 between and among players 230, 232 (humans and animals), or can involve individual play by an animal, or can involve play or other interaction 234, 235 between the platform 236 and one or more players, or can involve all of these.
  • some games are played without any external artifacts or devices, for example, a simple game in which a dog runs successfully to the four corners of a playing field.
  • other games involve game elements 238 such as balls, courts, tokens, bones, water, and a wide variety of other elements.
  • technology 20 can provide a universal system or platform 302 that enables the playing of games by one or more players 304, 306, who can be at the same location or at different locations and can interact directly at a given location or indirectly through the platform or a combination of the two.
  • the technology includes client devices, user interfaces, and artifacts sent and received, 318, 320, with respect to each of the players. Through the devices, interfaces, and artifacts, the platform can perform a number of operations with respect to the playing of the game.
  • These include teaching 308 players about the game including its rules and rewards, operating as a participant in the game 310, keeping score or otherwise tracking activity and success in the game 312, providing rewards and other feedback 314, and reporting 316 to other players and to humans related to animals involved in the game on any aspect of the game. These activities can be effected in the form of actions, perceptions, and reactions 307, among other things.
  • We use the term activity broadly to include, for example, any conduct, performance, or action that engages the animal for a period of time.
  • An activity may include a productive activity, such as arts and crafts, painting, or music, that yields a product or an outcome, for example.
  • An activity could also involve a theoretically unproductive activity, such as sleeping. Activities can also include, in the broadest sense, the training, games, and entertainment that we describe elsewhere.
  • the activity can be engaged in by one or more animals 428 with or without the participation of one or more humans 430.
  • One or more of the animals and one or more of the humans may all be located at the same place or they may be located at different places and cooperate through the platform 432.
  • Tools and materials 434 can be provided at one or more of the locations.
  • the platform can provide a variety of features with respect to arts and crafts activities, including teaching the animal or human about the game; releasing or activating tools or materials to be available to the animal for use in the activity; capturing audio, video, images, or graphics that result from the arts and crafts activity of the animal; providing rewards, praise, or other feedback to the animal; serving as an intermediary between the animal and other animals or humans with respect to the arts and crafts activity, and others.
  • Functions, operations, and devices analogous to those shown in figures 6 and 9 could be used in such activities.
  • We use the term entertainment in its broadest ordinary sense. As shown in figure 11, entertainment 502 of an animal 504 can be facilitated by the platform 506 (using technology 20 in a mode analogous to the ones shown in figures 6 and 9).
  • the platform can notify the animal when the entertainment is available, determine whether the animal is available for the entertainment, capture audio, video, images, graphics, odors, and other artifacts while the animal is receiving the entertainment, report to one or more other animals or to one or more humans 510 about the entertainment or the animal's reaction to the entertainment, and perform a wide variety of other actions.
  • the subject matter of the teaching, training or learning can be any of a wide variety of subjects.
  • the animal can be taught skills or knowledge or information or relationships, for example.
  • Fluffy could be taught vocabulary. It is believed that dogs are capable of learning vocabulary that includes as many as hundreds of "words" and concepts. The words can involve things, actions, and other kinds of vocabulary. Typical vocabulary words include come, sit, bring, stay, and drink. Concepts could include, among a wide variety, "Do you want to take a walk?", "Bring the ball", "Do you want a treat?", and "Do you want to watch TV?"
  • the technology could be used to teach vocabulary in a way that is useful and effective and can be applied even when the communicator who is training the animal is at a different location or when there is no human communicator engaged in the training.
  • teaching of vocabulary and any other kind of teaching or training can be done by the technology itself without the participation of a human.
  • the technology can present artifacts to Fluffy that could include the presentation of images of a ball or of many different kinds of balls through a television, a tablet computer display, a cell phone, or any other kind of client device.
  • the artifacts could also include a real ball to which Fluffy is given access, for example, by opening an electrically controlled door of the box that contains the ball.
  • the artifacts could also include the presentation of the written word ball, or an audio or video recording of a human speaking the word ball, either alone or in the context of a sentence.
  • the client device that is presenting the artifacts can be controlled by the servers to present the artifacts repeatedly in one or more sessions so that Fluffy comes to associate the spoken word ball or the written word ball or both of them with a ball.
  • the technology could acquire video or audio clips of Fluffy during the course of the training and provide the video or audio in real time to a human at a remote location or could store the clips for later use. This can allow a human trainer to determine whether Fluffy is responding in a way that indicates that Fluffy is learning to understand the meaning of the word ball. In some cases, the technology can analyze Fluffy's behavior and reaction to the training instances as a way to determine whether Fluffy is learning the meaning of the word ball.
  • the teaching of the word ball could involve a much more active participation of a human, such as Fluffy's owner, George.
  • the artifacts presented to Fluffy could be presented in real-time from a remote location. George could release the door of the box containing the ball by tapping on a control presented on the screen of a handheld device. George could watch Fluffy's reaction in real-time from the remote location on a screen of the client device.
  • Fluffy's reaction to the training may include barking or motions that Fluffy begins to use to represent his articulation of the word ball.
  • the technology may analyze audio and video of the timing, volume, frequency, and other characteristics of the barking and the timing, speed, and trajectory of the motions in order to determine whether and if so what noises and motions Fluffy uses to express the word ball. Later, this information can be used to respond to Fluffy, for example, releasing the ball from the box when Fluffy barks in the way that means ball.
  • the training can also be of commands or instructions or directives or more generally sentences that include nouns and verbs.
  • George may want to teach Fluffy (or to have the technology teach Fluffy) the meaning of the command "find the ball”.
  • the training could be done by repeatedly following a sequence each iteration of which includes some or all of the following steps: the ball is placed or hidden at a location different from the location of a client device that Fluffy normally watches; for example, George may have placed the ball next to the washing machine in the laundry while the computer display that Fluffy often watches is in the dining room down the hall. George may have placed a number of balls in different locations so that when he is away from the house, a series of iterations of the training exercise can be conducted.
  • the training may be started automatically by the technology or manually by George taking some action on a client device at a remote location. For example, George may be on a business trip and when he wakes up in his hotel room he might pick up his cell phone, open the technology's app, and tap the button that says “train Fluffy” and then the button that says “find the ball”.
  • the technology uses a camera on a client device to watch for the presence of Fluffy. Once the technology determines that Fluffy is in front of the display, it can begin to present an artifact in the form of a real-time audio or video clip of George or someone else saying "find the ball, Fluffy". The technology can then determine if Fluffy has departed, presumably to hunt for the ball. The video can be presented in real-time to George in his hotel room or stored for later playback. If Fluffy finds one of the balls and brings it back to a place where the camera can capture Fluffy, the technology can use image recognition techniques to confirm that Fluffy has brought the ball back.
  • the technology can automatically (or George can manually) then release the door of a box that contains a treat as a reward for Fluffy.
  • the training exercise can be repeated many times, but typically would not be repeated too often in a single day in order not to provide Fluffy with too many treats.
  • An animal also can be taught lessons about its life as an animal. These lessons could include lessons about dangers, risks, pleasures, relationships, development, aging, death, health, grooming, sex, the environment, time and the passage of time, emotions, and a wide range of other matters of the kind that relate to living as an animal.
  • the animal can be taught about hazards such as being followed or attacked or bitten by a larger animal or a predator, the risks of man-made devices such as cars in streets, doors that have been unintentionally left open and therefore can allow the animal to run away, and others.
  • the training can be done by presenting artifacts to the animal that illustrate the hazard in a way that would be recognizable and understandable by the animal.
  • George could have the technology present video and audio examples that show dogs like Fluffy, or even spliced-in images and noises of Fluffy himself, running into a street, being hit by a car, and feeling injury. Cameras and microphones can capture the reaction of Fluffy to the presentations to determine whether the lesson is being understood and learned. In some cases when hazards have been taught, it would be possible for the technology to thereafter present a staged opportunity for the animal to put itself into the hazardous situation and for the animal to be rewarded for avoiding it.
  • an electrically operated latch on the door could be opened by the technology to allow the door to swing partly open as a way to test Fluffy to see whether he will leave the house and be exposed to the risk that would involve. If Fluffy recognizes the hazard and refrains from leaving the house, a reward could be presented to Fluffy by opening the door of a box containing a treat.
  • deaf dogs can learn sign language.
  • the technology offers a platform for training a deaf dog in understanding sign language.
  • Artifacts that can be used in this training could include videos of human beings making the signs associated with vocabulary, concepts, emotions, sentences, and other meanings.
  • the artifacts could also include videos, audio, images, and other artifacts that represent the concept or meaning that is being signed.
  • Fluffy has a deaf sister, Puffy, owned by Zelda.
  • Zelda wants to teach Puffy to understand the sign language for "ball".
  • Zelda could prerecord video clips showing herself making the sign for ball. The clip could then be used as one of the artifacts presented to Puffy each morning after Zelda goes to school as part of the sign language training.
  • the technology could use sign language videos created by models rather than by the pet owners themselves.
  • the artifacts could include live or recorded language spoken by owners of the animals or other humans.
  • the artifacts would also include representations of the things or actions associated with the spoken language, representations that could be heard or otherwise understood by the animals. For example, suppose that Fluffy has a blind brother, Duffy, owned by Igor. If Igor wanted to train Duffy to understand the word "ball", Igor could have the technology repeatedly play the spoken word "ball" for Duffy while releasing a ball from the electrically controlled box.
  • the training will work better when the artifacts include audio and video clips from the animal's owner and audio and video clips of objects and actions and environmental conditions that are familiar to the animal.
  • the technology can enable a wide variety of games to be played by the animal itself, the animal with other animals, the animal with one or more humans, and multiple animals with one or more humans.
  • a simple example could be a game in which Fluffy raises one of its paws and touches an object that contains a sensor that is part of one of the client devices.
  • Fluffy would be taught how to play the game, for example, by rewarding him when he successfully touches the object. Assuming that Fluffy thinks this is a fun game, in the future Fluffy could play the game without requiring a reward other than perhaps the ringing of a bell or some other ephemeral reward.
  • the technology would activate the sensor and condition it to be ready for Fluffy to touch.
  • Fluffy could simply begin by touching his paw to the sensor and the ephemeral rewards such as a bell ringing would be the result. Fluffy could continue to do this repeatedly until he got tired of the game.
  • the technology could determine that he had gotten tired by observing that he had left the room or had stopped playing the game. Video of Fluffy playing the game can be captured and streamed in real time to a remote location where George is playing golf, or can be stored for later use.
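  • A minimal sketch of the paw-touch game described above, assuming hypothetical sensor_touched, ring_bell, and dog_in_view callables standing in for the sensor, the ephemeral reward, and the camera:

        import time

        def play_touch_game(sensor_touched, ring_bell, dog_in_view,
                            boredom_timeout_s=120):
            last_touch = time.time()
            touches = 0
            while True:
                if sensor_touched():              # stubbed sensor read
                    ring_bell()                   # ephemeral reward
                    touches += 1
                    last_touch = time.time()
                # Stop when the dog has left the room or stopped playing for a while.
                if not dog_in_view() or time.time() - last_touch > boredom_timeout_s:
                    break
                time.sleep(0.5)
            return touches                        # could be logged or streamed to George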
  • the broad range of games that could be handled by the technology include games that involve objects such as balls, targets, game boards, playing fields, rewards, scoring, teams, and a wide variety of other features.
  • another game could be, in effect, "kick the ball when it comes by.”
  • Some games can involve play between two animals, for example, Fluffy and Puffy, located at two different locations and played by each of them interacting with client devices at its own location.
  • the game could involve playing with two identical balls, one at each location. Each dog could see the other playing with its ball, and the play could proceed according to rules that involve both of the dogs and both of the balls.
  • animals such as dogs could be provided with a platform similar to Facebook that enables dogs to become friends and to keep up with the activities of one another.
  • Postings on behalf of each dog could be done automatically by the technology using video, audio, images, and other communications that capture activities of the dog, "speech" of the dog, and other information known to the technology.
  • Each of the dogs could enjoy watching the postings of other dogs, including its "friends" and stranger dogs.
  • the dog may be taught to provide its own postings, and to "like" and otherwise react to the postings of other dogs.
  • an owner or other human associated with the dog could do the postings and interpret them for the dog.
  • Fluffy could be taught to paint using real paint on a real canvas laid on the ground. Demonstrations of how to do this could be presented by the technology and client devices, and feedback and rewards can also be provided to Fluffy.
  • a very large canvas could be provided for this purpose by George or another human being.
  • a wide range of lessons and trainings could be provided to increase the skill of Fluffy in painting.
  • electronically sensitive surfaces could be used in association with displays (either separate from the sensitive surfaces or layered below them) to enable Fluffy to paint electronically. To paint, Fluffy could walk on the sensitive surface or touch it in various places or lick it to indicate where color is to be placed on the drawing. All of the steps in this activity could be observed by George at a remote location or locally.
  • Fluffy could be taught to perform a kind of music.
  • Fluffy could make noises, bark, engage in motions, and otherwise communicate with one of the client devices to indicate sequences of sounds that Fluffy intends to be part of the music he is creating.
  • the technology could interpret each of the sounds, motions, and other communications, either individually or in combinations as representing sounds of selected pitches, volumes, and tonality.
  • the technology could play its interpreted sounds back to Fluffy or George or another animal or another human.
  • Fluffy could learn which of his communications is being interpreted as which sound and in that way become more active and effective in creating music.
  • the same approach could be used with multiple animals.
  • Fluffy and Puffy and Duffy could together make music either from a single location, or from multiple locations.
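  • One possible (purely illustrative) way to interpret a dog's sounds as notes, sketched in Python: the bark's pitch picks the note, its loudness picks the volume, and its length picks the duration. The pitch, loudness, and duration values are assumed to come from simple audio analysis that is not shown here.

        NOTE_NAMES = ["C", "D", "E", "F", "G", "A", "B"]

        def bark_to_note(pitch_hz, loudness, duration_s):
            # Map the bark's fundamental frequency onto a seven-note scale.
            index = int(pitch_hz / 100) % len(NOTE_NAMES)
            volume = max(0.1, min(1.0, loudness))          # clamp to a playable range
            duration = "long" if duration_s > 0.5 else "short"
            return {"note": NOTE_NAMES[index], "volume": volume, "duration": duration}

        # Example: a short sequence of barks becomes a short melody to play back.
        melody = [bark_to_note(320, 0.8, 0.7), bark_to_note(510, 0.4, 0.2)]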
  • the activities hosted by the technology could also include e-mail communication from and to an animal.
  • the technology would receive artifacts from the animal through the client devices, interpret them as communications, and use an e-mail server to formulate and send an e-mail based on the artifacts.
  • the e-mail can always be sent to a predetermined human being, such as the owner of a pet. In some cases, for example, Fluffy could be taught how to address his e-mail to one of his animal friends or to a human.
  • the text and images of the e-mail can be interpreted or used directly to convey the content of the e-mail to Fluffy and indicate who originated the e-mail.
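  • A minimal sketch of how artifacts could be turned into an e-mail to a predetermined human, using Python's standard smtplib and email libraries. The SMTP host, the addresses, and the attached image file are assumptions for illustration; a real deployment would also handle authentication and the interpretation of the artifacts.

        import smtplib
        from email.message import EmailMessage

        def send_artifact_email(image_path, caption, smtp_host="smtp.example.com"):
            msg = EmailMessage()
            msg["From"] = "fluffy@example.com"        # address assigned to the animal
            msg["To"] = "george@example.com"          # predetermined recipient (the owner)
            msg["Subject"] = "A message from Fluffy"
            msg.set_content(caption)                  # text interpreted from the artifacts
            with open(image_path, "rb") as f:
                msg.add_attachment(f.read(), maintype="image",
                                   subtype="jpeg", filename="fluffy.jpg")
            with smtplib.SMTP(smtp_host) as server:   # no authentication shown in this sketch
                server.send_message(msg)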
  • following an accident, Fluffy could be provided with physical therapy using the techniques described earlier. If artifacts of Fluffy detected by and analyzed by the technology indicate that Fluffy is hyperactive or upset about something, the technology could alert Fluffy's veterinarian. The artifacts could be presented to the veterinarian with a request for a prescription for Prozac. The veterinarian could write a prescription and have the medicine delivered to George's house for Fluffy. In some cases, manufacturers of
  • a wide variety of the kinds of features that we have discussed here, and others, can be used in conjunction with broadcast and cable presentations of video and other content to animals, such as channels devoted to showing content to pets. Such content is typically presented passively to the animal.
  • the features that we have discussed here, and others, can provide a simple or very rich interactive component that supplements or enhances the content being presented.
  • An example of a technology that would enable more active interaction with an animal either separately or in conjunction with other sources of content being presented to the animal is the TV dongle available from Google, called Chromecast.
  • the dongle can be attached to a video input port of a television, e.g., the HDMI port.
  • the dongle then can connect wirelessly through a local Wi-Fi network to the Internet.
  • other local devices, such as mobile telephones and other mobile devices, can communicate with the dongle through the local Wi-Fi network.
  • a remote device such as a mobile phone of an owner of a pet who is located in a different place, can communicate through the Internet to the dongle.
  • a camera in the room can capture the behavior of the dog.
  • the video can be streamed to the dog owner's mobile device in another location.
  • the owner can decide to interact with the dog. For example, the owner can then stream his voice or locally captured video of himself through the mobile phone and then through the dongle to cause the video or voice to be played to the dog, for example, as an interruption to a channel devoted to showing content to pets.
  • the interaction can include commands given by the owner to the dog, game playing between the owner and the dog, and a very wide variety of other activities.
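  • A minimal sketch of the interaction flow around the dongle: the room camera streams to the owner's remote device, and if the owner chooses to respond, his clip interrupts the pet channel. Every streaming and casting call here (capture_room_frame, send_to_owner, owner_response, cast_to_tv, resume_pet_channel) is a hypothetical stand-in; the actual Chromecast/Google Cast protocol is not shown.

        def interaction_loop(capture_room_frame, send_to_owner, owner_response,
                             cast_to_tv, resume_pet_channel, max_frames=10000):
            for _ in range(max_frames):
                frame = capture_room_frame()      # camera watching the dog
                send_to_owner(frame)              # streamed to the owner's mobile device
                reply = owner_response()          # e.g., a recorded voice clip, or None
                if reply is not None:
                    cast_to_tv(reply)             # interrupt the pet channel with the clip
                    resume_pet_channel()          # then return to the regular programming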
  • using the technology and platforms that we have described, a wide variety of interactions among animals and between animals and humans is possible.
  • the technology and platforms enable much richer, continuous, intense, pleasant, and productive relationships to be established, developed, and grown between individual humans and animals with which they are associated.
  • the technology and platforms can give great pleasure to the owner of the pet, and to the pet and can significantly improve the knowledge, behavior, and happiness of the animal and the human.
  • the animals need not be dogs or cats or even pets. Any animal capable of some level of communication could be involved.
  • the human and animal need not be in different locations and need not be unable to see or hear each other in some implementations. They can be in the same place and able to see or hear each other and still be benefited by assisted communication of the kind that we describe.
  • the assisted communication can be between one animal and another animal without a human involved.
  • Data analytics can be used in facilitating, operating, and using the games, activities, entertainment, training, or teaching.
  • Data analytics can be used to analyze data related to artifacts acquired from animals and humans and information about populations of animals and humans, and information available on the Internet or from databases about the behavior of individual humans and animals or groups of them that may bear on the game, activities, entertainment, training, or teaching.
  • an activity center 200 provides an animal 202 (e.g., Fluffy) with access to one or more types of activities.
  • the activity center 200 can be an electronic device, such as a large-size tablet that displays or otherwise presents content, such as images, video, audio, or other content on a display 204.
  • a lateral dimension (e.g., a length or width) of the activity center 200 can be at least about 12 inches, e.g., at least about 24 inches, at least about 36 inches, at least about 48 inches, or another size.
  • the display 204 can be a display having a touch-sensitive surface. By a touch-sensitive surface, we mean an electronic surface whose operation can be controlled by touch.
  • the activity center 200 can be placed on the floor 201 such that Fluffy 202 can walk, run, or stand on the activity center 200, can watch the content that is displayed, or can do both at the same time, thus interacting with the touch-sensitive display.
  • Fluffy 202 can select the type of content that is displayed on the activity center by interacting with the touch-sensitive display, e.g., by interacting with a particular section of the display in order to activate a certain type of content, as discussed below.
  • the activity center 200 can keep Fluffy 202 occupied and stimulated while its owner 208 (e.g., George) is unavailable or out of the house.
  • an activity center 250 includes a touch sensitive horizontal surface 252 that Fluffy can walk, run, or stand on that can be placed on the floor 201.
  • the activity center 250 also includes a display 254 that is separate from the touch sensitive surface.
  • the display 254 can be positioned vertically (e.g., hung on a wall 256 or standing on a table) and near, such as adjacent, to the touch-sensitive surface 252.
  • the display 254 can be, e.g., a computer monitor, a television, or another type of display.
  • the activity center 200 can include electronics and communication capabilities to send messages or other information 209 to George 208 representative of artifacts of communication from Fluffy, such as recordings of Fluffy's barking or other noises, still images, audio, or video images of Fluffy interacting with the activity center, artwork created by Fluffy, or other artifacts of communication, as described in greater detail below.
  • Fluffy's noises can be received by a microphone 216. Images can be captured by one or more cameras (e.g., cameras 300, 306 shown in Fig. 15).
  • the activity center 200 can also receive messages and other information or commands 207 from George 208 representative of artifacts of communication from George to Fluffy, such as still images, audio, or video images of George, instructions to the activity center about actions to be taken, content to be captured, or content to be presented.
  • the activity center can include speakers 214 to play sounds.
  • the activity center 200 can incorporate one or more communications devices 210 to enable the activity center 200 to receive content and other information and commands, transmit content and other information, or both.
  • the communications devices 210 can include devices that enable the activity center 200 to connect to, e.g., the Internet (through a wired or wireless connection), a cellular telephone network, or another type of communications network.
  • the communications devices 210 can enable the messages (we sometimes use the term messages broadly to include, for example, messages, instructions, commands, or other information) 207, 209 to be sent to or from George or Fluffy.
  • communications devices 210 can include devices that enable the activity center 200 to receive radio signals, broadcast or cable television signals, or streaming video or music (e.g., from an Internet streaming service).
  • the communications devices 210 can include a ChromecastTM device or another type of streaming device.
  • the communications devices 210 can include devices that enable the activity center 200 to communicate with a client device 211, such as a computer, laptop, pad computer, mobile device, mobile telephone, telephone, television, music system, appropriately wired refrigerator, storage container, door or gate, pet house, house, automobile, boat, kennel, or veterinary facility, to name a few.
  • communications devices 210 can include BluetoothTM-enabled devices.
  • the display 204 can be arranged in one or more sections, each section or groups of sections providing a different activity for Fluffy 202.
  • the display 204 is divided into six sections 206a-206f (which we sometimes refer to collectively as sections 206); however, the display 204 can be divided into more or fewer sections or groups of sections.
  • the activities accessible to Fluffy 202 through the sections 206 of the display 204 can include, e.g., animal television, animal social media, owner resources, arts activities, music activities, play, training, or other activities, or a combination of any two or more of them.
  • section 206a provides animal television
  • section 206b provides animal social media
  • section 206c provides owner resources
  • section 206d provides arts and crafts activities
  • section 206e provides music
  • section 206f provides play.
  • Each section 206 can be activated by a selection, e.g., a touch on the section, e.g., by Fluffy or George.
  • Fluffy 202 can activate a particular section 206 of the display 204 by touching the section 206, e.g., by stepping or walking on the section 206 (e.g., anywhere on the section 206 or on a particular part of the section 206), tapping her paw on the section 206, scratching the section 206, licking the section 206, or touching the section 206 in another way, or a combination of any two or more of them. For instance, if Fluffy 202 wants to watch television, she can walk onto section 206a; if Fluffy 202 wants to create a work of art, she can walk onto section 206d.
  • Fluffy 202 can activate a particular section 206 by interacting with another device, such as by pushing a button remote from the activity center 200, interacting with another computing device, or in another way.
  • George can remotely control which section 206 of the display 204 is activated, e.g., by sending a message to the activity center 200, by accessing a user interface on another device (e.g., a personal computer or a mobile computing device), by remotely accessing a control interface of the activity center 200 using another device (e.g., a personal computer or a mobile computing device) that communicates with the activity center 200, or in another way.
  • the number of sections 206 is set by default, e.g., by the software of the activity center 200.
  • the sections can be square or rectangular, but need not be, and can be other shapes.
  • a single, large display 204 is arranged electronically into sections 206, with each section corresponding to a region of the display 204.
  • the activity center can be assembled from multiple, smaller displays, each of which corresponds to a section 206.
  • the particular section 206 in which each activity is located can be fixed so that Fluffy 202 can learn which section of the display 204 to touch in order to activate a desired activity.
  • George 208 can specify the number of sections 206, the activity provided by each section, or both, e.g., by selecting one or more of the activities offered by activity center 200 to which George 208 wants Fluffy 202 to have access.
  • George can specify a time of day at which a particular activity can be provided. For instance, George may want Fluffy to have access only to active activities, such as arts and crafts, music, play, or training, in the morning, and may want Fluffy to have access to more passive activities, such as animal television or animal social media, in the afternoon.
  • George can specify activities to activate in real time or can specify a schedule of activities, e.g., by sending a message to the activity center 200, by accessing a user interface on another device, by remotely accessing a control interface of the activity center 200 using another device, or in another way.
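  • A minimal sketch of the kind of schedule George could specify, mapping times of day to the activities Fluffy is allowed to select; the time windows and activity names are illustrative only.

        from datetime import time as dtime

        SCHEDULE = [
            # (start, end, activities available on the display during the window)
            (dtime(6, 0),  dtime(12, 0), {"arts", "music", "play", "training"}),
            (dtime(12, 0), dtime(18, 0), {"animal_tv", "animal_social_media"}),
        ]

        def allowed_activities(now):
            for start, end, activities in SCHEDULE:
                if start <= now < end:
                    return activities
            return {"animal_tv"}            # default outside the scheduled windows

        # Example: allowed_activities(dtime(9, 30)) -> {"arts", "music", "play", "training"}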
  • animal television is turned on.
  • Animal television can be a broadcast or cable television channel or streaming content that includes content that is geared toward animals, such as television programming for dogs.
  • the programming is displayed directly on the display 204.
  • Fluffy's selection of section 206a causes the activity center 200 to send a command to a television 212 or another display device such that the programming can be displayed on the television 212 or other display device.
  • animal social media can include video, images, or sounds of Fluffy, Fluffy's friends (e.g., other animals Fluffy may recognize, such as a neighbor's dog), George, other people Fluffy may recognize (e.g., other family members or neighbors), Fluffy's favorite places, or other content, or a combination of any two or more of them.
  • the social media content is displayed directly on the display 204.
  • Fluffy's selection of section 206b causes the activity center 200 to send a command to the television 212 or another display device such that the social media content can be displayed on the television 212 or other display device.
  • Owner resources can include content geared toward an animal owner, such as radio or television programming geared toward an animal owner, training tips, notifications of local special events, coupons, advertisements for animal supply stores, information about medications or veterinarians, or other content geared toward an animal owner, or a combination of any two or more of them.
  • the owner resources are displayed directly on the display 204.
  • Fluffy's selection of section 206c causes the activity center 200 to send a command to the television 212 or another display device such that the owner resources can be displayed on the television 212 or other display device.
  • the owner resources displayed in section 206c can be related to content that Fluffy is or has interacted with on the display 204. For instance, if Fluffy repeatedly watched a video of a dog bone earlier in the day, an advertisement for dog bones can be displayed to George when the owner resources section 206c is activated.
  • when Fluffy 202 steps on or otherwise selects section 206d (in this example), arts and crafts activities, such as electronic painting, are activated. For instance, to paint, Fluffy could walk on, pat, lick, scratch, or otherwise touch the display 204 to create an electronic painting. In some examples, Fluffy's painting can be formed based on the locations of Fluffy's touch such that the painting is a trail of where Fluffy walked, patted, licked, scratched, or otherwise touched the display.
  • the painting can also reflect pressure (e.g., how hard Fluffy touched the display), type of motion (e.g., whether the touch was a walk, pat, lick, scratch, or another type of touch), speed (e.g., whether Fluffy was sluggish or moving quickly), or other information about Fluffy's interaction with the display 204.
  • pressure, type of motion, speed, or other information can be reflected in colors (e.g., shade or brightness of color, e.g., greater pressure can be depicted as a brighter color, each type of motion can be depicted as a different color), dimensionality (e.g., a three-dimensional representation of Fluffy's painting can reflect pressure in the height of the three-dimensional features), or in another way.
  • Fluffy's painting can be a single color or multicolored.
  • the colors of Fluffy's painting can be randomly assigned, assigned by position within the art activity section 206d, assigned based on Fluffy's interaction with the display 204 as discussed above, or assigned in another way.
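  • A minimal sketch of one way a single touch could be mapped to a paint color: the type of motion picks the hue, the pressure picks the brightness, and the speed picks the stroke width. The ranges and the hue assignments are only one of many possible choices.

        import colorsys

        MOTION_HUES = {"walk": 0.0, "pat": 0.15, "lick": 0.35, "scratch": 0.6}

        def touch_to_paint(motion, pressure, speed):
            hue = MOTION_HUES.get(motion, 0.8)            # unrecognized motions get purple
            brightness = max(0.2, min(1.0, pressure))     # harder touch, brighter color
            r, g, b = colorsys.hsv_to_rgb(hue, 1.0, brightness)
            width = 2 if speed > 0.5 else 6               # quick touches paint thinner lines
            return {"rgb": (int(r * 255), int(g * 255), int(b * 255)), "width": width}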
  • Fluffy can paint on the display 204 and the painting can be displayed elsewhere, such as on another local computing device, the television, or George's personal computer, or on another display.
  • the activity center 200 can save Fluffy's work of art to a local memory 218 or a remote storage 220 (e.g., George's personal computer, cloud-based storage, or another type of remote storage). For instance, the activity center 200 can save Fluffy's work of art after Fluffy has worked on the art activity for a certain amount of time (e.g., 2 minutes, 5 minutes, 10 minutes, or another amount of time), after the section 206d is a certain percentage filled with Fluffy's painting (e.g., 50%, 60%, 70%, 80%, 90%, or 100%, or another amount), or based on one or more other criteria.
  • the activity center 200 can send George a digital representation of Fluffy's work of art, such as a picture file, or a link to the digital representation, such as a link to a public or private remote storage.
  • the activity center 200 can send a message (e.g., an email message, a short message service (SMS) message, a voice mail message, a social network message, or another type of message) to George with the digital representation or the link.
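  • A minimal sketch of the saving rule described above: keep (and then send) the painting once Fluffy has painted for long enough or has covered enough of the section, whichever comes first. The thresholds are illustrative only.

        def should_save(seconds_spent, fraction_filled,
                        min_seconds=5 * 60, min_fraction=0.8):
            return seconds_spent >= min_seconds or fraction_filled >= min_fraction

        # Example: should_save(130, 0.85) -> True (the section is mostly filled)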
  • the music activities can enable Fluffy to perform a kind of music. For instance, Fluffy could make noises, bark, or make other sounds that are received by the microphone 216.
  • the activity center can interpret each of Fluffy's sounds, either individually or in combinations as representing sounds of selected pitches, volumes, and tonality, as described above, to create music.
  • Fluffy's actual sounds (e.g., barking, growling, or other dog sounds) can be superimposed over a pre-recorded song.
  • the pre-recorded song can be selected randomly, selected based on the time of day (e.g., a high-energy song in the morning, a soothing song in the evening), selected based on Fluffy's mood as determined by her sounds or her level of interaction with the activity center 200 or both, or selected based on other criteria.
  • Other sounds can also be superimposed over the pre-recorded song along with Fluffy's sounds, such as George's voice, sounds from other animals (e.g., Fluffy's friends), ambient noise that Fluffy may recognize, or other sounds.
  • Fluffy's "music" can be played back automatically or according to an action by Fluffy, such as when Fluffy walks on a certain part of the display 204 or when Fluffy does a certain type of interaction with the display 204, such as a lick, tap, scratch, or other type of interaction.
  • Fluffy's sounds, the pre-recorded song, or other sounds can be played by the speakers 214 of the activity center 200 or by speakers of another device, such as another local computing device, the television, or George's personal computer.
  • the activity center 200 can save Fluffy's work of music to local memory 218 or to remote storage 220. For instance, the activity center 200 can save Fluffy's work of music after Fluffy has worked on the music activity for a certain amount of time (e.g., 2 minutes, 5 minutes, 10 minutes, or another amount of time) or based on one or more other criteria.
  • the activity center 200 can send George a digital representation of Fluffy's work of music, such as an audio file, or a link to the digital representation, such as a link to a public or private remote storage. For instance, the activity center 200 can send a message (e.g., an email message, a short message service (SMS) message, a voice mail message, a social network message, or another type of message) to George with the digital representation or the link.
  • Play activities can include, e.g., the games described above.
  • Training activities (which are not provided by a section in this example) can include, e.g., the training activities described above, such as teaching vocabulary, life lessons, or other training topics.
  • one or more activities can be provided by default when Fluffy is not actively interacting with the activity center 200. For instance, when Fluffy has not touched the activity center 200 for more than a certain amount of time (e.g., 5 minutes, 10 minutes, 15 minutes, or another amount of time), one of the activities can be activated, such as the animal television activity, the animal social media activity, or another activity. When Fluffy next touches the activity center 200, the default activity is stopped and the activity selected by Fluffy is activated.
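  • A minimal sketch of the default-activity rule: if Fluffy has not touched the display for a while, a passive default takes over; as soon as she touches a section again, her own selection wins. The timeout and the default activity are illustrative.

        import time

        IDLE_TIMEOUT_S = 10 * 60
        DEFAULT_ACTIVITY = "animal_tv"

        def choose_activity(last_touch_time, selected_activity):
            if time.time() - last_touch_time > IDLE_TIMEOUT_S:
                return DEFAULT_ACTIVITY      # nothing selected recently; go passive
            return selected_activity         # Fluffy's most recent selection wins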
  • the activity center 200 can include a treat door 222 that can be opened to give Fluffy 202 access to a treat 224 stored behind the treat door 222 (e.g., in a hollow within the activity center 200). For instance, an electronically operated latch 221 on the treat door 222 can be activated to open the treat door 222, allowing Fluffy to access the treat 224.
  • George can send a command to the activity center 200 to open the treat door 222, e.g., if he watches Fluffy do something good, if he likes the artwork or music created by Fluffy, or for another reason.
  • the activity center 200 can determine whether to open the treat door 222, e.g., based on Fluffy's performance during an activity.
  • the activity center 200 may be programmed to open the treat door 222 if Fluffy focuses on a single activity for a certain amount of time (e.g., 5 minutes, 10 minutes, 15 minutes, or another time), if Fluffy performs well on the training activity, or for another reason.
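  • A minimal sketch of a treat-door rule of this kind; the thresholds and the daily ration are illustrative, and the decision would feed the electronically operated latch 221.

        def should_open_treat_door(minutes_on_activity, training_score,
                                   treats_given_today, max_treats=3):
            if treats_given_today >= max_treats:
                return False                 # enough treats for one day
            return minutes_on_activity >= 10 or training_score >= 0.8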
  • the activity center 200 can have a scent box 226 inside of which (e.g., in a hollow within the activity center) a scented artifact 225, such as a device or mechanism capable of emitting an odor in response to a command, can be placed.
  • the scented artifact can be activated to emit its odor to alert Fluffy that the activity center 200 is turned on, that a particular activity is active (e.g., that the arts activity is active), that a particular time of day is approaching (e.g., that it is almost the end of the day and George will be home soon), or to alert Fluffy of another status.
  • a different odor can be used for each alert.
  • a chicken liver odor may alert Fluffy that the arts activity is active
  • a beef odor may alert Fluffy that it is lunch time
  • the odor of fresh dirt may alert Fluffy that George will be home soon to take Fluffy for a walk.
  • the time at which a particular odor is emitted can be specified in advance, e.g., George can program the odor schedule before he leaves for work in the morning.
  • George can send a command to the activity center 200 to cause a particular odor to be emitted, e.g., if George is planning to come home early.
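  • A minimal sketch of an odor schedule of the kind George could program before leaving in the morning; the scents match the examples above, and emit() is a hypothetical stand-in for the command sent to the scented artifact 225.

        from datetime import time as dtime

        ODOR_SCHEDULE = [
            (dtime(9, 0),  "chicken liver"),   # the arts activity is active
            (dtime(12, 0), "beef"),            # lunch time
            (dtime(17, 0), "fresh dirt"),      # George will be home soon
        ]

        def emit_due_odors(now, already_emitted, emit):
            for when, scent in ODOR_SCHEDULE:
                if now >= when and scent not in already_emitted:
                    emit(scent)                # command to the scent box, stubbed
                    already_emitted.add(scent)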
  • the activity center 200 can have a health module 228 that can measure one or more parameters indicative of a physical status of Fluffy.
  • the health module 228 can measure Fluffy's heart rate, weight, muscle density, or other parameters.
  • the health module 228 can receive data from a wearable monitor 230, such as a heart rate monitor, worn by Fluffy.
  • the activity center 200 can use the parameters measured by the health module 228 to infer Fluffy's current activity level (e.g., whether Fluffy is active, calm, lethargic, or acting in another way), to track Fluffy's health over time, or both.
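  • A minimal sketch of how a current activity level might be inferred from the wearable heart-rate monitor 230; the thresholds are illustrative and would differ by breed, size, and the animal's own history.

        def activity_level(heart_rate_bpm, resting_bpm=80):
            if heart_rate_bpm < resting_bpm * 0.8:
                return "lethargic"
            if heart_rate_bpm < resting_bpm * 1.3:
                return "calm"
            return "active"

        # Example: activity_level(130) -> "active" for a dog whose resting rate is 80 bpm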
  • a camera assembly 300 is positioned near the activity center 200 to record Fluffy's interactions with the activity center 200.
  • One or more still or video cameras 302 are mounted on the camera assembly.
  • the camera assembly 300 includes three video cameras 302a, 302b, 302c, each mounted at a different angle; however, more or fewer cameras can also be used.
  • the camera assembly 300 can be movably mounted on a base 304 such that the camera assembly 300 can revolve around the activity center 200.
  • the base can have a track 305 on which the camera assembly 300 is mounted such that the camera assembly 300 can travel around the circumference of the base 304.
  • the surface of the display 204 of the activity center 200 is transparent or translucent and a bottom camera 306 is positioned under the display 204 to take still or video images of Fluffy from below.
  • the cameras 302 mounted on the camera assembly and the bottom camera 306 provide four perspectives of Fluffy interacting with the activity center 200 and can be used to capture information that can be used to generate three-dimensional (3D) still or video images of Fluffy.
  • video images of Fluffy interacting with the activity center 200 can be streamed to George in real time. For instance, when Fluffy interacts with the arts and crafts activity or the music activity, the activity center 200 can send a message to George to alert him that streaming video of Fluffy is available for viewing.
  • Fluffy can wear a wearable camera, such as a GoPro® camera, that can acquire additional still or video images of Fluffy as she interacts with the activity center or as she goes about other activities.
  • the wearable camera can record images, stream images, or both when Fluffy does something to activate the camera, such as when Fluffy starts interacting with a particular section of the activity center 200 (e.g., the arts and crafts activity or the music activity), when Fluffy barks or growls, when Fluffy's heart rate exceeds a threshold level, or based on one or more other criteria.
  • the wearable camera can cause a message to be sent to George when the camera is activated; in some examples, the wearable camera can alert the activity center 200 that it has been activated and the activity center 200 can send a message to George.
  • a printing system 400 can be used to print two- dimensional (2D) or 3D representations of Fluffy interacting with the activity center 200.
  • the printing system 400 can include a 2D printer 402, such as an ink jet printer, a laser printer, a plotter, or another type of printer, a 3D printer 404, or both.
  • the printing system 400 can be in wired or wireless communication with the activity center 200. Images acquired by one or more of the cameras 302, 306 can be sent to the printing system 400 for printing.
  • the printing system 400 can also print a work of art created by Fluffy.
  • the 2D printer 402 can print a 2D, color version 406 of a painting Fluffy created and the 3D printer 404 can print a 3D statue 408 of Fluffy creating the painting, thus creating a work of art that depicts Fluffy creating her own art.
  • the activity center 200 activates the printing system 400 automatically.
  • the activity center 200 can cause the printing system 400 to print a 2D color version of Fluffy's painting and a 3D statue of Fluffy creating that painting based on one or more criteria, such as after Fluffy has worked on the art activity for a certain amount of time (e.g., 2 minutes, 5 minutes, 10 minutes, or another amount of time), after the section 206d is a certain percentage filled with Fluffy's painting (e.g., 50%, 60%, 70%, 80%, 90%, or 100%, or another amount), or based on one or more other criteria.
  • the printing system 400 can be activated manually by George. For instance, when George receives a message including a work of art that he particularly likes or showing Fluffy in a particularly cute pose, he may send a command to activate the printing system 400.

Abstract

In an aspect, a system includes a touch-sensitive surface and a camera mounted so as to be movable relative to the touch-sensitive surface in order to acquire an image of an animal interacting with the touch-sensitive surface. The system includes a processor coupled to a memory, the processor and the memory being configured to cause each of a plurality of sections of the display screen to be associated with a corresponding activity in which the animal can engage; to detect a selection of one of the plurality of sections of the touch-sensitive surface; and to enable the activity associated with the selected section of the touch-sensitive surface.
PCT/US2015/053433 2014-10-01 2015-10-01 Activites d'animal assistees WO2016054332A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US14/504,104 2014-10-01
US14/504,104 US9185885B2 (en) 2013-10-18 2014-10-01 Digital activity center for pets

Publications (1)

Publication Number Publication Date
WO2016054332A1 true WO2016054332A1 (fr) 2016-04-07

Family

ID=55631487

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2015/053433 WO2016054332A1 (fr) 2014-10-01 2015-10-01 Activites d'animal assistees

Country Status (1)

Country Link
WO (1) WO2016054332A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4875953A (en) * 1988-11-30 1989-10-24 Lloyd Christopher A Impression printing process for animals and children
US5424801A (en) * 1994-02-01 1995-06-13 Image Technology International, Inc. Dual mode 2D/3D printer
US20080314228A1 (en) * 2005-08-03 2008-12-25 Richard Dreyfuss Interactive tool and appertaining method for creating a graphical music display
US20100275851A1 (en) * 2009-04-30 2010-11-04 Sophia Yin System and method for training an animal
US20120006282A1 (en) * 2004-07-15 2012-01-12 Lawrence Kates Training guidance system for canines, felines, or other animals
WO2013170129A1 (fr) * 2012-05-10 2013-11-14 President And Fellows Of Harvard College Système et procédé de découverte, de caractérisation, de classification automatiques et d'étiquetage semi-automatique du comportement animal et phénotypage quantitatif de comportements chez des animaux
US20130319338A1 (en) * 2012-06-02 2013-12-05 Andrew Peter Davis Internet Canine Communication Device and Method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15846786

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15846786

Country of ref document: EP

Kind code of ref document: A1