US20220309787A1 - Systems and methods for applied machine cognition - Google Patents
- Publication number: US20220309787A1 (application US 17/833,367)
- Authority: US (United States)
- Legal status: Abandoned (status assumed by Google Patents; not a legal conclusion)
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/07—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail characterised by the inclusion of specific contents
- H04L51/10—Multimedia information
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/52—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail for supporting social networking services
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04B—TRANSMISSION
- H04B1/00—Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
- H04B1/38—Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
- H04B1/3827—Portable transceivers
Definitions
- An inanimate object may be considered a thing that is not alive, such as a rock, a chair, or a book. This definition can be expanded to include that an inanimate object may lack any sign of life or consciousness.
- The Internet of Things is rapidly emerging and is accelerating the integration of inanimate objects into everyday life. This disclosure is directed to addressing issues in the existing technology.
- an apparatus may include a processor and a memory coupled with the processor that effectuates operations.
- the operations may include obtaining information about an object over a period; comparing the information to usage factors, interaction factors, or environmental factors that are linked to thresholds for a personality; based on the comparing, assigning the personality to the object; and based on the personality of the object, assigning an avatar to the object.
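The claimed sequence of operations can be sketched as follows. The factor names, threshold values, and avatar catalog below are illustrative assumptions chosen for this sketch, not part of the disclosure.

```python
# Sketch of the claimed operations: obtain object information over a period,
# compare it against per-personality thresholds, assign a personality, and
# assign a matching avatar. All names and numbers are illustrative.

def assign_personality(object_info, personality_thresholds):
    """Return the first personality whose factor thresholds the object meets."""
    for personality, thresholds in personality_thresholds.items():
        if all(object_info.get(factor, 0) >= minimum
               for factor, minimum in thresholds.items()):
            return personality
    return "neutral"  # default when no threshold set is met

def assign_avatar(personality, avatar_catalog):
    """Map an assigned personality to an avatar representation."""
    return avatar_catalog.get(personality, avatar_catalog["neutral"])

# Usage: usage, interaction, and environmental factors collected over a period.
info = {"hours_in_use": 120, "ambient_humidity": 85, "failed_commands": 0}
thresholds = {
    "uncomfortable": {"ambient_humidity": 80},
    "irritated": {"failed_commands": 10},
}
avatars = {"uncomfortable": "sweaty_face", "irritated": "frown", "neutral": "smile"}

personality = assign_personality(info, thresholds)
avatar = assign_avatar(personality, avatars)
```

A production system would likely score multiple matching personalities rather than take the first match; this sketch only shows the obtain-compare-assign flow of the claim.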
- FIG. 1 illustrates an exemplary system associated with Systems and Methods for Applied Machine Cognition.
- FIG. 2 illustrates an exemplary method associated with Systems and Methods for Applied Machine Cognition.
- FIG. 3 illustrates a schematic of an exemplary network device.
- FIG. 4 illustrates an exemplary communication system that provides wireless telecommunication services over wireless communication networks.
- an inanimate object may have a personality reflected in a similar way, in which a first percentage (e.g., 40%) of the inanimate object's personality may be attributable to the inanimate object's model or make (e.g., default configuration by manufacturer or seller) and a second amount of the inanimate object's personality may be attributable to external factors (e.g., usage factors, interaction factors, environment factors, etc.).
- FIG. 1 illustrates an exemplary system associated with systems and methods for applied machine cognition.
- System 100 may include multiple inanimate objects (also referred to herein as objects), such as mobile device 101, mobile device 102, mobile device 103, appliance 104, sensor 105, bike 106 (e.g., a stationary bicycle with integrated computing technology), purse 109, or server 107, which may be communicatively connected with each other, using wireless or wireline connections, via network 108.
- Server 107 may include a personality engine that may process the object information as disclosed in more detail herein.
- Mobile device 101, mobile device 102, or mobile device 103 may include wireless devices, such as satellite communication systems, portable digital assistants (PDAs), laptop computers, tablet devices, smart phones, smart watches, smart speakers, automobiles (e.g., autonomous vehicles), augmented reality devices, virtual reality devices, or the like.
- Appliance 104 may include ranges, wall ovens, refrigerators, dishwashers, washing machines, dryers, smart bulbs, or coffee makers.
- Sensor 105 may include an environmental sensor, acoustic sensor, sound sensor, vibration sensor, fluid sensor, optical sensor, position sensor (e.g., accelerometer or gyroscope), speed sensor, chemical sensor, pressure sensor, or the like. Sensor 105 may be substantially integrated into a device (e.g., appliance 104 or purse 109 ) or may be a stand-alone device.
- FIG. 2 illustrates an exemplary method associated with systems and methods for applied machine cognition.
- Appliance 104 (i.e., an inanimate object) may be registered with server 107.
- a profile may be created.
- the profile may include descriptive information about appliance 104 that is collected by server 107 .
- the descriptive information may include manufacturer, date of manufacture, object identifier (e.g., alphanumeric), date of first use in service at any location, date of first use of service at a particular location, initial geographical coordinate position, position in or around a facility (e.g., home or business facility), general time of use, time of use in different modes (e.g., color cycle or whites cycle for a washer), features, photo of the appliance 104 , or the like.
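The descriptive information collected at registration could be held in a profile record along the following lines; the field names and example values are assumptions for illustration only.

```python
# A minimal sketch of a registration profile for an object (e.g., a washer).
# Field names and defaults are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class ObjectProfile:
    manufacturer: str
    date_of_manufacture: str
    object_identifier: str                 # e.g., alphanumeric
    first_use_any_location: str = ""
    first_use_this_location: str = ""
    initial_position: tuple = (0.0, 0.0)   # initial geographic coordinates
    facility_position: str = ""            # position in or around a facility
    mode_usage_hours: dict = field(default_factory=dict)  # time of use per mode
    photo_url: str = ""

profile = ObjectProfile(
    manufacturer="ExampleCo",              # hypothetical manufacturer
    date_of_manufacture="2021-06-01",
    object_identifier="WASH-104-A7",       # hypothetical identifier
    mode_usage_hours={"color cycle": 30, "whites cycle": 12},
)
```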
- the profile may include information associated with display features or audio features for appliance 104 .
- Display features may include avatars or alert formatting, among other things.
- Audio features may include assigned voices for an avatar or appliance 104 , assigned audio for an alert, or the like.
- the profile for appliance 104 may include other features, such as alert triggers (e.g., alert based on proximity of mobile device 101 to appliance 104 ), social media features (e.g., automatically posting status to social media), or linking to a group or other outside entities (e.g., appliance linked to other internal or external appliances), among other things.
- One or more avatars may be selected for appliance 104 .
- a profile may have a different avatar (e.g., different persons) with different facial structure (e.g., hairstyle, age, gender, ethnicity) for each mood (e.g., cheerful, amused, blissful, calm, content, energetic, etc.).
- a profile may have the same avatar for each mood, in which the facial structure may be primarily the same (e.g., same person), but with different styling (e.g., hair or clothing) or facial expressions (e.g., smile or frown).
- Personalities may be considered long term and may change more gradually than mood, although there may be some overlap.
- Personality tests (e.g., Myers-Briggs) may help determine types of personalities that may be based on multiple preferences, such as extraversion, introversion, intuition, sensing, thinking, feeling, judgment, or perception.
- Although the term personality is used herein, it is contemplated that mood may be included in the term personality, unless otherwise provided.
- object information associated with appliance 104 may be linked to one or more personalities of appliance 104 .
- Object information may include descriptive information, usage factors (e.g., how the product is used), interaction factors (e.g., what the object communicates with), or environmental factors (e.g., data describing the environment in which the object operates), among other things.
- Server 107 may link the object information to different thresholds that correspond to different personalities, which may be scaled (e.g., high or medium happiness). The different personalities may also be linked to different avatars or other physical or virtual display output.
- Example object information is listed in Table 1.
- Server 107 may obtain (e.g., receive) object information from appliance 104 (e.g., a water heater in a home), such as events that appliance 104 detects during its operation.
- The object information may be collected over a period.
- the object information may be reported by appliance 104 or sensors associated with sensor 105 .
- sensor 105 may report motion or noise.
- Sensor 105 may be a camera that detects motion, noise, interactions, or the like.
- a camera may be helpful when appliance 104 is not communicatively connected with server 107 , but sensor 105 has the ability to provide some or all of the object information to server 107 .
- the camera may record video of the object or environment proximate to the object (based on line of sight) and the video and other object information may be provided to server 107 . It is contemplated that similar to facial recognition, video, audio, or photo recognition of an object may be used to identify object information as disclosed herein.
- When appliance 104 sends the object information to server 107, it may include usage factors data, such as a time period, an energy efficiency rating for the time period, amount of energy used, or volume of water used, among other things.
- the object information may also include interaction factors data, such as user IDs (e.g., of the people or object) detected in the home during the time period. This may be via detection of personal electronic devices of the people present in the home (e.g., mobile device 101 ) or other sensors 105 (e.g., a camera).
- the object information may also include environmental factors data, such as data collected from sensors (e.g., temperature or humidity sensors on or near appliance 104 ).
- server 107 may translate the usage factors data, interaction factors data, environmental factors data, or other object information into scores on a personality scale.
- This personality scale (e.g., a Myers-Briggs-like model) may define traits, and the object information may be used to determine where appliance 104 rates on a measured scale for these traits.
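A translation from raw factor readings to trait scores could look like the sketch below. The trait names, normalization ranges, and the mapping from factors to traits are assumptions; the disclosure only requires that object information be scored on some personality scale.

```python
# Hedged sketch: translating raw object information into scores on
# Myers-Briggs-like trait scales (0-100). Mappings are illustrative.

def score_traits(object_info):
    """Map raw factor readings onto 0-100 trait scales."""
    def clamp(value, low, high):
        return max(low, min(high, value))

    # Assumed mapping: more detected interactions -> more "extraverted".
    extraversion = clamp(object_info.get("interactions_per_week", 0) * 10, 0, 100)
    # Assumed mapping: a more humid operating environment -> less "comfort".
    comfort = clamp(100 - object_info.get("ambient_humidity", 50), 0, 100)
    return {"extraversion": extraversion, "comfort": comfort}

scores = score_traits({"interactions_per_week": 3, "ambient_humidity": 85})
```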
- a personality as shown in Table 2 may be created for a period (e.g., current period). As time passes, default personalities may be predicted based on the object information, such as shown in Table 3.
- server 107 might process object information to determine that during the summer, appliance 104 runs in a hot and humid environment and appliance 104 also detects that this is when Susie is home from college (e.g., mobile device 102 ) and takes long showers. Therefore, appliance 104 may be represented by default with a personality of unhappy and uncomfortable during the summer. This summer personality information may change over the course of the summer based on the object information gathered over time. Moreover, the interactions of appliance 104 with other products or things may affect its personality representation. For instance, appliance 104 (e.g., the water heater) may be old or running an old software version.
- Appliance 104 may be unable to implement energy efficient updated operating instructions that it receives from mobile device 103, which is an interaction with another thing. If appliance 104 constantly receives messages instructing it to do something (implement the operating instructions) that it cannot do, appliance 104 may "feel" irritated. This irritated personality may be associated with a threshold time frame within which the updated operating instructions were received and a threshold number of messages associated with the updated operating instructions.
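The irritation rule described above, a threshold number of update messages within a threshold time frame, can be sketched as follows; the one-week window and five-message cap are illustrative assumptions.

```python
# Sketch of the irritation rule: assign "irritated" when more than
# max_messages about unimplementable operating instructions arrive within
# the trailing window. Thresholds are illustrative assumptions.

def is_irritated(message_timestamps, window_seconds=7 * 24 * 3600, max_messages=5):
    """True if more than max_messages arrived within the trailing window."""
    if not message_timestamps:
        return False
    latest = max(message_timestamps)
    recent = [t for t in message_timestamps if latest - t <= window_seconds]
    return len(recent) > max_messages

# Six repeated update messages within one day exceed the cap.
day = 24 * 3600
stamps = [0, day // 6, day // 3, day // 2, 2 * day // 3, day]
```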
- a personality scale may use a well-known human personality model to identify and assign the personality of appliance 104 .
- Other information such as the age of appliance 104 or a history of location of appliance 104 might be used to further the personality representation of appliance 104 .
- an action may be taken associated with the personality.
- An example action is appliance 104 may be assigned an avatar whose personality is representative of the data. For example, if the personality corresponds to Table 2, then an avatar that resembled a grumpy man or woman may be assigned. From there, this personality representation may be presented on a display on mobile device 101 (e.g., a smartphone or augmented reality device) to a user along with a link to additional information that may summarize significant object information that contributes to the determined personality representation.
- Significant object information (e.g., the top 3 contributing factors) may be summarized in the additional information.
- appliance 104 may be assigned a voice type to be used for text-to-speech communications.
- the voice may already be associated with the avatar or separately assigned.
- the voice assigned to appliance 104 might, for example, be a grumpy old man or irritated toddler and may be used when reporting a requested status of appliance 104 .
- the status may be requested via mobile device 101 (e.g., smart speaker, smartphone, augmented reality goggles, etc.).
- a social media post associated with the personality of appliance 104 may be automatically posted. If appliance 104 is particularly happy for an extended period of time (e.g., a threshold amount), then a social media post may show an avatar or meme that is indicative of the happy personality. In another example, appliance 104 may initiate a communication with the user of mobile device 101 . A dishwasher may call (or otherwise alert) mobile device 101 to tell the user that it was not turned on by co-opting human communication channels (e.g., a phone or video chat).
- Another example action may be an increase or decrease in the number of alerts or other interactions with regard to appliance 104, based on the personality of appliance 104. If appliance 104 is determined to have an introvert personality, then fewer proactive alerts may be provided about appliance 104's status. If appliance 104 is an extrovert, then more proactive alerts may be provided. The proximity of a user may determine whether an alert is sent: if the user is proximate (e.g., near) appliance 104, an alert may be sent.
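The alerting behavior above can be sketched as a simple decision function; the per-personality daily alert caps and the 10-meter proximity cutoff are illustrative assumptions.

```python
# Sketch: fewer proactive alerts for an "introvert" object, more for an
# "extrovert", gated on user proximity. All numbers are illustrative.

ALERTS_PER_DAY = {"introvert": 1, "extrovert": 6}

def should_alert(personality, alerts_sent_today, user_distance_m, proximity_m=10):
    """Decide whether to send one more proactive alert right now."""
    if user_distance_m > proximity_m:   # user is not near the appliance
        return False
    return alerts_sent_today < ALERTS_PER_DAY.get(personality, 3)
```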
- Appliance 104 may be relatively old and out of date. Appliance 104 may receive instructions on energy efficiency operation from a home network, but fail to implement them. Based on an assigned personality, appliance 104 may be represented as an avatar (or meme) indicative of "Old Dog Not Able to Learn New Tricks." In another example, appliance 104 may have connected with a large number of Wi-Fi networks in different countries. Based on an assigned personality, appliance 104 may be represented as an avatar (or meme) indicative of a "World Traveler that is Open to New Connections." In another example, an avatar associated with appliance 104 may have an age progression or facial change based on time or amount of use.
- an inanimate object may be a bike 106 (e.g., connected exercise bike).
- object information might indicate bike 106 is used only once per month on average and is relatively new.
- object information may indicate that other exercise bikes may have tried to invite bike 106 (e.g., associated with other end users) to participate in group exercise sessions, but bike 106 has not replied.
- bike 106 may have a personality associated with a young, lonely introvert.
- Bike 106 may send an alert to ask the user if it should list itself for sale online or automatically list itself for sale, which may be via social media.
- an inanimate object may be a security camera (e.g., sensor 105 ) and may have an avatar that is indicative of a professional police officer.
- the avatar may be based on sensor 105 regularly (e.g., every week or every month) sending video clips to a local police department that has been indicated as useful and high quality.
- A default personality (e.g., a preinstalled personality) may be assigned to an inanimate object. The default personality may be representative of a company or product.
- a certain model of bike 106 may be considered an introvert until one or more features (e.g., physical or digital) are added to bike 106, such as a designated handle bar or a designated genre of music.
- a car may come with a pre-installed personality, which may change over time based on object information associated with how a user drives the car.
- the object information may include high speeds, sudden stops, or the number of yellow lights run, among other things.
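The drift from a pre-installed personality based on driving-style object information could be modeled as an accumulating score, as sketched below. The event weights and the "aggressive" cutoff are illustrative assumptions.

```python
# Sketch: a pre-installed ("default") personality that drifts over time as
# driving events accumulate. Weights and cutoff are illustrative.

def update_personality(score, events, weights=None):
    """Accumulate a driving-aggression score from observed events."""
    weights = weights or {"high_speed": 2, "sudden_stop": 3, "yellow_light_run": 5}
    for event in events:
        score += weights.get(event, 0)
    return score

def personality_for(score, aggressive_cutoff=20):
    return "aggressive" if score >= aggressive_cutoff else "calm"

score = 0  # pre-installed default
score = update_personality(score, ["high_speed", "sudden_stop", "yellow_light_run",
                                   "yellow_light_run", "high_speed"])
```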
- Usage factors or other object information may be associated with more than one user to create an aggregate personality that reflects shared usage.
- a home may be represented based on aggregate interactions of multiple users (e.g., persons or other inanimate objects) with many of the inanimate objects within the home.
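One way to form such an aggregate personality is to average trait scores contributed by each user or object, as sketched below; the trait names and the plain-average rule are assumptions for illustration.

```python
# Sketch of an aggregate personality: trait scores from multiple users
# (or objects) are averaged to represent shared usage. Illustrative only.

def aggregate_personality(user_scores):
    """Average each trait across all contributing users."""
    totals, counts = {}, {}
    for scores in user_scores:
        for trait, value in scores.items():
            totals[trait] = totals.get(trait, 0) + value
            counts[trait] = counts.get(trait, 0) + 1
    return {trait: totals[trait] / counts[trait] for trait in totals}

# Two users' interactions with objects in a home, combined.
home = aggregate_personality([
    {"extraversion": 80, "comfort": 40},
    {"extraversion": 20, "comfort": 60},
])
```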
- a personality may be transferred from one thing to another. For instance, when buying a new car, the previous car's personality may be transplanted into it.
- An inanimate object may have a personality that changes over time and have a corresponding avatar that changes its appearance based on its current personality.
- Purse 109 may be associated with a user and may be connected with network 108 (e.g., via an integrated mobile device or a smartphone that has a virtual assistant application running on it).
- the purse may detect interaction with other users via a virtual assistant application that uses speech recognition. For instance, if the virtual assistant application detects speech around it that is complimentary of the purse, it may collect this as object information (e.g., interaction factor data) and send it to server 107 . Server 107 may note this as a trend in increasing confidence for purse 109 .
- purse 109 may change its appearance to illuminate lights, electronically change the visual appearance of the fabric of purse 109 (e.g., electrical stimulus that changes temperature and color of the fabric), or make other changes.
- the user may train the inanimate object personality via a speech interface (e.g., a virtual assistant). For instance, if the bike 106 is communicating that it is lonely, the user may “reassure it” by scheduling a workout time on an electronic calendar. Based on this interaction the personality of bike 106 may change and actions may be taken in response to such personality change, as disclosed herein.
- an electronic calendar may include events that are associated with an inanimate object. For example, there may be a calendar event to run in the morning each day. If the calendar event is not performed at all or below a threshold, then a user's shoes may be assigned a personality of loneliness or irritation.
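The calendar rule above, events performed below a threshold rate leading to a loneliness personality, can be sketched as follows; the 50% threshold is an illustrative assumption.

```python
# Sketch of the calendar rule: if scheduled events tied to an object are
# performed below a threshold rate, assign a "lonely" personality.

def calendar_personality(scheduled, performed, minimum_rate=0.5):
    """Return a personality based on the fraction of events performed."""
    if scheduled == 0:
        return "content"        # nothing was expected of the object
    rate = performed / scheduled
    return "lonely" if rate < minimum_rate else "content"

# A daily morning-run event performed only twice in a week.
shoes = calendar_personality(scheduled=7, performed=2)
```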
- an alert may be provided to a display technology (e.g., augmented reality) that shows a message, a change in avatar appearance or voice, or the like.
- Other events connected with inanimate object are contemplated such as riding bike 106 , washing car, washing clothes, cleaning home, reading a book, or cooking (e.g., stove may be the inanimate object), among other things.
- Server 107 might detect that an inanimate object (e.g., mobile device 103) is being misused. For instance, via an embedded accelerometer in mobile device 103, an impact by mobile device 103 may be indicated as mobile device 103 being slammed. Based on this detection, mobile device 103 may actively participate in its own self-preservation and a personality indicative of despair (e.g., an emergency) may be assigned. Based on the personality, an action may be taken, such as reporting the misuse to an appropriate entity. In another example, misuse may be an attempted software hack that mobile device 103 detects upon itself, or a detection that the personalities of other things that it interacts (and shares data) with are of low integrity.
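The accelerometer-based misuse detection can be sketched as a simple threshold check; the 8g "slam" threshold and the action names are illustrative assumptions.

```python
# Sketch: an embedded accelerometer reading above an impact threshold is
# treated as the device being slammed, triggering a "despair" personality
# and a misuse report. The 8g threshold is an illustrative assumption.

def classify_impact(accel_g, slam_threshold_g=8.0):
    """Classify an impact reading and choose a self-preservation action."""
    if accel_g >= slam_threshold_g:
        return {"personality": "despair", "action": "report_misuse"}
    return {"personality": None, "action": None}
```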
- augmented reality or virtual reality goggles may be used.
- a person may be able to walk around their home or business and easily understand the personality (or mood) of the inanimate objects and quickly address any issues that may provide a real-world performance change in the object or some other significant, but less direct benefit.
- a real-world change such as tightening a screw, releasing a valve, or reducing usage may be performed. This may positively affect the performance of the inanimate object.
- an inanimate object may have a personality comprising loneliness which could be easily observed using augmented reality.
- a book may be picked up, a couch may be sat on, or rarely used bike 106 may be used to decrease the loneliness of the inanimate object, but not necessarily increase or decrease the performance of the inanimate object directly.
- a change may be made in the personality data, but also a change may be made in the person's daily habits, individual performance, or upkeep of a home.
- The indirect benefit may assist in discovering possibly forgotten inanimate objects in the home, which may allow for more efficient use of the home, keep a person more physically fit, or promote efficient use of the person's resources, such as automatically posting the book for sale online or reading it for the first time instead of purchasing another book to read.
- It is contemplated that an inanimate object (e.g., a smartphone) or one or more devices may participate in the methods disclosed herein. Some steps may be distributed over a plurality of devices, such as server 107, mobile device 101, appliance 104, sensor 105, or bike 106. It is also contemplated that the actions, object information, or the like may be displayed on a mobile phone screen, computer screen, virtual reality screen, or augmented reality screen (glasses or mobile phone), among other display technologies.
- Personality is often broken into statistically identified components. These components are generally stable over time, and about half of the variance may be attributable to a person's genetics and the other half to effects of one's environment. Similarly, a first amount (e.g., half) of an inanimate object's personality may be attributable to the inanimate object's model or make, and a second amount may be attributable to external factors (e.g., usage factors, interaction factors, environment factors, etc.).
- FIG. 3 is a block diagram of network device 300 that may be connected to or comprise a component of system 100 .
- Network device 300 may comprise hardware or a combination of hardware and software. The functionality to facilitate telecommunications via a telecommunications network may reside in one or combination of network devices 300 .
- Network device 300 may represent or perform functionality of an appropriate network device 300, or combination of network devices 300, such as, for example, a component or various components of a cellular broadcast system wireless network, a processor, a server, a gateway, a node, a mobile switching center (MSC), a short message service center (SMSC), an automatic location function server (ALFS), a gateway mobile location center (GMLC), a radio access network (RAN), a serving mobile location center (SMLC), or the like, or any appropriate combination thereof.
- network device 300 may be implemented in a single device or multiple devices (e.g., single server or multiple servers, single gateway or multiple gateways, single controller or multiple controllers). Multiple network entities may be distributed or centrally located. Multiple network entities may communicate wirelessly, via hard wire, or any appropriate combination thereof.
- Network device 300 may comprise a processor 302 and a memory 304 coupled to processor 302 .
- Memory 304 may contain executable instructions that, when executed by processor 302 , cause processor 302 to effectuate operations associated with mapping wireless signal strength.
- network device 300 is not to be construed as software per se.
- network device 300 may include an input/output system 306 .
- Processor 302 , memory 304 , and input/output system 306 may be coupled together (coupling not shown in FIG. 3 ) to allow communications between them.
- Each portion of network device 300 may comprise circuitry for performing functions associated with each respective portion. Thus, each portion may comprise hardware, or a combination of hardware and software. Accordingly, each portion of network device 300 is not to be construed as software per se.
- Input/output system 306 may be capable of receiving or providing information from or to a communications device or other network entities configured for telecommunications.
- input/output system 306 may include a wireless communications (e.g., 3G/4G/GPS) card.
- Input/output system 306 may be capable of receiving or sending video information, audio information, control information, image information, data, or any combination thereof. Input/output system 306 may be capable of transferring information with network device 300 . In various configurations, input/output system 306 may receive or provide information via any appropriate means, such as, for example, optical means (e.g., infrared), electromagnetic means (e.g., RF, Wi-Fi, Bluetooth®, ZigBee®), acoustic means (e.g., speaker, microphone, ultrasonic receiver, ultrasonic transmitter), or a combination thereof. In an example configuration, input/output system 306 may comprise a Wi-Fi finder, a two-way GPS chipset or equivalent, or the like, or a combination thereof.
- Input/output system 306 of network device 300 also may contain a communication connection 308 that allows network device 300 to communicate with other devices, network entities, or the like.
- Communication connection 308 may comprise communication media.
- Communication media typically embody computer-readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media.
- Communication media may include wired media such as a wired network or direct-wired connection, or wireless media such as acoustic, RF, infrared, or other wireless media.
- The term computer-readable media as used herein includes both storage media and communication media.
- Input/output system 306 also may include an input device 310 such as keyboard, mouse, pen, voice input device, or touch input device. Input/output system 306 may also include an output device 312 , such as a display, speakers, or a printer.
- Processor 302 may be capable of performing functions associated with telecommunications, such as functions for processing broadcast messages, as described herein.
- processor 302 may be capable of, in conjunction with any other portion of network device 300 , determining a type of broadcast message and acting according to the broadcast message type or content, as described herein.
- Memory 304 of network device 300 may comprise a storage medium having a concrete, tangible, physical structure. As is known, a signal does not have a concrete, tangible, physical structure. Memory 304 , as well as any computer-readable storage medium described herein, is not to be construed as a signal. Memory 304 , as well as any computer-readable storage medium described herein, is not to be construed as a transient signal. Memory 304 , as well as any computer-readable storage medium described herein, is not to be construed as a propagating signal. Memory 304 , as well as any computer-readable storage medium described herein, is to be construed as an article of manufacture.
- Memory 304 may store any information utilized in conjunction with telecommunications. Depending upon the exact configuration or type of processor, memory 304 may include a volatile storage 314 (such as some types of RAM), a nonvolatile storage 316 (such as ROM, flash memory), or a combination thereof. Memory 304 may include additional storage (e.g., a removable storage 318 or a non-removable storage 320 ) including, for example, tape, flash memory, smart cards, CD-ROM, DVD, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, USB-compatible memory, or any other medium that can be used to store information and that can be accessed by network device 300 . Memory 304 may comprise executable instructions that, when executed by processor 302 , cause processor 302 to effectuate operations to map signal strengths in an area of interest.
- FIG. 4 depicts an exemplary diagrammatic representation of a machine in the form of a computer system 500 within which a set of instructions, when executed, may cause the machine to perform any one or more of the methods described above.
- One or more instances of the machine can operate, for example, as processor 302 , mobile device 101 , mobile device 102 , mobile device 103 , appliance 104 , sensor 105 , and other devices of FIG. 1 .
- the machine may be connected (e.g., using a network 502 ) to other machines.
- the machine may operate in the capacity of a server or a client user machine in a server-client user network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
- the machine may comprise a server computer, a client user computer, a personal computer (PC), a tablet, a smart phone, a laptop computer, a desktop computer, a control system, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
- a communication device of the subject disclosure includes broadly any electronic device that provides voice, video or data communication.
- the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methods discussed herein.
- Computer system 500 may include a processor (or controller) 504 (e.g., a central processing unit (CPU)), a graphics processing unit (GPU, or both), a main memory 506 and a static memory 508 , which communicate with each other via a bus 510 .
- the computer system 500 may further include a display unit 512 (e.g., a liquid crystal display (LCD), a flat panel, or a solid state display).
- Computer system 500 may include an input device 514 (e.g., a keyboard), a cursor control device 516 (e.g., a mouse), a disk drive unit 518 , a signal generation device 520 (e.g., a speaker or remote control) and a network interface device 522 .
- the embodiments described in the subject disclosure can be adapted to utilize multiple display units 512 controlled by two or more computer systems 500 .
- presentations described by the subject disclosure may in part be shown in a first of display units 512 , while the remaining portion is presented in a second of display units 512 .
- the disk drive unit 518 may include a tangible computer-readable storage medium 524 on which is stored one or more sets of instructions (e.g., software 526 ) embodying any one or more of the methods or functions described herein, including those methods illustrated above. Instructions 526 may also reside, completely or at least partially, within main memory 506 , static memory 508 , or within processor 504 during execution thereof by the computer system 500 . Main memory 506 and processor 504 also may constitute tangible computer-readable storage media.
- While examples of a telecommunications system in which systems and methods for applied machine cognition can be processed and managed have been described in connection with various computing devices/processors, the underlying concepts may be applied to any computing device, processor, or system capable of facilitating a telecommunications system.
- the various techniques described herein may be implemented in connection with hardware or software or, where appropriate, with a combination of both.
- the methods and devices may take the form of program code (i.e., instructions) embodied in concrete, tangible, storage media having a concrete, tangible, physical structure. Examples of tangible storage media include floppy diskettes, CD-ROMs, DVDs, hard drives, or any other tangible machine-readable storage medium (computer-readable storage medium).
- a computer-readable storage medium is not a signal.
- a computer-readable storage medium is not a transient signal. Further, a computer-readable storage medium is not a propagating signal.
- a computer-readable storage medium as described herein is an article of manufacture.
- When the program code is loaded into and executed by a machine, such as a computer, the machine becomes a device for telecommunications.
- In the case of program code execution on programmable computers, the computing device will generally include a processor, a storage medium readable by the processor (including volatile or nonvolatile memory or storage elements), at least one input device, and at least one output device.
- the program(s) can be implemented in assembly or machine language, if desired.
- the language can be a compiled or interpreted language, and may be combined with hardware implementations.
- the methods and devices associated with a telecommunications system as described herein also may be practiced via communications embodied in the form of program code that is transmitted over some transmission medium, such as over electrical wiring or cabling, through fiber optics, or via any other form of transmission, wherein, when the program code is received and loaded into and executed by a machine, such as an EPROM, a gate array, a programmable logic device (PLD), a client computer, or the like, the machine becomes a device for implementing telecommunications as described herein.
- When implemented on a general-purpose processor, the program code combines with the processor to provide a unique device that operates to invoke the functionality of a telecommunications system.
- While a telecommunications system has been described in connection with the various examples of the various figures, it is to be understood that other similar implementations may be used, or modifications and additions may be made, to the described examples of a telecommunications system without deviating therefrom.
- a telecommunications system as described in the instant application may apply to any environment, whether wired or wireless, and may be applied to any number of such devices connected via a communications network and interacting across the network. Therefore, a telecommunications system as described herein should not be limited to any single example, but rather should be construed in breadth and scope in accordance with the appended claims.
- An inanimate object is considered anything that lacks consciousness (e.g., a lawn or an apple). With that said, it is contemplated herein that the techniques used for inanimate objects may be used to assign personalities to people (e.g., through observing activities through a camera and a mobile device of the person).
- Methods, systems, and apparatuses, among other things, as described herein may provide for obtaining first information about an object over a first period; based on the obtaining the first information about the object, assigning a first avatar to the object; obtaining second information about the object over a second period; and based on the obtaining the second information about the object, assigning a second avatar to the object.
- Methods, systems, and apparatuses, among other things, as described herein may provide for obtaining first information about an object over a first period; comparing the first information to usage factors, interaction factors, or environmental factors that are linked to thresholds for a personality; based on the comparing, assigning the personality to the object; and based on the personality of the object, assigning an avatar to the object.
- An apparatus may assign a personality to the object.
- The assigning of the personality to the object may comprise assigning an avatar to the object, and the operations may further comprise providing instructions to display the avatar via augmented reality.
- the method, system, computer readable storage medium, or apparatus may, based on the personality of the object, assign audio to the object; automatically post information associated with the personality of the object to social media; send a request to post information about the object to social media; send a reduced number of alerts associated with the object when the personality is introversion; or display the avatar via augmented reality.
- the first information about the object over the first period may be from a camera in proximity.
- The method, system, computer-readable storage medium, or apparatus may provide for obtaining first information about an object over a first period; comparing the first information to usage factors, interaction factors, or environmental factors that define personalities; and, based on the comparing, assigning a personality to the object.
- The method, system, computer-readable storage medium, or apparatus may provide for automatically posting information (e.g., to social media, an SMS text, a web page, or another alert for display) associated with the personality of the object. All combinations in this paragraph (including the removal or addition of steps) are contemplated in a manner that is consistent with the other portions of the detailed description.
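As a minimal sketch of the two-period avatar reassignment described above (all function names, scoring rules, and avatar labels here are illustrative assumptions, not part of the disclosure):

```python
# Hypothetical sketch: obtain information about an object over a period,
# assign an avatar, then repeat with information from a later period.
# The scoring rule and avatar labels are assumptions for illustration.

def score_happiness(observations):
    # Toy scoring rule: fraction of observations flagged as positive.
    positive = sum(1 for o in observations if o.get("positive"))
    return positive / len(observations) if observations else 0.0

def assign_avatar(observations):
    # Map the score to an avatar label via a simple threshold.
    return "cheerful_person" if score_happiness(observations) >= 0.5 else "grumpy_person"

# First period: mostly negative interactions -> one avatar.
first_info = [{"positive": False}, {"positive": False}, {"positive": True}]
first_avatar = assign_avatar(first_info)      # "grumpy_person"

# Second period: mostly positive interactions -> a different avatar.
second_info = [{"positive": True}, {"positive": True}, {"positive": False}]
second_avatar = assign_avatar(second_info)    # "cheerful_person"
```

A production system would replace the toy score with the usage, interaction, and environmental factors described elsewhere in the disclosure.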
Abstract
A server may assign human personality traits to inanimate objects. In an example, a server may effectuate operations that include obtaining information about an object over a period; comparing the information to usage factors, interaction factors, or environmental factors that are linked to thresholds for a personality; based on the comparing, assigning the personality to the object; and based on the personality of the object, assigning an avatar to the object.
Description
- This application is a continuation of U.S. application Ser. No. 16/712,274, filed Dec. 12, 2019. All sections of the aforementioned application are incorporated herein by reference in their entirety.
- An inanimate object may be considered a thing that is not alive, such as a rock, a chair, or a book. This definition can be expanded to include that an inanimate object may lack any sign of life or consciousness. The internet of things is rapidly emerging and accelerating the integration of inanimate objects into everyday life. This disclosure is directed to addressing issues in the existing technology.
- This background information is provided to reveal information believed by the applicant to be of possible relevance. No admission is necessarily intended, nor should be construed, that any of the preceding information constitutes prior art.
- There are a number of factors that may be used to describe how an object operates or is used. When the object has capabilities for electronic communications over a network and the ability to store and process electronic data, these factors may be determined and recorded and, over time, may be used to create a set of electronically stored data that represents a “personality” of the object. Disclosed herein are methods, systems, and apparatuses for applied machine cognition which may include assigning, modifying, or otherwise using human personality traits for inanimate objects.
- In an example, an apparatus may include a processor and a memory coupled with the processor that effectuates operations. The operations may include obtaining information about an object over a period; comparing the information to usage factors, interaction factors, or environmental factors that are linked to thresholds for a personality; based on the comparing, assigning the personality to the object; and based on the personality of the object, assigning an avatar to the object.
- This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to limitations that solve any or all disadvantages noted in any part of this disclosure.
- Reference will now be made to the accompanying drawings, which are not necessarily drawn to scale.
- FIG. 1 illustrates an exemplary system associated with Systems and Methods for Applied Machine Cognition.
- FIG. 2 illustrates an exemplary method associated with Systems and Methods for Applied Machine Cognition.
- FIG. 3 illustrates a schematic of an exemplary network device.
- FIG. 4 illustrates an exemplary communication system that provides wireless telecommunication services over wireless communication networks.
- Conventionally, personality is often broken into statistically identified components. These components may be stable over time, with about half of the variance attributable to a person's genetics and the other half to the effects of one's environment. With this context in mind, as disclosed herein, an inanimate object may have a personality reflected in a similar way, in which a first percentage (e.g., 40%) of the inanimate object's personality may be attributable to the inanimate object's model or make (e.g., the default configuration by the manufacturer or seller) and a second percentage may be attributable to external factors (e.g., usage factors, interaction factors, or environmental factors).
- There are a number of factors that may be used to describe how an object operates or is used. When the object has capabilities for electronic communications over a network and the ability to store and process electronic data, these factors may be determined and recorded and, over time, may be used to create a set of electronically stored data that represents a “personality” of the object. Disclosed herein are methods, systems, and apparatuses for assigning, modifying, or otherwise using human personality traits for objects.
- FIG. 1 illustrates an exemplary system associated with systems and methods for applied machine cognition. System 100 may include multiple inanimate objects (also referred to herein as objects), such as mobile device 101, mobile device 102, mobile device 103, appliance 104, sensor 105, bike 106 (e.g., a stationary bicycle with integrated computing technology), purse 109, or server 107, which may be communicatively connected with each other, using wireless or wireline connections, via network 108. Server 107 may include a personality engine that may process the object information as disclosed in more detail herein. Mobile device 101, mobile device 102, or mobile device 103 may include wireless devices, such as satellite communication systems, portable digital assistants (PDAs), laptop computers, tablet devices, smart phones, smart watches, smart speakers, automobiles (e.g., autonomous vehicles), augmented reality devices, virtual reality devices, or the like. Appliance 104 may include ranges, wall ovens, refrigerators, dishwashers, washing machines, dryers, smart bulbs, or coffee makers. Sensor 105 may include an environmental sensor, acoustic sensor, sound sensor, vibration sensor, fluid sensor, optical sensor, position sensor (e.g., accelerometer or gyroscope), speed sensor, chemical sensor, pressure sensor, or the like. Sensor 105 may be substantially integrated into a device (e.g., appliance 104 or purse 109) or may be a stand-alone device.
- FIG. 2 illustrates an exemplary method associated with systems and methods for applied machine cognition. At step 121, appliance 104 (i.e., an inanimate object) may be registered with server 107. During the registration process, a profile may be created. The profile may include descriptive information about appliance 104 that is collected by server 107. The descriptive information may include manufacturer, date of manufacture, object identifier (e.g., alphanumeric), date of first use in service at any location, date of first use of service at a particular location, initial geographical coordinate position, position in or around a facility (e.g., a home or business facility), general time of use, time of use in different modes (e.g., color cycle or whites cycle for a washer), features, a photo of appliance 104, or the like.
- In addition to the descriptive information, the profile may include information associated with display features or audio features for appliance 104. Display features may include avatars or alert formatting, among other things. Audio features may include assigned voices for an avatar or appliance 104, assigned audio for an alert, or the like. The profile for appliance 104 may include other features, such as alert triggers (e.g., an alert based on proximity of mobile device 101 to appliance 104), social media features (e.g., automatically posting status to social media), or linking to a group or other outside entities (e.g., an appliance linked to other internal or external appliances), among other things.
- One or more avatars may be selected for appliance 104. There may be avatars that are representative of each human personality or human mood, which is described in more detail below. In a first scenario, a profile may have a different avatar (e.g., different persons) with different facial structure (e.g., hairstyle, age, gender, ethnicity) for each mood (e.g., cheerful, amused, blissful, calm, content, energetic, etc.). In a second scenario, a profile may have the same avatar for each mood, in which the facial structure may be primarily the same (e.g., the same person), but with different styling (e.g., hair or clothing) or facial expressions (e.g., a smile or frown). Personalities may be considered long term and may change more gradually than moods, although there may be some overlap. Personality tests (e.g., Myers-Briggs) may help determine types of personalities that may be based on multiple preferences, such as extraversion, introversion, intuition, sensing, thinking, feeling, judgment, or perception. Although the term personality is used herein, it is contemplated that mood may be included within the term personality, unless otherwise provided.
- With continued reference to step 121 of FIG. 2, object information associated with appliance 104 may be linked to one or more personalities of appliance 104. Object information may include descriptive information, usage factors (e.g., how the product is used), interaction factors (e.g., what the object communicates with), or environmental factors (e.g., data describing the environment in which the object operates), among other things. Server 107 may link the object information to different thresholds that correspond to different personalities, which may be scaled (e.g., high or medium happiness). The different personalities may also be linked to different avatars or other physical or virtual display output. Example object information is listed in Table 1.
TABLE 1
Usage Factors: Time Used; Duration Used; Proper Usage Percentage; Improper Usage Percentage; Identifier of Other Object that is Using; Identifier of Person that is Using
Interaction Factors: Person Personality; Nature of Interaction with Person; Other Object Personality; Nature of Interaction with Other Object
Environmental Factors: Temperature; Humidity; Noise; Motion; Light
- With continued reference to FIG. 2, at step 122, server 107 may obtain (e.g., receive) object information from appliance 104 over a period. As an example, appliance 104 (e.g., a water heater in a home) may record object information, such as events that it detects during its operation, and report the object information to server 107. The object information may be collected over a period. The object information may be reported by appliance 104 or by sensors associated with sensor 105. For example, sensor 105 may report motion or noise. Sensor 105 may be a camera that detects motion, noise, interactions, or the like. The use of a camera may be helpful when appliance 104 is not communicatively connected with server 107, but sensor 105 has the ability to provide some or all of the object information to server 107. The camera may record video of the object or of the environment proximate to the object (based on line of sight), and the video and other object information may be provided to server 107. It is contemplated that, similar to facial recognition, video, audio, or photo recognition of an object may be used to identify object information as disclosed herein.
- When appliance 104 sends the object information to server 107, it may include usage factors data, such as a time period, an energy efficiency rating for the time period, an amount of energy used, or a volume of water used, among other things. The object information may also include interaction factors data, such as user IDs (e.g., of the people or objects) detected in the home during the time period. This may be via detection of personal electronic devices of the people present in the home (e.g., mobile device 101) or other sensors 105 (e.g., a camera). The object information may also include environmental factors data, such as data collected from sensors (e.g., temperature or humidity sensors on or near appliance 104).
- With continued reference to FIG. 2, at step 123, based on the object information (e.g., the information of step 122), server 107 may determine a personality of appliance 104. For example, server 107 may translate the usage factors data, interaction factors data, environmental factors data, or other object information into scores on a personality scale. This personality scale (e.g., a Myers-Briggs-like model) may include various human personality traits, such as agreeableness, happiness, loneliness, extraversion, comfort, or integrity. The object information may be used to determine where appliance 104 rates on a measured scale for these traits. In an example, based on thresholds for the object information of Table 1, a personality as shown in Table 2 may be created for a period (e.g., the current period). As time passes, default personalities may be predicted based on the object information, such as shown in Table 3.
TABLE 2
Comfort: Low; Happiness: Low; Irritation: High
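The bucketing in step 123 that yields a personality like Table 2 from object-information scores might be sketched as follows; the trait names, scores, and cutoffs are illustrative assumptions, not values from the disclosure:

```python
# Illustrative sketch of step 123: bucket each per-trait score against
# (low, high) cutoffs to produce a scaled personality. The scores and
# cutoffs below are assumptions chosen so the output mirrors Table 2.

def assign_personality(scores, cutoffs):
    personality = {}
    for trait, (low, high) in cutoffs.items():
        score = scores.get(trait, 0)
        if score < low:
            personality[trait] = "Low"
        elif score < high:
            personality[trait] = "Medium"
        else:
            personality[trait] = "High"
    return personality

# A hot, humid summer plus repeated unimplementable update messages:
scores = {"Comfort": 10, "Happiness": 15, "Irritation": 90}
cutoffs = {"Comfort": (30, 70), "Happiness": (30, 70), "Irritation": (30, 70)}
print(assign_personality(scores, cutoffs))
# {'Comfort': 'Low', 'Happiness': 'Low', 'Irritation': 'High'}
```

In practice, each trait's cutoffs could come from the thresholds that server 107 links to the Table 1 factors.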
TABLE 3
Comfort: Fall and Winter: Highly Comfortable; Spring: Moderately Comfortable; Summer: Uncomfortable
Happiness: Summer: Low when John is Home; Other: High
Irritation: High
- For instance, server 107 might process object information to determine that, during the summer, appliance 104 runs in a hot and humid environment, and appliance 104 also detects that this is when Susie is home from college (e.g., via mobile device 102) and takes long showers. Therefore, appliance 104 may be represented by default with a personality of unhappy and uncomfortable during the summer. This summer personality information may change over the course of the summer based on the object information gathered over time. Moreover, the interactions of appliance 104 with other products or things may affect its personality representation. For instance, appliance 104 (e.g., the water heater) may be old or running an old software version. Based on its age (e.g., lack of features), appliance 104 may be unable to implement the energy-efficient updated operating instructions that it receives from mobile device 103, which is an interaction with another thing. If appliance 104 constantly receives messages asking it to do something (implement the operating instructions) that it cannot, appliance 104 may "feel" irritated. This irritated personality may be associated with a threshold time frame within which the updated operating instructions were received and a threshold number of messages associated with the updated operating instructions.
- With continued reference to step 123 of FIG. 2, a personality scale may use a well-known human personality model to identify and assign the personality of appliance 104. Other information, such as the age of appliance 104 or a history of locations of appliance 104, might be used to further refine the personality representation of appliance 104.
- At step 124, based on the determined personality of step 123, an action may be taken that is associated with the personality. An example action is that appliance 104 may be assigned an avatar whose personality is representative of the data. For example, if the personality corresponds to Table 2, then an avatar that resembles a grumpy man or woman may be assigned. From there, this personality representation may be presented on a display of mobile device 101 (e.g., a smartphone or augmented reality device) to a user, along with a link to additional information that may summarize significant object information that contributes to the determined personality representation. Significant object information (e.g., the top 3 factors) may include usage factors data, interaction factors data, or environmental factors data.
- In another example, appliance 104 may be assigned a voice type to be used for text-to-speech communications. The voice may already be associated with the avatar or may be separately assigned. The voice assigned to appliance 104 might, for example, be that of a grumpy old man or an irritated toddler and may be used when reporting a requested status of appliance 104. The status may be requested via mobile device 101 (e.g., smart speaker, smartphone, augmented reality goggles, etc.).
- In another example action, a social media post associated with the personality of appliance 104 may be automatically posted. If appliance 104 is particularly happy for an extended period of time (e.g., a threshold amount), then a social media post may show an avatar or meme that is indicative of the happy personality. In another example, appliance 104 may initiate a communication with the user of mobile device 101. A dishwasher may call (or otherwise alert) mobile device 101 to tell the user that it was not turned on, co-opting human communication channels (e.g., a phone or video chat).
- Another example action may be an increase or decrease in the number of alerts or other interactions with regard to appliance 104 because of the personality of appliance 104. If appliance 104 is determined to have an introvert personality, then fewer proactive alerts may be provided about appliance 104's status. If appliance 104 is an extrovert, then more proactive alerts may be provided. The proximity of a user may determine whether an alert is sent. If the user is proximate to (e.g., near) appliance 104, an alert may be sent.
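The step 124 actions — picking an avatar for the determined personality and throttling alerts for introverted objects — might be sketched as below; the avatar labels, alert limits, and proximity radius are illustrative assumptions:

```python
# Hypothetical sketch of step 124: choose an avatar from the scaled
# personality, and gate proactive alerts on introversion and on the
# user's proximity. Labels, limits, and the radius are assumptions.

def pick_avatar(personality):
    if personality.get("Happiness") == "Low" and personality.get("Irritation") == "High":
        return "grumpy_person"          # e.g., the personality of Table 2
    if personality.get("Happiness") == "High":
        return "cheerful_person"
    return "neutral_person"

def should_send_alert(trait, alerts_sent_today, user_distance_m, radius_m=10):
    if user_distance_m > radius_m:      # user not proximate: hold the alert
        return False
    daily_limit = 2 if trait == "introvert" else 10
    return alerts_sent_today < daily_limit

print(pick_avatar({"Comfort": "Low", "Happiness": "Low", "Irritation": "High"}))
# grumpy_person
print(should_send_alert("introvert", alerts_sent_today=2, user_distance_m=3))
# False
```

The same gate could be inverted for extroverted objects, which receive a higher daily limit.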
- In a scenario,
appliance 104 may be relatively old and out of date.Appliance 104 may receive instructions on energy efficiency operation from a home network, but fail to implement them. Based on an assigned personality,appliance 104 may be represented as an avatar (or meme) indicative of “Old Dog Not Able to Learn New Tricks.” In another example,appliance 104 may have connected with a large number of Wi-Fi networks in different countries. Based on an assigned personality,appliance 104 may be represented as an avatar (or meme) indicative of a “World Traveler that is Open to New Connections.” In another example, an avatar associated withappliance 104 may be have an age progression or facial change based on time or amount of use. - In a scenario an inanimate object may be a bike 106 (e.g., connected exercise bike). In this case, object information might indicate
bike 106 is used only once per month on average and is relatively new. Also, object information may indicate that other exercise bikes may have tried to invite bike 106 (e.g., associated with other end users) to participate in group exercise sessions, butbike 106 has not replied. In this case,bike 106 may have a personality associated with a young, lonely introvert.Bike 106 may send an alert to ask the user if it should list itself for sale online or automatically list itself for sale, which may be via social media. - In another scenario an inanimate object may be a security camera (e.g., sensor 105) and may have an avatar that is indicative of a professional police officer. The avatar may be based on
sensor 105 regularly (e.g., every week or every month) sending video clips to a local police department that has been indicated as useful and high quality. - There may be a default personality (e.g., preinstalled personality) and accompanying avatar that a manufacturer or seller of an inanimate object may associate with the inanimate object. The default personality may be representative of a company or product. For example, a
certain model bike 106 may be considered an introvert until it adds one or more features (e.g., physical or digital) tobike 106, such as a designated handle bar or a designated genre of music. In another example, a car may come with a pre-installed personality, which may change over time based on object information associated with how a user drives the car. The object information may include high speeds, sudden stops, or the number of yellow lights run, among other things. - Usage factors or other object information may be associated with more than one user to create an aggregate personality that reflects shared usage. For instance, a home may be represented based on aggregate interactions of multiple users (e.g., persons or other inanimate objects) with many of the inanimate objects within the home. A personality may be transferred from one thing to another. For instance, when buying a new car, the previous car's personality may be transplanted into it.
- An inanimate object may have a personality that changes over time and a corresponding avatar that changes its appearance based on the current personality. For instance,
purse 109 may be associated with a user and may be connected with network 108 (e.g., via an integrated mobile device or a smartphone that has a virtual assistant application running on it). The purse may detect interaction with other users via a virtual assistant application that uses speech recognition. For instance, if the virtual assistant application detects speech around it that is complimentary of the purse, it may collect this as object information (e.g., interaction factor data) and send it to server 107. Server 107 may note this as a trend of increasing confidence for purse 109. Based on the object information, purse 109 may change its appearance to illuminate lights, electronically change the visual appearance of the fabric of purse 109 (e.g., an electrical stimulus that changes the temperature and color of the fabric), or make other changes. - In another scenario, the user may train the inanimate object personality via a speech interface (e.g., a virtual assistant). For instance, if the
bike 106 is communicating that it is lonely, the user may “reassure it” by scheduling a workout time on an electronic calendar. Based on this interaction, the personality of bike 106 may change, and actions may be taken in response to such personality change, as disclosed herein. In a similar scenario, an electronic calendar may include events that are associated with an inanimate object. For example, there may be a calendar event to run in the morning each day. If the calendar event is not performed at all or is performed below a threshold, then a user's shoes may be assigned a personality of loneliness or irritation. Subsequently, an alert may be provided to a display technology (e.g., augmented reality) that shows a message, a change in avatar appearance or voice, or the like. Other events connected with an inanimate object are contemplated, such as riding bike 106, washing a car, washing clothes, cleaning a home, reading a book, or cooking (e.g., a stove may be the inanimate object), among other things. - In another scenario,
server 107 might detect that an inanimate object (e.g., mobile device 103) is being misused. For instance, via an embedded accelerometer in mobile device 103, an impact may be detected that indicates mobile device 103 is being slammed. Based on this detection, mobile device 103 may actively participate in its own self-preservation, and a personality indicative of despair (e.g., an emergency) may be assigned. Based on the personality, an action may be taken, such as reporting the misuse to an appropriate entity. In another example, misuse may be an attempted software hack that mobile device 103 detects upon itself, or a detection that the personalities of other things with which it interacts (and shares data) are of low integrity. - In another scenario, augmented reality or virtual reality goggles may be used. For example, a person may be able to walk around their home or business and easily understand the personality (or mood) of the inanimate objects and quickly address any issues, which may provide a real-world performance change in the object or some other significant, but less direct, benefit. For example, based on the personality of the inanimate object, a real-world change such as tightening a screw, releasing a valve, or reducing usage may be performed. This may positively affect the performance of the inanimate object. With regard to an indirect benefit to performance, in an example, an inanimate object may have a personality comprising loneliness, which could be easily observed using augmented reality. Based on recognition of this loneliness personality by a person (or another inanimate object), a book may be picked up, a couch may be sat on, or rarely used
bike 106 may be used to decrease the loneliness of the inanimate object, without necessarily increasing or decreasing the performance of the inanimate object directly. A change may be made in the personality data, but a change may also be made in the person's daily habits, individual performance, or upkeep of a home. The indirect benefit may assist in discovering possibly forgotten inanimate objects in the home, which may lead to more efficient use of the home, keeping a person more physically fit, or more efficient use of the person's resources, such as automatically posting the book for sale online, or reading it for the first time and not purchasing another book to read. In another example, an inanimate object (e.g., a smartphone) may have a personality that indicates it is spoiled (e.g., grandiose or narcissistic) when it is consistently overused. - It is contemplated that one or more devices may participate in the methods disclosed herein. Some steps may be distributed over a plurality of devices, such as
server 107, mobile device 101, appliance 104, sensor 105, or bike 106. It is also contemplated that the actions, object information, or the like may be displayed on a mobile phone screen, computer screen, virtual reality screen, or augmented reality screen (glasses or mobile phone), among other display technologies. - Conventionally, personality is often broken into statistically identified components. These components are generally stable over time, and about half of the variance may be attributable to a person's genetics and the other half to the effects of one's environment. With this context in mind, the disclosed subject matter may be reflected in a similar way with an inanimate object, in which a first amount (e.g., half) of the inanimate object's personality may be attributable to the inanimate object's model or make (e.g., default configuration by manufacturer or seller) and a second amount of the inanimate object's personality may be attributable to external factors (e.g., usage factors, interaction factors, environment factors, etc.).
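The split between a manufacturer default and external factors described above can be sketched as a weighted blend. This is an illustrative assumption: the 0-100 trait scale, the function name, and the equal default weight are not specified by the disclosure.

```python
# Hypothetical sketch: blend a manufacturer-default trait score with a score
# derived from external factors (usage, interaction, environment), mirroring
# the "first amount / second amount" split described above.
def blended_trait(default_score, external_score, default_weight=0.5):
    # default_weight is the fraction attributable to the model/make defaults.
    assert 0.0 <= default_weight <= 1.0
    return default_weight * default_score + (1 - default_weight) * external_score

# e.g., a bike whose model defaults to low extraversion (20/100) but whose
# observed usage is fairly social (70/100):
print(blended_trait(20, 70))  # 45.0
```

Adjusting `default_weight` would let a manufacturer make the preinstalled personality more or less resistant to change from observed usage.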
- Disclosed herein are methods, systems, and apparatuses for assigning, modifying, or otherwise using personality traits for inanimate objects. This allows for the users of these objects to make efficient use of these objects and tailor them to be used in such a way that efficiently meets their needs.
-
FIG. 3 is a block diagram of network device 300 that may be connected to or comprise a component of system 100. Network device 300 may comprise hardware or a combination of hardware and software. The functionality to facilitate telecommunications via a telecommunications network may reside in one or a combination of network devices 300. Network device 300 depicted in FIG. 3 may represent or perform functionality of an appropriate network device 300, or combination of network devices 300, such as, for example, a component or various components of a cellular broadcast system wireless network, a processor, a server, a gateway, a node, a mobile switching center (MSC), a short message service center (SMSC), an automatic location function server (ALFS), a gateway mobile location center (GMLC), a radio access network (RAN), a serving mobile location center (SMLC), or the like, or any appropriate combination thereof. It is emphasized that the block diagram depicted in FIG. 3 is exemplary and not intended to imply a limitation to a specific implementation or configuration. Thus, network device 300 may be implemented in a single device or multiple devices (e.g., single server or multiple servers, single gateway or multiple gateways, single controller or multiple controllers). Multiple network entities may be distributed or centrally located. Multiple network entities may communicate wirelessly, via hard wire, or any appropriate combination thereof. -
Network device 300 may comprise a processor 302 and a memory 304 coupled to processor 302. Memory 304 may contain executable instructions that, when executed by processor 302, cause processor 302 to effectuate operations associated with mapping wireless signal strength. As evident from the description herein, network device 300 is not to be construed as software per se. - In addition to
processor 302 and memory 304, network device 300 may include an input/output system 306. Processor 302, memory 304, and input/output system 306 may be coupled together (coupling not shown in FIG. 3) to allow communications between them. Each portion of network device 300 may comprise circuitry for performing functions associated with each respective portion. Thus, each portion may comprise hardware, or a combination of hardware and software. Accordingly, each portion of network device 300 is not to be construed as software per se. Input/output system 306 may be capable of receiving or providing information from or to a communications device or other network entities configured for telecommunications. For example, input/output system 306 may include a wireless communications (e.g., 3G/4G/GPS) card. Input/output system 306 may be capable of receiving or sending video information, audio information, control information, image information, data, or any combination thereof. Input/output system 306 may be capable of transferring information with network device 300. In various configurations, input/output system 306 may receive or provide information via any appropriate means, such as, for example, optical means (e.g., infrared), electromagnetic means (e.g., RF, Wi-Fi, Bluetooth®, ZigBee®), acoustic means (e.g., speaker, microphone, ultrasonic receiver, ultrasonic transmitter), or a combination thereof. In an example configuration, input/output system 306 may comprise a Wi-Fi finder, a two-way GPS chipset or equivalent, or the like, or a combination thereof. - Input/
output system 306 of network device 300 also may contain a communication connection 308 that allows network device 300 to communicate with other devices, network entities, or the like. Communication connection 308 may comprise communication media. Communication media typically embody computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism and include any information delivery media. By way of example, and not limitation, communication media may include wired media such as a wired network or direct-wired connection, or wireless media such as acoustic, RF, infrared, or other wireless media. The term computer-readable media as used herein includes both storage media and communication media. Input/output system 306 also may include an input device 310 such as a keyboard, mouse, pen, voice input device, or touch input device. Input/output system 306 may also include an output device 312, such as a display, speakers, or a printer. -
Processor 302 may be capable of performing functions associated with telecommunications, such as functions for processing broadcast messages, as described herein. For example, processor 302 may be capable of, in conjunction with any other portion of network device 300, determining a type of broadcast message and acting according to the broadcast message type or content, as described herein. -
Memory 304 of network device 300 may comprise a storage medium having a concrete, tangible, physical structure. As is known, a signal does not have a concrete, tangible, physical structure. Memory 304, as well as any computer-readable storage medium described herein, is not to be construed as a signal. Memory 304, as well as any computer-readable storage medium described herein, is not to be construed as a transient signal. Memory 304, as well as any computer-readable storage medium described herein, is not to be construed as a propagating signal. Memory 304, as well as any computer-readable storage medium described herein, is to be construed as an article of manufacture. -
Memory 304 may store any information utilized in conjunction with telecommunications. Depending upon the exact configuration or type of processor, memory 304 may include a volatile storage 314 (such as some types of RAM), a nonvolatile storage 316 (such as ROM or flash memory), or a combination thereof. Memory 304 may include additional storage (e.g., a removable storage 318 or a non-removable storage 320) including, for example, tape, flash memory, smart cards, CD-ROM, DVD or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, USB-compatible memory, or any other medium that can be used to store information and that can be accessed by network device 300. Memory 304 may comprise executable instructions that, when executed by processor 302, cause processor 302 to effectuate operations to map signal strengths in an area of interest. -
FIG. 4 depicts an exemplary diagrammatic representation of a machine in the form of a computer system 500 within which a set of instructions, when executed, may cause the machine to perform any one or more of the methods described above. One or more instances of the machine can operate, for example, as processor 302, mobile device 101, mobile device 102, mobile device 103, appliance 104, sensor 105, and other devices of FIG. 1. In some embodiments, the machine may be connected (e.g., using a network 502) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client user machine in a server-client user network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. - The machine may comprise a server computer, a client user computer, a personal computer (PC), a tablet, a smart phone, a laptop computer, a desktop computer, a control system, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine. It will be understood that a communication device of the subject disclosure includes broadly any electronic device that provides voice, video or data communication. Further, while a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methods discussed herein.
-
Computer system 500 may include a processor (or controller) 504 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), a main memory 506 and a static memory 508, which communicate with each other via a bus 510. The computer system 500 may further include a display unit 512 (e.g., a liquid crystal display (LCD), a flat panel, or a solid state display). Computer system 500 may include an input device 514 (e.g., a keyboard), a cursor control device 516 (e.g., a mouse), a disk drive unit 518, a signal generation device 520 (e.g., a speaker or remote control), and a network interface device 522. In distributed environments, the embodiments described in the subject disclosure can be adapted to utilize multiple display units 512 controlled by two or more computer systems 500. In this configuration, presentations described by the subject disclosure may in part be shown in a first of display units 512, while the remaining portion is presented in a second of display units 512. - The
disk drive unit 518 may include a tangible computer-readable storage medium 524 on which is stored one or more sets of instructions (e.g., software 526) embodying any one or more of the methods or functions described herein, including those methods illustrated above. Instructions 526 may also reside, completely or at least partially, within main memory 506, static memory 508, or within processor 504 during execution thereof by the computer system 500. Main memory 506 and processor 504 also may constitute tangible computer-readable storage media. - While examples of a telecommunications system in which systems and methods for applied machine cognition can be processed and managed have been described in connection with various computing devices/processors, the underlying concepts may be applied to any computing device, processor, or system capable of facilitating a telecommunications system. The various techniques described herein may be implemented in connection with hardware or software or, where appropriate, with a combination of both. Thus, the methods and devices may take the form of program code (i.e., instructions) embodied in concrete, tangible, storage media having a concrete, tangible, physical structure. Examples of tangible storage media include floppy diskettes, CD-ROMs, DVDs, hard drives, or any other tangible machine-readable storage medium (computer-readable storage medium). Thus, a computer-readable storage medium is not a signal. A computer-readable storage medium is not a transient signal. Further, a computer-readable storage medium is not a propagating signal. A computer-readable storage medium as described herein is an article of manufacture. When the program code is loaded into and executed by a machine, such as a computer, the machine becomes a device for telecommunications.
In the case of program code execution on programmable computers, the computing device will generally include a processor, a storage medium readable by the processor (including volatile or nonvolatile memory or storage elements), at least one input device, and at least one output device. The program(s) can be implemented in assembly or machine language, if desired. The language can be a compiled or interpreted language, and may be combined with hardware implementations.
- The methods and devices associated with a telecommunications system as described herein also may be practiced via communications embodied in the form of program code that is transmitted over some transmission medium, such as over electrical wiring or cabling, through fiber optics, or via any other form of transmission, wherein, when the program code is received and loaded into and executed by a machine, such as an EPROM, a gate array, a programmable logic device (PLD), a client computer, or the like, the machine becomes a device for implementing telecommunications as described herein. When implemented on a general-purpose processor, the program code combines with the processor to provide a unique device that operates to invoke the functionality of a telecommunications system.
- While a telecommunications system has been described in connection with the various examples of the various figures, it is to be understood that other similar implementations may be used or modifications and additions may be made to the described examples of a telecommunications system without deviating therefrom. For example, one skilled in the art will recognize that a telecommunications system as described in the instant application may apply to any environment, whether wired or wireless, and may be applied to any number of such devices connected via a communications network and interacting across the network. Therefore, a telecommunications system as described herein should not be limited to any single example, but rather should be construed in breadth and scope in accordance with the appended claims.
- In describing preferred methods, systems, or apparatuses of the subject matter of the present disclosure—systems and methods for applied machine cognition—as illustrated in the Figures, specific terminology is employed for the sake of clarity. In addition, the use of the word “or” is generally used inclusively unless otherwise provided herein.
- This written description uses examples to enable any person skilled in the art to practice the claimed subject matter, including making and using any devices or systems and performing any incorporated methods. Other variations of the examples are contemplated herein. An inanimate object is considered anything that lacks consciousness (e.g., a lawn or an apple). With that said, it is contemplated herein that the techniques used for inanimate objects may be used to assign personalities to people (e.g., through observing activities through a camera and a mobile device of the person).
- Methods, systems, and apparatuses, among other things, as described herein may provide for obtaining first information about an object over a first period; based on the obtaining the first information about the object, assigning a first avatar to the object; obtaining second information about the object over a second period; and based on the obtaining the second information about the object, assigning a second avatar to the object. Methods, systems, and apparatuses, among other things, as described herein may provide for obtaining first information about an object over a first period; comparing the first information to usage factors, interaction factors, or environmental factors that are linked to thresholds for a personality; based on the comparing, assigning the personality to the object; and based on the personality of the object, assigning an avatar to the object. An apparatus may assign a personality to the object. The assigning the personality to the object may comprise assigning an avatar to the object, and the operations may further comprise providing instructions to display the avatar via augmented reality. The method, system, computer-readable storage medium, or apparatus may, based on the personality of the object, assign audio to the object; automatically post information associated with the personality of the object to social media; send a request to post information about the object to social media; send a reduced number of alerts associated with the object when the personality is introversion; or display the avatar via augmented reality. The first information about the object over the first period may be from a camera in proximity.
The method, system, computer-readable storage medium, or apparatus may provide for obtaining first information about an object over a first period; comparing the first information to usage factors, interaction factors, or environmental factors that define personalities; and, based on the comparing, assigning a personality to the object. The method, system, computer-readable storage medium, or apparatus may provide for automatically posting information (e.g., to social media, an SMS text, a web page, or another alert for display) associated with the personality of the object. All combinations in this paragraph (including the removal or addition of steps) are contemplated in a manner that is consistent with the other portions of the detailed description.
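The summarized flow of comparing object information to factors linked to thresholds, scoring, and then assigning a personality and avatar can be sketched as follows. All threshold values, trait names, and avatar names here are illustrative assumptions for the sketch, not values taken from the disclosure or claims.

```python
# Hypothetical sketch: score usage and interaction information against
# assumed thresholds, then assign a personality and a matching avatar.
USAGE_THRESHOLD = 4        # assumed: sessions per month counted as active use
INTERACTION_THRESHOLD = 2  # assumed: replies to invitations per month

def assign_personality(usage_per_month, replies_per_month):
    # Score each factor on a simple two-point personality scale.
    score = 0
    if usage_per_month >= USAGE_THRESHOLD:
        score += 1
    if replies_per_month >= INTERACTION_THRESHOLD:
        score += 1
    personality = "extravert" if score == 2 else "introvert"
    # Choose an avatar consistent with the assigned personality.
    avatar = {"extravert": "sunny_avatar", "introvert": "quiet_avatar"}[personality]
    return personality, avatar

# A rarely used bike that ignores group-session invitations:
print(assign_personality(1, 0))  # ('introvert', 'quiet_avatar')
```

A real system would score many more factors (including environmental data from nearby sensors) and could feed the resulting personality into downstream actions such as alert throttling or social media posts.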
Claims (20)
1. An apparatus comprising:
a processing system including a processor; and
a memory that stores executable instructions that, when executed by the processing system, facilitate performance of operations, the operations comprising:
obtaining first information about an object over a first period, wherein the first information comprises information generated by the object, the first information comprising information about usage of the object or information about usage of resources by the object;
obtaining second information about the object over the first period, wherein the second information comprises interactions of the object with one or more additional inanimate objects, wherein the second information comprises information about an object personality of a particular object of the one or more additional inanimate objects with which the object interacts;
obtaining third information about the object over the first period, wherein the third information is generated by a sensor in proximity to the object;
scoring the first information, second information and third information on a personality scale; and
assigning a personality to the object, based on the scoring.
2. The apparatus of claim 1 , wherein the assigning the personality to the object comprises:
assigning an avatar to the object, wherein the assigning is based on the personality, and wherein the operations further comprise providing instructions to display the avatar via augmented reality.
3. The apparatus of claim 1 , wherein the operations further comprise:
automatically posting, in a social media post, information associated with the personality of the object, wherein the posting is based on the personality of the object.
4. The apparatus of claim 3 , wherein the operations further comprise:
selecting an avatar for the object, wherein the selecting is based on the personality; and
including the avatar in the social media post.
5. The apparatus of claim 1 , wherein the operations further comprise:
assigning an audio feature to the object, wherein the assigning the audio feature is based on the personality of the object.
6. The apparatus of claim 5 , wherein the operations further comprise:
selecting an avatar for the object, wherein the selecting is based on the personality; and
assigning the audio feature as a voice of the avatar.
7. The apparatus of claim 1 , wherein the operations further comprise:
sending a request to post information about the object to social media, wherein the sending is based on the personality of the object.
8. The apparatus of claim 1 , wherein the operations further comprise:
comparing the first information to usage factors;
comparing the second information to interaction factors;
comparing the third information to environmental factors; and
wherein the assigning the personality to the object is based on the comparing.
9. A non-transitory machine-readable medium, comprising executable instructions that, when executed by a processing system including a processor, facilitate performance of operations, the operations comprising:
obtaining first information about an object over a first period, wherein the first information comprises information generated by the object, including information about a usage of the object or information about usage of resources by the object during a usage of the object, or a combination of these;
obtaining second information about the object over the first period, wherein the second information comprises information about interactions of the object with one or more additional inanimate objects;
obtaining third information about the object over the first period, wherein the third information is generated by a sensor in proximity to the object;
scoring the first information, second information and third information on a personality scale; and
based on the scoring, assigning a personality to the object.
10. The non-transitory machine-readable medium of claim 9 , wherein the operations further comprise:
comparing the first information to usage factors;
comparing the second information to interaction factors;
comparing the third information to environmental factors; and
wherein the assigning the personality to the object is based on the comparing.
11. The non-transitory machine-readable medium of claim 9 , wherein the operations further comprise:
sending alerts associated with the object.
12. The non-transitory machine-readable medium of claim 9 , wherein the operations further comprise:
determining the personality comprises introversion; and
based on the personality, sending a reduced number of alerts associated with the object.
13. The non-transitory machine-readable medium of claim 9 , wherein the operations further comprise:
assigning an audio feature to the object, wherein the assigning the audio feature is based on the personality of the object.
14. The non-transitory machine-readable medium of claim 13 , wherein the operations further comprise:
assigning the audio feature as an alert for the object, wherein the alert comprises the audio feature.
15. The non-transitory machine-readable medium of claim 14 , wherein the operations further comprise:
triggering the alert based on proximity of the object to another predetermined object.
16. A system comprising:
a mobile device; and
a network device in data communication with the mobile device, the network device comprising:
a processing system including a processor; and
a memory that stores executable instructions that, when executed by the processing system, facilitate performance of operations, the operations comprising:
receiving, at the network device, registration information from an object to be assigned a personality;
creating, by the network device, a profile for the object, the profile comprising descriptive information about the object, the descriptive information comprising identification information and manufacture information;
obtaining first information about the object over a first period, wherein the first information comprises information generated by the object, the first information comprising information about a usage of the object or information about a usage of resources by the object;
obtaining second information about the object over the first period, wherein the second information comprises information about interactions of the object with one or more additional inanimate objects;
obtaining third information about the object over the first period, wherein the third information is generated by a sensor in proximity to the object;
comparing the first information to usage factors;
comparing the second information to interaction factors;
comparing the third information to environmental factors that are linked to personality thresholds;
based on the comparing, scoring the first information, the second information and the third information on a personality scale;
based on the scoring, assigning a personality to the object; and
based on the personality of the object, assigning an avatar to the object according to the profile.
17. The system of claim 16 , wherein the obtaining the first information about the object comprises:
obtaining information comprising information about events detected by the object during an operation of the object.
18. The system of claim 16 , wherein the obtaining the second information about the object comprises:
obtaining information about an object personality of a particular object of the one or more additional inanimate objects with which the object interacts.
19. The system of claim 16 , wherein the operations further comprise:
assigning an audio feature to the object, wherein the assigning the audio feature is based on the personality of the object.
20. The system of claim 19 , wherein the operations further comprise:
selecting an avatar for the object, wherein the selecting is based on the personality; and
assigning the audio feature as a voice of the avatar.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/833,367 US20220309787A1 (en) | 2019-12-12 | 2022-06-06 | Systems and methods for applied machine cognition |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/712,274 US11380094B2 (en) | 2019-12-12 | 2019-12-12 | Systems and methods for applied machine cognition |
US17/833,367 US20220309787A1 (en) | 2019-12-12 | 2022-06-06 | Systems and methods for applied machine cognition |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/712,274 Continuation US11380094B2 (en) | 2019-12-12 | 2019-12-12 | Systems and methods for applied machine cognition |
Publications (1)
Publication Number | Publication Date |
---|---|
US20220309787A1 true US20220309787A1 (en) | 2022-09-29 |
Family
ID=76317607
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/712,274 Active US11380094B2 (en) | 2019-12-12 | 2019-12-12 | Systems and methods for applied machine cognition |
US17/833,367 Abandoned US20220309787A1 (en) | 2019-12-12 | 2022-06-06 | Systems and methods for applied machine cognition |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/712,274 Active US11380094B2 (en) | 2019-12-12 | 2019-12-12 | Systems and methods for applied machine cognition |
Country Status (1)
Country | Link |
---|---|
US (2) | US11380094B2 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP4182920A4 (en) * | 2020-10-30 | 2023-12-27 | Samsung Electronics Co., Ltd. | Method and system for assigning unique voice for electronic device |
TW202226002 (en) * | 2020-12-15 | 2022-07-01 | Wanda Artificial Intelligence Technology Co., Ltd. | Device and method for generating an electronic card |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090322547A1 (en) * | 2008-06-30 | 2009-12-31 | Hon Hai Precision Industry Co., Ltd. | Computer alert system and method for object proximity |
RU2678361C1 (en) * | 2014-01-24 | 2019-01-28 | Microsoft Technology Licensing, LLC | Audio navigation assistance |
Family Cites Families (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5072998A (en) | 1989-08-08 | 1991-12-17 | Del Cerro Investment Group, Inc. | Stuffed anatomical members |
US5982390A (en) | 1996-03-25 | 1999-11-09 | Stan Stoneking | Controlling personality manifestations by objects in a computer-assisted animation environment |
IL120855A0 (en) | 1997-05-19 | 1997-09-30 | Creator Ltd | Apparatus and methods for controlling household appliances |
US6230111B1 (en) * | 1998-08-06 | 2001-05-08 | Yamaha Hatsudoki Kabushiki Kaisha | Control system for controlling object using pseudo-emotions and pseudo-personality generated in the object |
US6526395B1 (en) | 1999-12-31 | 2003-02-25 | Intel Corporation | Application of personality models and interaction with synthetic characters in a computing system |
US6922685B2 (en) * | 2000-05-22 | 2005-07-26 | Mci, Inc. | Method and system for managing partitioned data resources |
US20050159955A1 (en) | 2002-05-14 | 2005-07-21 | Martin Oerder | Dialog control for an electric apparatus |
US20060282315A1 (en) | 2005-06-13 | 2006-12-14 | 804935 Ontario Ltd. | Method of marketing and merchandising plants |
US7901288B2 (en) * | 2006-04-20 | 2011-03-08 | International Business Machines Corporation | Embedded advertising enhancements in interactive computer game environments |
WO2008000046A1 (en) | 2006-06-29 | 2008-01-03 | Relevancenow Pty Limited | Social intelligence |
US8131549B2 (en) | 2007-05-24 | 2012-03-06 | Microsoft Corporation | Personality-based device |
USH2253H1 (en) | 2008-06-26 | 2011-05-03 | Pixar | Multiple personality articulation for animated characters |
US20110212428A1 (en) | 2010-02-18 | 2011-09-01 | David Victor Baker | System for Training |
US8898233B2 (en) | 2010-04-23 | 2014-11-25 | Ganz | Matchmaking system for virtual social environment |
US20120176525A1 (en) * | 2011-01-12 | 2012-07-12 | Qualcomm Incorporated | Non-map-based mobile interface |
US8996429B1 (en) | 2011-05-06 | 2015-03-31 | Google Inc. | Methods and systems for robot personality development |
US20130123583A1 (en) * | 2011-11-10 | 2013-05-16 | Erica L. Hill | System and method for analyzing digital media preferences to generate a personality profile |
US20130260850A1 (en) | 2012-03-30 | 2013-10-03 | Jogonaut | Mixed Reality Role Playing Games |
US9489679B2 (en) | 2012-10-22 | 2016-11-08 | Douglas E. Mays | System and method for an interactive query utilizing a simulated personality |
US9304652B1 (en) | 2012-12-21 | 2016-04-05 | Intellifect Incorporated | Enhanced system and method for providing a virtual space |
US20140279288A1 (en) * | 2013-03-15 | 2014-09-18 | Suzanne Small WOUK | Method and system for data aggregation and diffusion |
US9727798B2 (en) | 2014-07-18 | 2017-08-08 | Acrovirt, LLC | Generating and using a predictive virtual personification |
US9996874B2 (en) | 2014-09-11 | 2018-06-12 | Oracle International Corporation | Character personal shopper system |
WO2016113967A1 (en) * | 2015-01-14 | 2016-07-21 | Sony Corporation | Information processing system and control method |
WO2016187477A1 (en) * | 2015-05-20 | 2016-11-24 | Daqri, Llc | Virtual personification for augmented reality system |
CN109416701 (en) | 2016-04-26 | 2019-03-01 | Taikang Robotics Co. | Robot with a variety of interactive personalities |
US10228773B2 (en) | 2017-01-02 | 2019-03-12 | Merge Labs, Inc. | Three-dimensional augmented reality object user interface functions |
US9942356B1 (en) | 2017-02-24 | 2018-04-10 | Spotify Ab | Methods and systems for personalizing user experience based on personality traits |
US11126848B2 (en) * | 2017-11-20 | 2021-09-21 | Rakuten Group, Inc. | Information processing device, information processing method, and information processing program |
US10326726B1 (en) * | 2017-12-01 | 2019-06-18 | International Business Machines Corporation | Alert modification based on social media activity |
WO2019124850A1 (en) | 2017-12-20 | 2019-06-27 | Naver Labs Corporation | Method and system for personifying and interacting with an object |
US10877999B2 (en) | 2017-12-21 | 2020-12-29 | Micron Technology, Inc. | Programmatically identifying a personality of an autonomous vehicle |
US11301746B2 (en) * | 2017-12-30 | 2022-04-12 | Graphen, Inc. | Persona-driven and artificially-intelligent avatar |
- 2019-12-12: US application US16/712,274 granted as US11380094B2 (en), status Active
- 2022-06-06: US application US17/833,367 published as US20220309787A1 (en), status Abandoned
Also Published As
Publication number | Publication date |
---|---|
US20210182557A1 (en) | 2021-06-17 |
US11380094B2 (en) | 2022-07-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20220309787A1 (en) | 2022-09-29 | Systems and methods for applied machine cognition |
EP3278291B1 (en) | Inferring user sleep patterns | |
US10389873B2 (en) | Electronic device for outputting message and method for controlling the same | |
CN106357517B (en) | Directional label generation method and device | |
CN105144135B (en) | Content stream is filled on the mobile apparatus | |
CN102986201B (en) | User interfaces | |
US20170243465A1 (en) | Contextual notification engine | |
EP3740822B1 (en) | Optimization of an automation setting through selective feedback | |
US20170178048A1 (en) | Identification and presentation of tasks based on predicted periods of user availability | |
CN107548500A (en) | Event anomalies based on user's routine model | |
CN107924506A (en) | Infer the user availability of communication and set based on user availability or context changes notice | |
CN107683486A (en) | The change with personal influence of customer incident | |
US10171472B2 (en) | Role-specific service customization | |
US20200380968A1 (en) | Voice response interfacing with multiple smart devices of different types | |
CN108140149A (en) | Role Specific Device Behavior | |
US11436265B2 (en) | System for presenting tailored content based on user sensibilities | |
CN113454669A (en) | Characterizing a place by user visited features | |
KR20160000446A (en) | System for identifying human relationships around users and coaching based on identified human relationships | |
CN109313588B (en) | Signal upload optimization | |
JP6698575B2 (en) | Recommendation system and recommendation method | |
JP6242359B2 (en) | Information processing apparatus and method | |
US20140129955A1 (en) | Information processing system, information processing method, information processing device, information processing terminal, program and storage medium | |
US20210012377A1 (en) | Personalized identification of visit start | |
US20150277683A1 (en) | Adaptive user experience | |
CN113965541B (en) | Conversation expression processing method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: AT&T INTELLECTUAL PROPERTY I, L.P., GEORGIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KOCH, ROBERT;MARATHE, NIKHIL;PRATT, JAMES;AND OTHERS;SIGNING DATES FROM 20191122 TO 20191203;REEL/FRAME:060195/0130 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |