US20180329375A1 - Computing Device or Artificial Intelligence (AI) Device Including Shading Element or Shading System - Google Patents
- Publication number
- US20180329375A1 (application Ser. No. US15/823,404)
- Authority
- US
- United States
- Prior art keywords
- computing device
- shading
- assembly
- device housing
- computer
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G05B15/02—Systems controlled by a computer, electric
- E04F10/00—Sunshades, e.g. Florentine blinds or jalousies; outside screens; awnings or baldachins
- E04F10/02—Sunshades of flexible canopy materials, e.g. canvas; baldachins
- E04F10/04—Sunshades with material fixed on sections of a collapsible frame, especially Florentine blinds
- E04F10/06—Sunshades comprising a roller-blind with means for holding the end away from a building
- E04F10/10—Sunshades of a plurality of similar rigid parts, collapsible or extensible; metallic Florentine blinds; awnings with movable parts such as louvres
- E04H15/02—Tents combined or specially associated with other devices
- E04H15/28—Umbrella type tents
- E04H15/58—Closures; awnings; sunshades
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
- G06N5/04—Inference or reasoning models
- G10L15/22—Procedures used during a speech recognition process, e.g. man-machine dialogue
- G10L15/28—Constructional details of speech recognition systems
- G10L2015/223—Execution procedure of a spoken command
- H02S20/30—Supporting structures for PV modules, movable or adjustable, e.g. for angle adjustment
- H02S20/32—Supporting structures specially adapted for solar tracking
- H02S40/38—Energy storage means, e.g. batteries, structurally associated with PV modules
- H02S99/00—Subject matter not provided for in other groups of this subclass
- H04M1/18—Telephone sets specially adapted for use in ships, mines, or other places exposed to adverse environment
- H04M1/21—Combinations with auxiliary equipment, e.g. with clocks or memoranda pads
- H04W84/12—WLAN [Wireless Local Area Networks]
- Y02E10/50—Photovoltaic [PV] energy
- Y02E70/30—Systems combining energy storage with energy generation of non-fossil origin
Abstract
An apparatus provides protection to a computing device housing. The computing device housing comprises one or more microphones to capture audio sounds; one or more wireless transceivers; one or more processors; one or more memory modules; and computer-readable instructions stored in the one or more memory modules. The apparatus includes a support assembly connected to a top surface of the computing device housing and a shading assembly connected to an end of the support assembly, the shading assembly to provide shade from environmental conditions to the computing device housing. The computer-readable instructions are executed by the one or more processors to convert the captured audio sounds from the one or more microphones into one or more audio files and to communicate the one or more audio files, via the one or more wireless transceivers, to an external computing device for voice recognition and conversion into command files.
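The capture-convert-communicate flow described above can be sketched, purely as an illustration, in Python. Every name here (`AudioFile`, `to_command_file`, the two-phrase command vocabulary) is a hypothetical stand-in, not anything specified by the patent, and the voice-recognition step itself is stubbed out since it runs on an external device:

```python
from dataclasses import dataclass

@dataclass
class AudioFile:
    """Captured microphone audio packaged for transfer (illustrative)."""
    samples: bytes
    sample_rate_hz: int

def capture_audio(microphone_samples: bytes, sample_rate_hz: int = 16000) -> AudioFile:
    """Convert raw microphone samples into an audio-file record."""
    return AudioFile(samples=microphone_samples, sample_rate_hz=sample_rate_hz)

def to_command_file(recognized_text: str) -> dict:
    """Map recognized speech (returned by the external recognizer) to a
    command file; here a dict stands in for the patent's command files."""
    known_commands = {
        "open shade": {"action": "deploy", "target": "shading_assembly"},
        "close shade": {"action": "retract", "target": "shading_assembly"},
    }
    return known_commands.get(recognized_text.strip().lower(), {"action": "ignore"})

# Capture, then pretend the external device recognized "Open shade".
audio = capture_audio(b"\x00\x01" * 8000)
command = to_command_file("Open shade")
```

In this sketch the wireless transfer is elided: `audio` would be sent over a transceiver and `to_command_file` would run on whichever device performs recognition.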
Description
- This application claims priority to provisional application Ser. No. 62/505,910, filed May 13, 2017, entitled “Artificial Intelligence (AI) Computing Device with Shading System,” the disclosure of which is incorporated by reference.
- The subject matter disclosed herein relates to an artificial intelligence device or a computing device that comprises a housing and a shading system.
- Conventional artificial intelligence computing devices are limited to indoor use. Indoor AI computing devices cannot operate in outdoor environments because they are not protected from environmental conditions such as wind, rain, sun, and/or air quality factors (e.g., smoke, carbon monoxide, etc.). Accordingly, a need exists for AI computing devices that may be utilized in outdoor environments.
- Non-limiting and non-exhaustive aspects are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various figures unless otherwise specified.
- FIG. 1 illustrates an apparatus including an artificial intelligence device or computing device with a shading system according to embodiments;
- FIG. 1A illustrates a block diagram of components utilized to provide power in an apparatus including an AI device or computing device and a shading system;
- FIG. 2 illustrates an apparatus including an AI device or computing device and shading system with adjustable shading supports according to embodiments;
- FIG. 3 illustrates an apparatus including an AI device or computing device and shading system with a hinging support assembly according to embodiments;
- FIG. 4 illustrates a microphone and/or LED array in an AI device housing or computing device housing according to embodiments;
- FIG. 5 illustrates a block and dataflow diagram of communications between an AI device or computing device and shading system according to embodiments;
- FIG. 6 illustrates a block diagram of components and assemblies for rotating an AI device housing or computing device housing body about a base assembly;
- FIG. 7A illustrates an apparatus including an AI device or computing device with shading system with a movable base assembly according to embodiments;
- FIG. 7B is a flowchart illustrating base assembly movement according to voice commands according to embodiments;
- FIG. 7C illustrates movement of a base assembly according to sensor measurements according to embodiments;
- FIG. 7D illustrates movement of a base assembly utilizing a camera and/or pattern recognition and/or image processing according to embodiments; and
- FIG. 8 illustrates a computing device and/or electronic device according to embodiments.
- In the following detailed description, numerous specific details are set forth to provide a thorough understanding of claimed subject matter. For purposes of explanation, specific numbers, systems, and/or configurations are set forth. However, it should be apparent to one skilled in the relevant art having benefit of this disclosure that claimed subject matter may be practiced without these specific details.
- References throughout this specification to one implementation, an implementation, one embodiment, embodiments, an embodiment and/or the like means that a particular feature, structure, and/or characteristic described in connection with a particular implementation and/or embodiment is included in at least one implementation and/or embodiment of claimed subject matter. Thus, appearances of such phrases, for example, in various places throughout this specification are not necessarily intended to refer to the same implementation or to any one particular implementation described. Furthermore, it is to be understood that particular features, structures, and/or characteristics described are capable of being combined in various ways in one or more implementations and, therefore, are within intended claim scope, for example.
- It is typical to employ distributed computing approaches, in which computing is allocated among computing devices, including one or more clients and/or one or more servers, via a computing and/or communications network, for example. A network may comprise two or more network and/or computing devices and/or may couple such devices so that communications may be exchanged, such as between a server and a client device and/or other types of devices, including between wireless devices coupled via a wireless network, for example. In this context, the term network device refers to any device capable of communicating via and/or as part of a network and may comprise a computing device.
- Computing devices, mobile computing devices, and/or network devices capable of operating as a server, or otherwise, may include, as examples, rack-mounted servers, desktop computers, laptop computers, set top boxes, tablets, netbooks, smart phones, wearable devices, integrated devices combining two or more features of the foregoing devices, single-board computers, the like, or any combination thereof. It is noted that the terms server, servers, server device, server computing device, application server, cloud server, server network device, and/or similar terms are used interchangeably. Similarly, the terms client, client device, client computing device, clients, and/or similar terms are also used interchangeably. These terms may be used in the singular, such as by referring to a “client device” or a “server device,” and may be intended to encompass one or more client devices and/or one or more server devices, as appropriate. References to a “database” or “databases” are understood to mean one or more databases, database servers, application data servers, database cloud servers, proxy servers, and/or portions thereof, as appropriate.
- Operations and/or processing, such as in association with networks, such as computing and/or communications networks, for example, may involve physical manipulations of physical quantities (electrical, magnetic and/or optical signals). These signals may be utilized as bits, data, values, elements, symbols, characters, terms, numbers, numerals and/or the like.
- Likewise, in this context, the terms “coupled,” “connected,” and/or similar terms are used generically. It should be understood that these terms are not intended as synonyms. Rather, “connected” is used generically to indicate that two or more components, for example, are in direct physical, including direct electrical, contact; while “coupled” is used generically to mean that two or more components are potentially in direct physical, including electrical, contact; however, “coupled” is also used generically to mean that two or more components are not necessarily in direct contact, but nonetheless are able to co-operate and/or interact (i.e., are indirectly coupled). The term “coupled” is also understood generically to mean indirectly connected, for example, in an appropriate context. If signals, messages, and/or commands are transmitted from one component (or assembly) to another component (or assembly), it is understood that the messages, signals, instructions, and/or commands may be transmitted directly to the component, or may pass through a number of other components on the way to the destination component. For example, a signal transmitted from a motor controller to a motor (or other driving assembly) may pass through glue logic, an amplifier, an analog-to-digital converter, a digital-to-analog converter, another controller and/or processor, and/or an interface. Similarly, a signal communicated through a misting system may pass through an air conditioning and/or a cooling module, and a signal communicated from any one of a number of sensors to a controller and/or processor may pass through a conditioning module, an analog-to-digital converter, a comparison module, and/or a number of other electrical assemblies and/or components.
- The terms, “and”, “or”, “and/or” and/or similar terms, as used herein, include a variety of meanings that also are expected to depend at least in part upon the particular context in which such terms are used. Typically, “or” if used to associate a list, such as A, B or C, is intended to mean A, B, and C, here used in the inclusive sense, as well as A, B or C, here used in the exclusive sense. In addition, the term “one or more” and/or similar terms is used to describe any feature, structure, and/or characteristic in the singular and/or is also used to describe a plurality and/or some other combination of features, structures and/or characteristics.
- Likewise, the term “based on,” “based, at least in part on,” and/or similar terms (e.g., based at least in part on) are understood as not necessarily intending to convey an exclusive set of factors, but to allow for existence of additional factors not necessarily expressly described. Claimed subject matter is not limited to these one or more illustrative examples; however, again, particular context of description and/or usage provides helpful guidance regarding inferences to be drawn.
- A network may also include, for example, past, present, and/or future large storage devices, such as network storage, cloud storage, storage networks, cloud server farms, and/or other forms of computing and/or device-readable media. A network may include a portion of the Internet, one or more local area networks (LANs), one or more wide area networks (WANs), wire-line type connections, one or more personal area networks (PANs), wireless WANs (wWANs), other wireless type connections, one or more mesh networks, one or more cellular communication networks, other connections, or any combination thereof. Thus, a network may be worldwide in scope and/or extent.
- The Internet and/or a global communications network may refer to a decentralized global network of interoperable networks that comply with the Internet Protocol (IP). It is noted that there are several versions of the Internet Protocol. Here, the term Internet Protocol, IP, and/or similar terms is intended to refer to any version of the Internet Protocol, now known and/or later developed. The Internet may include local area networks (LANs), wide area networks (WANs), wireless networks, and/or long-haul public networks that, for example, may allow signal packets and/or frames to be communicated between LANs. The term World Wide Web (WWW or Web) and/or similar terms may also be used, although it refers to a part of the Internet that complies with the Hypertext Transfer Protocol (HTTP) or XML. It is likewise noted that in various places the term Internet is substituted with the term World Wide Web (Web).
- Although claimed subject matter is not limited in scope to the Internet and/or to the Web, the Internet and/or the Web may, without limitation, provide a useful example of an embodiment at least for purposes of illustration. As indicated, the Internet and/or the Web may comprise a worldwide system of interoperable networks, including interoperable devices within those networks. HyperText Markup Language (“HTML”), Cascading Style Sheets (“CSS”), or Extensible Markup Language (“XML”), for example, may be utilized to specify content and/or to specify a format for hypermedia-type content, such as in the form of a file and/or an “electronic document,” such as a Web page. HTML and/or XML are merely example languages, and the examples provided as illustrations are, of course, not intended to be limiting.
- One or more parameters may be descriptive of a collection of physical signals and/or physical states. For example, one or more parameters, such as referring to an electronic document comprising an image, may include parameters, such as 1) time of day at which an image was captured, latitude and longitude of an image capture device, such as a camera; 2) time and day of when a sensor reading (e.g., humidity, temperature, air quality, UV radiation) was received; and/or 3) operating conditions of one or more motors or other components or assemblies in a modular umbrella shading system. Claimed subject matter is intended to embrace meaningful, descriptive parameters in any format.
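As a hedged illustration of such descriptive parameters, a simple record type could bundle capture time, device location, and sensor readings. The field names and values below are assumptions chosen for the example, not taken from the disclosure:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class CaptureParameters:
    """Descriptive parameters for a captured image or sensor reading
    (illustrative field names, not the patent's)."""
    captured_at: datetime     # time of day at which the reading was taken
    latitude: float           # location of the capture device
    longitude: float
    humidity_pct: Optional[float] = None   # optional sensor readings
    temperature_c: Optional[float] = None
    uv_index: Optional[float] = None

# Example record: an image captured with temperature and UV readings.
params = CaptureParameters(
    captured_at=datetime(2017, 5, 13, 12, 30),
    latitude=37.77,
    longitude=-122.42,
    temperature_c=24.5,
    uv_index=7.0,
)
```

Optional fields default to `None`, so a record remains valid when a given sensor reading (humidity, here) is absent.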
- Some portions of the detailed description which follow are presented in terms of algorithms or symbolic representations of operations on binary digital signals stored within a memory of a specific apparatus or special purpose computing device or platform. In the context of this particular specification, the term specific apparatus or the like includes a general purpose computer once it is programmed to perform particular functions pursuant to instructions from program software. In embodiments, a computing device may be installed within or as part of an artificial intelligence system having a shading element or structure. Algorithmic descriptions or symbolic representations are examples of techniques used by those of ordinary skill in the signal processing or related arts to convey the substance of their work to others skilled in the art. An algorithm is here, and generally, considered to be a self-consistent sequence of operations or similar signal processing leading to a desired result. In this context, operations or processing involve physical manipulation of physical quantities.
- Utilization of terms in the specification such as “processing,” “computing,” “calculating,” “determining” or the like may refer to actions or processes of a specific apparatus, such as a special purpose computer or a similar special purpose electronic computing device (e.g., such as an artificial intelligence computing device). In the context of this specification, therefore, a special purpose computer or a similar special purpose electronic computing device (e.g., an AI computing device) is capable of manipulating or transforming signals (electronic and/or magnetic) in memories (or components thereof), other storage devices, communication devices, transmission or transceiver devices, sound reproduction devices, and/or display devices.
- In an embodiment, a controller and/or a processor typically performs a series of instructions resulting in data manipulation. In an embodiment, a microcontroller or microprocessor may be a compact microcomputer designed to govern the operation of embedded systems in electronic devices, e.g., an AI computing device with a shading element and/or shading structure, and various other electronic and mechanical devices coupled thereto or installed thereon. Microcontrollers may include processors, microprocessors, and other electronic components. A controller may be a commercially available processor such as an Intel Pentium, Motorola PowerPC, SGI MIPS, Sun UltraSPARC, Qualcomm Snapdragon, or Hewlett-Packard PA-RISC processor, but may be any type of application-specific and/or specifically designed processor or controller. In an embodiment, a processor and/or controller may be connected to other system elements, including one or more memory devices, by a bus, a mesh network, serial communication networks, wireless networks, or other components. Usually, a processor or controller may execute an operating system, which may be, for example, a Windows-based operating system (Microsoft), a MAC OS System X operating system (Apple Computer), one of many open-source operating systems, a Solaris operating system (Sun), a portable electronic device operating system (e.g., mobile phone operating systems, iOS, Android, Microsoft Phone, etc.), a microcomputer operating system, a single-board computer operating system, a wearable device operating system, and/or a UNIX operating system. Embodiments are not limited to any particular implementation and/or operating system.
- The specification may refer to an artificial intelligence (AI) device or computing device having a shading element or structure as an apparatus that allows an operator or user to verbally or audibly interface with it, and having a shading element and/or shading structure to provide shade and/or provide coverage to an AI device (and potentially an operator) from weather elements such as sun, wind, rain, and/or hail. In embodiments, a shading element and/or shading structure may further comprise solar cells and/or solar arrays to generate power for operation of the AI system with shading element. In embodiments, the shading element or shading structure may be a simple shading fabric, or a shading frame and shading fabric. In embodiments, the shading element or shading structure may be automated and/or intelligent and may respond to commands, instructions and/or signals audibly spoken by a user/operator or generated by a processor upon execution of computer-readable instructions. The shading system, shading structure and/or shading element may be referred to as a parasol, an umbrella, a sun shade, sun screen, sun shelter, awning, sun cover, sun marquee, brolly and other similar names, all of which may be utilized interchangeably throughout this specification. In embodiments, a shading element or shading structure may be connected to an AI device housing via a shading support, central support assembly, a stem assembly, and/or tube.
- FIG. 1 illustrates an apparatus including an artificial intelligence device or computing device with a shading system according to embodiments. An artificial intelligence (AI) device or computing device having a shading system may comprise a shading frame and/or fabric 103, a shading support assembly 105, and an AI device housing (or computing device housing) 108. In embodiments, although an AI device may be referenced, the descriptions, features and functions herein apply to both AI devices and computing devices. In embodiments, an AI device may comprise voice recognition or other AI computer-readable instructions stored in a memory and executable by one or more processors. The computer-readable instructions may be a complete AI engine and/or an AI application programming interface. - In embodiments, a shading element or
shade 103 may provide shade to keep an AI shading device housing (or computing device housing) 108 from overheating and/or protect it from other environmental conditions (e.g., rain, sleet, snow, etc.). In embodiments, an AI device or computing device housing 108 may be coupled and/or connected to a shading support 105. In embodiments, a shading system may refer to one or more shading supports 105 and one or more shading elements or shades 103. In embodiments, a shading support 105 may be coupled to an AI device or computing device housing 108. In embodiments, a shading support 105 may support a shade or shading element 103 and move it into position with respect to an AI shading device housing 108. In this illustrative embodiment of FIG. 1, an AI shading device housing 108 may be utilized as a base, mount and/or support for a shading element or shade 103. In embodiments, a shading support 105 may be simple and may not have a tilting assembly and/or may not be adjustable. In embodiments, a shading support 105 may be simplified and not have many electronics, components and/or assemblies installed and/or positioned therein. In embodiments, a shading support 105 may also not include an expansion and sensor assembly. Illustratively, in embodiments, a shading support 105 may not comprise an integrated computing device, may not have lighting assemblies and/or may not have sensors installed therein and/or positioned thereon. In embodiments, a shading element or shade 103 or a shade support 105 may comprise one or more sensors (e.g., environmental sensors 121, directional sensors 122 and/or proximity sensors 123). For example, in embodiments, sensors may be a temperature sensor, a wind sensor, a humidity sensor, an air quality sensor, and/or an ultraviolet radiation sensor. In embodiments, a shading element or shade 103, and/or a shade support assembly 105 may comprise one or more imaging devices 126 (e.g., cameras).
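The sensor complement described above (environmental sensors 121, directional sensors 122, proximity sensors 123) mounted on the housing, support, or shade can be modeled as a simple polling abstraction. The following is a minimal illustrative sketch, not taken from the specification; all class and field names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    """A single reading from a housing-, support-, or shade-mounted sensor."""
    sensor_type: str   # e.g., "temperature", "wind", "humidity", "uv", "proximity"
    value: float
    units: str

class SensorBank:
    """Collects readings from the sensors described for housing 108,
    support 105, and shade 103, regardless of where each is mounted."""
    def __init__(self):
        self.readings = []

    def report(self, sensor_type, value, units):
        self.readings.append(SensorReading(sensor_type, value, units))

    def latest(self, sensor_type):
        """Return the most recent reading of a given type, or None."""
        for r in reversed(self.readings):
            if r.sensor_type == sensor_type:
                return r
        return None

bank = SensorBank()
bank.report("temperature", 31.5, "C")
bank.report("wind", 12.0, "mph")
bank.report("temperature", 33.0, "C")
print(bank.latest("temperature").value)  # most recent temperature reading
```

A processor 127 polling such a bank could then drive the threshold checks described for the individual sensors below.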
In embodiments, a shading support may not include an audio system (e.g., a speaker 153 and/or an audio/video transceiver 152) and may not include lighting assemblies. In embodiments, a shading housing 108 may not include one or more lighting assemblies, one or more imaging devices, one or more sensors, and/or one or more integrated computing devices. In embodiments, an AI shading housing 108 may comprise one or more lighting assemblies, one or more imaging devices, one or more sensors, and/or one or more integrated computing devices. - In embodiments, an AI device or computing device housing 108 may comprise an
integrated computing device 120. In embodiments, an AI device or computing device housing 108 may comprise one or more processors/controllers 127, one or more memory modules 128, one or more microphones (or audio receiving devices) 129, one or more PAN transceivers 130 (e.g., Bluetooth transceivers), one or more wireless transceivers 131 (e.g., WiFi or other 802.11 transceivers), and/or one or more cellular transceivers 132 (e.g., EDGE, 4G, 3G, CDMA and/or GSM transceivers). In embodiments, the processors 127, memory 128, transceivers 130, 131, 132 and/or microphones 129 may be integrated into a computing device 120, whereas in other embodiments a single-board computing device 120 (e.g., Raspberry Pi) may not be utilized and processors 127 and/or memory devices 128 may be installed separately within an AI device or computing device housing 108. In embodiments, one or more memory modules 128 may contain computer-readable instructions 140, the computer-readable instructions 140 being executed by one or more processors/controllers 127 to perform certain functionality. In embodiments, the computer-readable instructions may comprise an artificial intelligence application programming interface (API) 141. In embodiments, an artificial intelligence API 141 may allow communications and/or interfacing between an AI device housing 108 and a third-party artificial intelligence (AI) engine housed in a local and/or remote server and/or computing device 150. In embodiments, an AI API 141 may comprise or include a voice recognition AI API, which may be able to communicate sound files (e.g., analog or digital sound files) to a third-party voice recognition AI server 150. In embodiments, a voice recognition AI server may be an Amazon Alexa, Echo, Echo Dot and/or a Google Now server or other third-party voice recognition AI servers.
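The dispatch described above — a resident AI engine when available, otherwise a sound file packaged for a third-party voice recognition AI server 150 — can be sketched as follows. This is an assumed illustration only: the payload layout, field names, and function names are hypothetical, and a real service (Alexa-style or Google-style) defines its own schema and authentication.

```python
import json

def build_voice_request(sound_bytes: bytes, sample_rate_hz: int, device_id: str) -> dict:
    """Package a captured sound file for a third-party voice-recognition
    AI server. The payload schema here is a hypothetical placeholder."""
    return {
        "device_id": device_id,
        "audio": {
            "encoding": "pcm16",              # assumed encoding
            "sample_rate_hz": sample_rate_hz,
            "num_bytes": len(sound_bytes),
        },
    }

def dispatch(sound_bytes, sample_rate_hz, device_id, local_engine=None):
    """Prefer an on-board AI engine (resident computer-readable
    instructions); otherwise build a request for a remote AI server."""
    if local_engine is not None:
        return ("local", local_engine(sound_bytes))
    return ("remote", build_voice_request(sound_bytes, sample_rate_hz, device_id))

# With no local engine installed, audio is routed to the remote server path.
route, payload = dispatch(b"\x00\x01" * 800, 16000, "shade-unit-7")
print(route, json.dumps(payload)[:60])
```

The actual transmission (over the WiFi or cellular transceivers 131, 132) is omitted; only the routing decision is modeled.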
In embodiments, an AI engine and/or AI voice recognition (e.g., computer-readable instructions 140 stored in one or more memories 128 and executed by one or more processors 127 performing AI functions and/or AI voice recognition functions) may be resident on an AI device housing 108, and a third-party AI server and/or voice recognition engine may not be utilized. In embodiments, as discussed previously, a computing device housing may contain similar components to an AI device housing as described in FIGS. 1 to 3. - In embodiments, solar cells and/or solar arrays may be mounted on and/or integrated into a shading element or
shade 103. FIG. 1A illustrates a block diagram of components utilized to provide power in an apparatus including an AI device or computing device and a shading system. In embodiments, solar cells and/or solar arrays or photovoltaic (PV) cells 191 may generate solar energy from the sun and convert the solar energy into electrical energy (e.g., voltage and/or current) or electrical power. In embodiments, electrical energy or electrical power generated by one or more solar cells, solar cell arrays and/or PV cells 191 may charge and/or provide power to one or more rechargeable power sources (e.g., a rechargeable battery) 193 in an AI or computing device housing 108 (although a rechargeable battery may be positioned within or located within a shading support 105 and/or shading element 103). In embodiments, one or more charging assemblies 192 may receive electrical energy from one or more solar cells or PV cells 191 and transfer the electrical energy or electrical power to a rechargeable power source 193 or battery. In embodiments, one or more rechargeable power sources or batteries 193 in an AI device housing 108 may provide power to components (e.g., transceivers, sensors, processors, and/or microphones, etc. 194) and/or assemblies 195 (e.g., motors or motor assemblies) in an AI device housing 108, a shading support 105 and/or shading element 103. In embodiments, an AI device or computing device housing 108 may also receive power from an AC power source. In embodiments, although FIG. 1A shows one or more rechargeable batteries 193 providing power to components 194 or assemblies, in alternative embodiments one or more charging assemblies 192 and/or solar cells, solar arrays and/or PV cells 191 may directly or indirectly provide power (e.g., voltage and/or current) to components 194 and/or assemblies 195. - In embodiments, an AI device or computing device housing 108 may comprise one or more sensors. In embodiments, an AI device housing 108 may comprise one or more
environmental sensors 121, one or more directional sensors 122 and/or one or more proximity sensors 123. Although the one or more environmental sensors 121, one or more directional sensors 122 and/or one or more proximity sensors 123 are illustrated as being located on and/or within the AI device housing 108, the sensors identified above may be located on and/or integrated with a shading support 105 and/or a shade element or shade 103. In embodiments, one or more environmental sensors 121 may comprise one or more air quality sensors, one or more UV radiation sensors, one or more digital and/or analog barometers, one or more temperature sensors, one or more humidity sensors, one or more light sensors, and/or one or more wind speed sensors. In embodiments, one or more directional sensors 122 may comprise a digital compass, a compass, a GPS receiver, a gyroscope and/or an accelerometer. - In embodiments, an
environmental sensor 121 may comprise an air quality sensor. In embodiments, an air quality sensor may provide ozone measurements, particulate matter measurements, carbon monoxide measurements, sulfur dioxide measurements and/or nitrous oxide measurements. In embodiments, an air quality sensor may provide allergen measurements. Ozone measurements, for example, enable intelligent readings that tell an individual whether or not to go inside. In embodiments, an air quality sensor may communicate its measurements and/or readings to an AI device or computing device housing processor 127. In embodiments, a processor 127, executing computer-readable instructions 140 stored in memory 128, may receive air quality sensor measurements, analyze the measurements, store the measurements and/or cause AI device and shading system assemblies and/or components to react to air quality measurements. In embodiments, for example, if air quality is too low, e.g., as compared to an existing threshold, one or more processors 127 may communicate commands, instructions and/or signals to an audio system 153 to alert a user of unsafe conditions by reproducing an audible sound on a speaker. In embodiments, for example, ozone measurements from an air quality sensor may be utilized to determine an amount of time an individual should be outside, and this amount of time may be communicated to an individual via a sound system (communicated audibly), via a display and/or monitor (displayed visually), and/or wirelessly to an external computing device. - In embodiments, an AI device housing or computing device housing 108 may comprise an ultraviolet (UV) radiation sensor. In embodiments, a UV radiation sensor may provide discrete radiation band measurements, including, but not limited to, UVB radiation, UVA radiation, infrared light, or a combination of any and all of these radiation measurements.
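The measure-compare-alert pattern described for these environmental sensors — compare a reading against a threshold, raise an alert, and estimate an amount of time an individual should be outside — can be sketched as below. The threshold value and the linear time-outside model are illustrative assumptions, not values from the specification.

```python
def assess_air_quality(ozone_ppb: float, alert_threshold_ppb: float = 70.0) -> dict:
    """Compare an ozone measurement against a threshold and estimate a
    safe amount of time outside (in minutes).

    Both the 70 ppb threshold and the linear decay are assumed, toy values.
    """
    unsafe = ozone_ppb > alert_threshold_ppb
    if unsafe:
        # Shrink the allowed time as ozone rises above the threshold.
        minutes_outside = max(0, int(120 - (ozone_ppb - alert_threshold_ppb) * 2))
    else:
        minutes_outside = 120
    return {"unsafe": unsafe, "minutes_outside": minutes_outside}

print(assess_air_quality(55.0))   # below threshold: no alert
print(assess_air_quality(95.0))   # above threshold: reduced time outside
```

A processor 127 would then route the result to the audio system 153, a display, or an external computing device as described above.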
In embodiments, a UV radiation sensor may communicate these measurements to a
processor 127. In embodiments, a processor 127, and computer-readable instructions 140 executed by the processor 127, may analyze received UV radiation measurements. In embodiments, a processor 127, and computer-readable instructions 140 executed by the processor 127, may utilize received UV radiation measurements to determine and/or calculate an amount of time an individual should be outside, and this amount of time may be communicated to an individual via a sound system 153 and/or 152 (communicated audibly), via a display and/or monitor, and/or wirelessly to an external computing device. - In embodiments, an
environmental sensor 121 in an AI device or computing device housing may comprise a digital barometer. In embodiments, a digital barometer may provide, measure, and/or display complex atmospheric data more accurately and quickly than prior barometers. Many digital barometers display both current barometric readings and previous 1-, 3-, 6-, and 12-hour readings in a bar chart format, much like a barograph. They also account for other atmospheric readings such as wind and humidity to make accurate weather forecasts. In embodiments, a digital barometer may capture atmospheric data measurements and communicate these measurements to a processor 127. In embodiments, for example, computer-readable instructions 140 executed by a processor 127 may receive digital barometer measurements (e.g., altitude measurements), analyze and/or process these measurements, and determine necessary movements or actions for components and/or assemblies of an AI device and shading system 100. In embodiments, for example, computer-readable instructions 140 executed by a processor 127 may receive digital barometer measurements and generate a weather forecast for an area being served by an AI device and shading system 100. - In embodiments, an
environmental sensor 121 may comprise a temperature sensor. In embodiments, a temperature sensor may generate and provide a temperature reading or measurement for an environment where an AI device and shading system 100 is located. In embodiments, a temperature sensor may communicate these measurements to a processor 127. In embodiments, computer-readable instructions 140 executed by a processor 127 may receive temperature measurements, analyze the temperature measurements, and/or determine actions that should be provided to components and/or assemblies of an AI device and shading system. In embodiments, for example, computer-readable instructions executed by a processor may determine and/or calculate an amount of time an individual should be outside, and this amount of time may be communicated to an individual via a sound system 152 or 153 (communicated audibly), via a display and/or monitor, and/or wirelessly to an external computing device. - In embodiments, an environmental sensor may comprise a humidity sensor. In embodiments, a humidity sensor may capture and generate humidity measurements in an environment where an AI device and shading system 100 is located. In embodiments, a humidity sensor may communicate these measurements to a
processor 127. In embodiments, computer-readable instructions 140 executed by a processor may receive humidity measurements, analyze humidity measurements and determine actions that may be taken by components and/or assemblies of an AI device and shading system 100. In embodiments, for example, computer-readable instructions 140 executed by a processor 127 may be utilized to determine and/or calculate an amount of time an individual should be outside, and this amount of time may be communicated to an individual via a sound system (communicated audibly), via a display and/or monitor, and/or wirelessly to an external computing device. In embodiments, computer-readable instructions 140 executable by a processor may receive humidity sensor readings and/or temperature sensor readings and determine that 1) an AI device or computing device housing should be turned off because the environment is too hot or humid, or 2) a shade element 103 should be deployed to provide shade to the AI device or computing device housing. In embodiments, computer-readable instructions 140 executable by a processor 127 may generate commands, instructions and/or signals and communicate the same to a shading element control system (e.g., a motor controller, a motor and/or driving system) to deploy a shade element 103. - In embodiments, an
environmental sensor 121 may comprise a wind sensor. In embodiments, a wind speed sensor may capture wind speed and/or wind direction and generate wind speed and/or wind direction measurements at an AI device and shading system. In embodiments, a wind sensor may communicate these measurements to a processor 127. In embodiments, computer-readable instructions 140 executable by a processor 127 may receive wind speed measurements, analyze and/or process these measurements, and determine necessary actions and/or movements by components and/or assemblies of an AI device and shading system 100. In embodiments, computer-readable instructions 140 executable by a processor 127 may communicate commands, signals, and/or instructions to a shading element control system (e.g., a motor controller, a motor and/or driving system) to retract a shade element 103 due to high wind conditions. In embodiments, for example, if a wind speed is higher than a predetermined threshold, computer-readable instructions 140 executable by a processor 127 may communicate commands, instructions, and/or signals to one or more motor controllers to cause a shading element to be retracted and moved to a rest position. - In embodiments, an AI device or computing device 100 may comprise one or more digital cameras or imaging devices and/or
analog imaging devices 126. In embodiments, one or more cameras 126 may comprise an optical system and/or an image generation system. In embodiments, imaging devices 126 may display images and/or videos on a screen immediately after being captured. In embodiments, one or more imaging devices 126 may store and/or delete images, sound and/or video from a memory associated with an imaging device 126. In embodiments, one or more imaging devices 126 may capture and/or record moving videos with or without sound. In embodiments, one or more imaging devices 126 may also incorporate computer-readable and computer-executable instructions which, when retrieved from a non-volatile memory, loaded into a memory, and executed by a processor, may crop and/or stitch pictures, and/or potentially perform other image editing on captured images and/or video. For example, image stitching or photo stitching is the process of combining multiple photographic images with overlapping fields of view to produce a segmented panorama and/or high-resolution image. In embodiments, image stitching may be performed through the use of computer software embodied within an imaging device 126. In embodiments, an imaging device 126 may also internally perform video stitching. In embodiments, other devices, components and/or assemblies of imaging devices 126 or of an AI device housing 108 may perform image stitching, video stitching, cropping and/or other photo editing. In embodiments, computer-readable instructions 140, executable by a processor 127 in an AI device housing 108, may perform image stitching, video stitching, cropping and/or other photo editing. - In embodiments, imaging devices 126 (e.g., digital cameras) may capture images of an area around, surrounding, and/or adjacent to AI devices with a shading system 100. In embodiments, an AI device housing 108 may comprise one or more imaging devices 126 (e.g., cameras) mounted thereon or integrated therein. In embodiments, a
shading support 105 and/or a shade element 103 may comprise one or more imaging devices 126 (e.g., cameras). In embodiments, an AI device and shading system with more than one imaging device 126 may allow image, video and/or sound capture for up to 360 degrees of an area surrounding, around and/or adjacent to an AI device or computing device and shading system 100. In embodiments, computer-readable instructions 140 executable by a processor 127 may stitch and/or combine images and/or videos captured by one or more imaging devices 126 to provide a panoramic image of the area. Having multiple imaging devices provides the benefit of panoramic image capture, rather than capture of just the area where an imaging device is initially oriented. In embodiments, one or more imaging devices 126 may have one or more image capture resolutions (e.g., 1 Megapixel (MP), 3 MP, 4 MP, 8 MP, 13 MP and/or 38 MP) that are selectable and/or adjustable. In embodiments, one or more imaging devices may also be located on a top portion of a shading element 103 and/or shading support 105. In embodiments, if an imaging device 126 is located on a top portion of an AI device with shading system 100 (e.g., a shading element 103 and/or shading support 105), images, sounds and/or videos may be captured at a higher level than ground level. In addition, an imaging device located on a top portion of an AI device and shading system may capture images, sounds, and/or videos of objects in a sky or just of a horizon or sky. For example, in embodiments, an imaging device 126 located on a top portion may capture images of mountains and/or buildings that are in a skyline. This may be beneficial in situations where there is a fire in the mountains, an issue with a building, or someone wants to monitor certain aspects of a building (e.g., if certain lights are on).
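The stitching idea above — merging captures with overlapping fields of view into one panorama — can be illustrated with a toy one-dimensional analogue. Real image stitching matches 2-D features across frames (e.g., via a library such as OpenCV); here, "images" are just lists of scene labels and the overlap is found by exact matching. The function name and approach are illustrative only.

```python
def stitch(left, right, min_overlap=1):
    """Combine two overlapping captures into one panorama-like sequence.

    Finds the longest suffix of `left` matching a prefix of `right` and
    merges on it; falls back to plain concatenation if nothing overlaps.
    """
    for size in range(min(len(left), len(right)), min_overlap - 1, -1):
        if left[-size:] == right[:size]:
            return left + right[size:]
    return left + right  # no overlap found

# Two "captures" from adjacent imaging devices sharing a region of view.
a = ["wall", "tree", "gate"]
b = ["tree", "gate", "bench"]
print(stitch(a, b))
```

The same merge, applied pairwise around the housing, yields the up-to-360-degree coverage described above.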
Further, one or more imaging devices 126 located on a top portion of an AI device with shading system may capture images, sounds, and/or videos of a night-time sky (e.g., stars). In addition, one or more imaging devices 126 located on a top portion of an AI device with shading system 100 may capture images, sounds, and/or videos of objects moving and/or flying in the sky and/or horizon. - In embodiments, one or
more imaging devices 126 may be activated by messages, signals, instructions and commands. In embodiments, components and/or assemblies of an AI device and shading system 100 (e.g., a processor 127, computer-readable instructions 140 executed by a processor 127, and/or a proximity sensor 123) may communicate messages, signals, instructions and/or commands to the one or more imaging devices 126 to activate, turn on, change modes, turn off, change focus and/or change capture image resolution. In addition, messages, signals, instructions, and/or commands may activate one or more imaging devices 126, and software stored therein may perform image stitching, video stitching, image editing and/or cropping. In embodiments, a processor 127 and/or wireless transceiver 130-132 in an AI device with shading system 100 may communicate messages, signals, instructions and/or commands to activate one or more imaging devices in order to perform functions and/or features described above (which may include security system functions). In embodiments, a computing device, separate from an AI device with shading system 100, may communicate messages, signals, instructions and/or commands to activate one or more imaging devices in order to perform functions and/or features described above. - In embodiments, one or
more imaging devices 126 may communicate captured images, sounds and/or videos to a processor 127 of an AI shading device, and these images, sounds and/or videos may be stored in one or more memories 128 of an AI shading device. In embodiments, one or more imaging devices 126 may communicate captured images, sounds and/or videos to a memory of a remote computing device separate from a processor and/or controller 127 in an AI shading device housing 108. In embodiments, for example, one or more imaging devices 126 may communicate captured images, sounds and/or videos to an external computing device (directly for storage and/or streaming). In embodiments, one or more imaging devices 126 may communicate captured images, sounds, and/or videos utilizing wired (e.g., utilizing Ethernet, USB, or similar protocols and transceivers) and/or wireless communication protocols (e.g., utilizing 802.11 wireless communication protocols and transceivers). - In embodiments, an AI device or computing device housing 108 may comprise one or more
imaging devices 126 and an infrared detector. In embodiments, an infrared detector may comprise one or more infrared light sources and an infrared sensor. In embodiments, an infrared detector may generate a signal indicating that an object is located within an area being monitored or viewed by an infrared detector. In embodiments, if an infrared detector generates a signal indicating that an object (and/or individual) is present, one or more imaging devices 126 may be activated and begin to capture images and/or video, with or without sound, and communicate captured images and/or video, with or without sound, to a separate computing device and/or a processor 127. In embodiments, if an infrared detector generates a signal indicating that an object (and/or individual) is present, a lighting assembly (e.g., LED lights) may also be activated and lights may be directed to an area surrounding an AI device and shading system 100 and/or directly to an area where an object is detected. In embodiments, one or more imaging devices 126 and/or one or more lighting assemblies may be activated, which results in better images and/or video of an area surrounding an AI device and shading system 100. This is yet another example of how an AI device and shading system provides additional benefits, not only capturing images of its surrounding area but also being utilized as a security device for an environment in which an intelligent shading object is located. - In embodiments, an AI device or computing housing 108 may comprise one or
more imaging devices 126 which may be thermal imaging cameras. In embodiments, thermal imaging cameras may include a special lens, an infrared light, and an array of infrared-detector elements. In embodiments, an AI device and shading system 100 may comprise an infrared light, a lens and a phased array of infrared-detector elements. In embodiments, a thermal imaging camera may comprise a special lens that focuses infrared light emitted by all objects within an area surrounding and/or adjacent to an AI device/computing device and shading system 100. In embodiments, the focused light may be scanned by a phased array of infrared-detector elements. In embodiments, one or more detector elements may generate a very detailed temperature pattern, which may be referred to as a thermogram. In embodiments, a detector array may take a short amount of time (e.g., about one-thirtieth of a second) to obtain temperature information to make a thermogram. In embodiments, information may be obtained from a plurality of points in a field of view of a detector array. In embodiments, detector elements of a thermogram may be converted and/or translated into electric impulses, and the electrical impulses may be sent to a signal-processing unit. In embodiments, a signal-processing unit may be a PCB with a dedicated chip that translates received information (electrical impulses) into thermal images and/or thermal video. In embodiments, a signal-processing unit may communicate thermal images and/or thermal video to a display (e.g., a display and/or a display on a computing device communicating with an AI device and shading system 100). In embodiments, a signal-processing unit of a thermal imaging camera may communicate thermal images and/or thermal video to a processor for analysis, storage and/or retransmission to external computing devices. In embodiments, a thermal image may appear as various colors depending on and/or corresponding to an intensity of an infrared image.
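The last step above — a thermogram rendered as colors corresponding to infrared intensity — can be sketched as a simple temperature-to-palette mapping. The palette, temperature range, and bucket boundaries below are illustrative assumptions, not values from the specification.

```python
def thermogram_to_colors(temps_c, t_min=15.0, t_max=40.0):
    """Translate a thermogram (grid of temperatures in Celsius) into
    display colors, mirroring the signal-processing step described above.

    Each detector element's temperature is normalized into [0, 1] and
    mapped to a color bucket by intensity (assumed palette and range).
    """
    palette = ["blue", "green", "yellow", "orange", "red"]
    colors = []
    for row in temps_c:
        out_row = []
        for t in row:
            frac = (t - t_min) / (t_max - t_min)
            frac = min(max(frac, 0.0), 1.0)          # clamp to [0, 1]
            idx = min(int(frac * len(palette)), len(palette) - 1)
            out_row.append(palette[idx])
        colors.append(out_row)
    return colors

# A 2x2 thermogram: cool background with one warm spot (e.g., a person).
print(thermogram_to_colors([[18.0, 19.0], [20.0, 36.5]]))
```

A real signal-processing unit would perform the equivalent mapping in hardware at frame rate (e.g., the one-thirtieth of a second noted above).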
In embodiments, a thermal imaging camera provides the additional benefit of not having to activate a lighting assembly in order to capture images and/or videos of an area surrounding an AI device/computing device and system 100. In addition, by not activating a lighting assembly, an intruder or moving object may not be aware that an imaging device 126 may be capturing an image or video of an area where the intruder or object is located. In embodiments, an infrared detector may activate a thermal imaging device upon detection of movement. In embodiments, a thermal imaging device may activate on its own due to movement of an intruder and/or object, or may periodically or continuously capture images and/or video. - In embodiments, an AI device or computing device and shading system 100 may comprise a
proximity sensor 123. In embodiments, a proximity sensor 123 may be able to detect a presence of nearby objects (e.g., people or other physical objects) without any physical contact between a sensor and an object. In embodiments, a proximity sensor 123 may be located on and/or mounted on an AI device housing 108. In embodiments, a proximity sensor 123 may be located on and/or mounted on other printed circuit boards or may be a standalone component. In embodiments, a proximity sensor 123 may be located within and/or mounted on a shading support 105 and/or a shading element 103. In embodiments, a proximity sensor 123 may generate measurements and/or signals, which may be communicated to a processor/controller 127. In embodiments, computer-readable instructions 140, which are fetched from memory 128 and executed by a processor 127, may perform and/or execute a proximity process or method. In embodiments, for example, a proximity process may comprise receiving measurements and/or signals from a proximity sensor 123 indicating an object and/or person may be located in an area where an AI device and shading system is deployed, going to be deployed and/or extended, and/or towards where a component of an AI device and shading system 100 may be moving. For example, if an individual is located in an area where a shading support 105 may be deployed and/or extended, a proximity sensor 123 may transmit a signal or measurement indicating an object may be an obstruction to movement of a shading support 105. In embodiments, computer-readable instructions 140 executable by a processor 127 may receive and/or analyze a proximity measurement and determine an object may be an obstacle. In embodiments, a proximity signal and/or command may also identify a location of an object (e.g., obstacle) in relation to a proximity sensor 123 and/or some reference location.
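The proximity process described above — check distance readings along the path of motion and refuse to deploy when an obstruction is detected — can be sketched as follows. The clearance distance and function names are assumed for illustration.

```python
def safe_to_deploy(proximity_readings_cm, clearance_cm=50.0):
    """Decide whether a shading support may deploy, given proximity
    sensor distance readings (in cm) along its path of motion.

    Returns (ok, blocking): `ok` is False when any reading falls inside
    the clearance zone, and `blocking` lists those readings. The 50 cm
    clearance is an assumed parameter, not a value from the specification.
    """
    blocking = [d for d in proximity_readings_cm if d < clearance_cm]
    return (len(blocking) == 0, blocking)

ok, blocking = safe_to_deploy([120.0, 85.0, 30.0])
print(ok, blocking)   # an object at 30 cm blocks deployment
```

On a False result, the processor would withhold (or cancel) the driving signal to the motor controller, as described in the surrounding text.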
In embodiments, computer-readable instructions 140 executable by a processor 127 may generate and/or communicate a driving signal, command, and/or instruction that instructs an AI device and shading system 100 not to deploy and/or open. In embodiments, this may also work in the opposite direction, where if a proximity sensor 123 does not determine that an object is within an AI device and shading system area, then a proximity sensor signal may not be communicated to the processor/controller 127. - In embodiments, a
proximity sensor 123 may identify a location of a person relative to moving components of an AI device or computing device and shading system 100. Utilization of proximity sensors 123 on AI devices and shading systems provides an advantage over conventional AI devices due to detection of objects, individuals, animals and/or other devices. For example, based on proximity sensor measurements, detections and/or values, an AI device and shading system 100 may move a position of one or more assemblies or modules (e.g., shading support, shading element, and/or other components) to prevent problematic conditions or situations where objects and/or individuals may damage components and/or assemblies of an AI device and shading system. For example, based on proximity sensor 123 measurements or values, a shading element or shading support may be retracted. - In embodiments,
proximity sensors 123 may comprise one or more laser sensors, light sensors, line of sight sensors, ultrasound or ultrasonic sensors, infrared or other light spectrum sensors, radiofrequency sensors, time of flight sensors, and/or capacitive sensors. In embodiments, a proximity sensor 123 may emit an electromagnetic field or a beam of electromagnetic radiation (infrared, for instance), and may measure changes in a field surrounding an object or measure changes in a return signal. In embodiments, a laser sensor may comprise through-beam sensors, retro-reflective sensors and/or diffuse reflection sensors. In embodiments, returned laser light may be measured against an original signal to determine if an object and/or person is present. In embodiments, laser light may consist of light waves of the same wavelength with a fixed phase ratio (coherence), which results in laser systems having an almost parallel light beam. Thus, movements may be detected via small angles of divergence in returned laser light. In embodiments, a light or photoelectric sensor may be utilized as a proximity sensor 123 and may transmit one or more light beams and may detect if any return reflected light signals are present. In embodiments, a photoelectric sensor may be a diffuse and/or retro-reflective sensor. In embodiments, diffuse sensor emitters and receivers may be located in a same housing. In embodiments, a target may act as a reflector, so that detection may occur if light is reflected off a disturbance object. In embodiments, an emitter sends out a beam of light (most often a pulsed infrared, visible red, or laser) that diffuses in all directions, filling a detection area. In embodiments, a target may enter an area and may deflect part of a beam back to a receiver.
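Several of the sensor types listed above (ultrasonic and time-of-flight sensors in particular) reduce distance measurement to half the round-trip travel time multiplied by the wave speed. A minimal sketch, with the constants and names as assumptions:

```python
# Illustrative time-of-flight distance calculation (an assumption for
# illustration, not taken from the patent text): distance is half the
# round-trip travel time times the wave speed.

SPEED_OF_SOUND_M_S = 343.0            # approximate speed of sound in air at 20 C
SPEED_OF_LIGHT_M_S = 299_792_458.0    # optical ToF sensors use the same relation

def tof_distance_m(round_trip_s, wave_speed_m_s):
    """Distance to a target from a round-trip time-of-flight measurement."""
    return wave_speed_m_s * round_trip_s / 2.0
```

For example, an ultrasonic echo returning after 10 ms implies a target roughly 1.7 m away.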
In embodiments, a photoelectric sensor may detect a target and an output signal may be turned on or off (depending upon whether a photoelectric sensor is light-on or dark-on) when sufficient light falls on a receiver of a photoelectric sensor. - In embodiments, a
proximity sensor 123 may be an inductive sensor which may detect movements in metallic and/or ferrous objects. In embodiments, inductive sensors may detect ferrous targets, for example, a metal (e.g., steel) thicker than one millimeter. In embodiments, a proximity sensor 123 may be a capacitive sensor. In embodiments, a capacitive sensor may detect both metallic and/or non-metallic targets in powder, granulate, liquid, and solid form. In embodiments, a proximity sensor 123 may be an ultrasonic sensor. In embodiments, an ultrasonic diffuse proximity sensor may employ a sonic transducer, which emits a series of sonic pulses, then listens for their return from a reflecting target. In embodiments, once a reflected signal is received, sensor signals may be output to a control device. In embodiments, an ultrasonic sensor may emit a series of sonic pulses that bounce off fixed, opposing reflectors, which may be any flat surface. In embodiments, sound waves may return to a sensor within a user-adjusted time interval and, if sound waves do not, an object may be obstructing an ultrasonic sensing path and an ultrasonic sensor may output signals accordingly. In embodiments, a proximity sensor 123 may be a time of flight sensor. In embodiments, time of flight optical sensors may determine displacement and distance by measuring a time it takes light to travel from an object (e.g., an intelligent shading system) to a target and back. In embodiments, a time of flight sensor may be a time of flight camera, which is a range imaging camera. In embodiments, a time-of-flight camera (ToF camera) may resolve distance based on the speed of light, by measuring a time-of-flight of a light signal between a camera and a subject and/or target for each point of an image. - In embodiments, an AI device or computing device housing 108 may comprise one or more
directional sensors 122. In embodiments, a directional sensor 122 may also comprise a GPS transceiver, a compass, a magnetometer, a gyroscope and an accelerometer. In embodiments, a shading support 105 and/or a shading element 103 may comprise one or more directional sensors (e.g., a GPS transceiver, a compass, a gyroscope and an accelerometer). In embodiments, directional sensors may provide orientations and/or locations of an AI device and shading system 100 as well as different components of an AI device and shading system 100. In embodiments, computer-readable instructions 140 executable by a processor 127 may request an initial desired orientation for different assemblies and/or components of an AI device and shading system and communicate such a directional request to one or more directional sensors 122. In embodiments, one or more gyroscopes may be utilized to determine, calculate and/or detect an angle of a support assembly 105 with respect to an AI device housing 108 and/or detect an angle of a support assembly 105 with respect to a shading element 103 (e.g., determine a current elevation of different assemblies of an AI device and shading system 100). In embodiments, one or more accelerometers may also be utilized along with one or more gyroscopes to determine, calculate and/or detect angles discussed above. - In embodiments, computer-
readable instructions 140 executed by a processor 127 may communicate a directional request to one or more directional sensors 122. In embodiments, one or more directional sensors 122 (e.g., a compass and/or magnetometer) may determine movement and/or a relative position of an AI device with shading system 100 (or other components or assemblies) with respect to a reference direction. In embodiments, for example, a directional measuring sensor 122 (e.g., a compass, digital compass and/or magnetometer) may determine relative movement and/or a relative position with respect to true north. In embodiments, these measurements may be referred to as heading measurements. In embodiments, a directional measuring sensor 122 may communicate and/or transfer heading measurements to a processor 127, where these heading measurements may be stored in a memory 128. - In embodiments, in response to a directional orientation request by computer-
readable instructions 140 executed by a processor 127, a GPS transceiver may measure a geographic location of an AI device and shading system 100 (and associated assemblies) and may communicate such a geographic location measurement to a processor 127, which may transfer these measurements into a memory 128. In embodiments, a GPS transceiver may determine latitude and/or longitude coordinates and communicate such latitude and/or longitude coordinates to a processor 127. In embodiments, a clock may capture a time of day and communicate and/or transfer such a time measurement to a processor, which may store the time measurement in a memory 128. - In embodiments, computer-
readable instructions 140 executed by a processor 127 and stored in a memory 128 may include algorithms and/or processes for determining and/or calculating a desired azimuth and/or orientation of an AI device and shading system (and associated assemblies) depending on a time of day. In an alternative embodiment, a portable computing device executing computer-readable instructions on a processor (e.g., a SMARTSHADE software app) and located in a vicinity of an AI device and shading system 100 may retrieve coordinates utilizing a mobile computing device's GPS transceiver, may retrieve a time from a mobile computing device's processor clock, and may provide these geographic location measurements and/or time to a processor 127 in an AI shading housing 108. - In embodiments, computer-
readable instructions 140 stored in a memory 128 may be executed by a processor 127 and may calculate a desired AI device and shading system 100 (and associated assemblies such as shading support 105 and/or shading element 103) angle and/or azimuth angle utilizing received geographic location measurements, heading measurements, and/or time measurements. In embodiments, computer-readable instructions 140 stored in a memory 128 may compare 360 desired elevation angle measurements and azimuth angle measurements to a current elevation angle and azimuth angle of the AI device and shading system 100 (and associated assemblies such as shading support 105 and/or shading element 103) (calculated from gyroscope measurements, accelerometer measurements, and/or both) to determine movements that a shading support 105 and/or shading element 103 may make in order to move to a desired orientation. In embodiments, executed computer-readable instructions may calculate an azimuth adjustment measurement to provide to an azimuth motor and/or an elevation adjustment measurement to provide to a motor assembly. - In embodiments, an AI device or computing device housing 108 may comprise one or
more microphones 129 to capture audio and/or audible or voice commands spoken by users and/or operators of shading systems 100. In embodiments, computer-readable instructions 140 executed by one or more processors 127 may receive captured sounds and create analog and/or digital audio files corresponding to spoken audio commands (e.g., open shading system, rotate shading system, elevate shading system, select music to play on shading system, turn on lighting assemblies). In embodiments, an AI API 141 may communicate such generated audio files to an external AI server 150. In embodiments, for example, an AI API 141 in an AI shading device housing 108 may communicate generated audio files to external AI servers 150 via and/or utilizing one or more PAN transceivers 130, one or more wireless local area network transceivers 131, and/or one or more cellular transceivers 132. In other words, communications with an external AI server 150 may occur utilizing PAN transceivers 130 (and protocols). Alternatively, or in combination, communications with an external AI server 150 may occur utilizing a local area network (802.11 or WiFi) transceiver 131. Alternatively, or in combination, communications with an external AI server 150 may occur utilizing a cellular transceiver 132 (e.g., utilizing 3G and/or 4G or other cellular communication protocols). In embodiments, an AI shading device housing 108 may utilize or comprise more than one microphone 129 to allow capture of voice commands from a number of locations and/or orientations with respect to an AI device and shading system 100 (e.g., in front of, behind an AI device and shading system, and/or at a 45 degree angle with respect to a support assembly 105). - In embodiments, a
mobile computing device 110 may communicate with an AI device or computing device and shading system 100. In embodiments, a user and/or operator may communicate with a mobile computing or communications device 110 by a spoken command into a microphone of a mobile computing device 110. In embodiments, a mobile computing or communications device 110 communicates a digital or analog audio file to a processor 127 and/or AI API 141 in an AI shading device housing 108 (e.g., utilizing one or more transceivers (e.g., PAN transceiver 130, wireless or WiFi transceiver 131, and/or cellular transceiver 132)). In embodiments, a mobile computing or communications device 110 may also convert the audio file into a textual file for easier conversion by either an AI API 141 or an AI engine in an external AI server or computing device 150. In embodiments, an AI engine may also be resident within one or more memories 128 of an AI shading device housing 108 (e.g., computer-readable instructions 140 executed by a processor 127). -
FIG. 1 describes an AI device or computing device and shading system 100 having a shading element or shade 103, a shading support 105 and/or an AI shading device housing 108. An AI shading device housing 108 such as the one described above may be attached to any shading system and may provide artificial intelligence functionality and services for such shading systems. In embodiments, a shading system may be an autonomous and/or automated shading system having an integrated computing device, sensors and other components and/or assemblies, but may benefit from having and may have artificial intelligence functionality and services provided utilizing an AI API and/or an AI engine stored in a memory of an AI device or computing device housing. - In embodiments, an AI device or computing device housing may comprise an
audio transceiver 153 and/or a sound reproduction device 152 (e.g., a speaker). In embodiments, audio files (e.g., digital and/or analog audio files) may be communicated to an audio transceiver 153 and further to a sound reproduction device 152 for audible reproduction. Thus, communications from an AI engine (e.g., feedback commands and/or instructions) may be communicated to a transceiver 153 and/or speaker for audible feedback. In embodiments, music and/or audio files communicated from an external server and/or from local memory may be communicated to an audio transceiver 153 and/or speaker 152 for reproduction to a user and/or operator. -
FIG. 5 illustrates a block and dataflow diagram of communications between an AI device or computing device and shading system according to embodiments. An AI device or computing device and shading system 570 may communicate with an external AI server 575 and/or additional content servers 580 via wireless and/or wired communications networks. In embodiments, a user may speak 591 a command (e.g., turn on lights, or rotate shading system) which is captured as an audio file and received at an AI device or computing device and shading system 570. In embodiments, an AI API 541 in an AI device and shading system 570 may communicate and/or transfer 592 an audio file (utilizing a transceiver: PAN, WiFi/802.11, or cellular) to an external or third-party AI server 575. In embodiments, an external AI server 575 may comprise a voice recognition engine or module 585, a command engine module 586, a third party content interface 587 and/or a third party content formatter 588. In embodiments, an external AI server 575 may receive 592 one or more audio files and a voice recognition engine or module 585 may convert a received audio file to a device command (e.g., shading system commands, computing device commands) and communicate 593 device commands to a command engine module or engine 586. In embodiments, if a voice command is for operation of an AI device and shading system 570, a command engine or module 586 may communicate and/or transfer 594 a generated command, message, and/or instruction to an AI device and shading system 570. In embodiments, an AI device and shading system 570 may receive the communicated command, and communicate and/or transfer 595 the communicated command to a controller/processor 571.
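The FIG. 5 dataflow above (audio file to voice recognition engine 585, then command engine 586, then controller/processor 571) can be sketched as a minimal routing function. The command vocabulary, transcription shortcut, and names here are assumptions for illustration:

```python
# Hedged sketch of the FIG. 5 command dataflow: a spoken command is converted
# to text and routed either to the shading device itself or to a third-party
# content query. All names and the command set are assumptions.

DEVICE_COMMANDS = {"open shading system", "rotate shading system", "turn on lights"}

def recognize(audio_text):
    """Stand-in for the voice recognition engine/module 585: assumes speech has
    already been transcribed, and simply normalizes the text."""
    return audio_text.strip().lower()

def route_command(audio_text):
    """Stand-in for the command engine/module 586: route device commands to the
    controller/processor, everything else to third-party content servers."""
    command = recognize(audio_text)
    if command in DEVICE_COMMANDS:
        return ("device", command)          # transferred to controller/processor 571
    return ("third_party_query", command)   # e.g., traffic, music, e-commerce
```

In this sketch, the later third-party flow (traffic, playlists, e-commerce) is the fall-through branch.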
In embodiments, the controller/processor 571 may generate 596 a command, message, signal and/or instruction to cause an assembly, component, system or devices 572 to perform an action requested in the original voice command (open or close shade element, turn on camera and/or sensors, activate solar panels). - In embodiments, a user may request actions to be performed utilizing an AI device or computing device and shading system's microphones and/or transceivers that may require interfacing with third party content servers (e.g., NEST, an e-commerce site selling sun care products, an e-commerce site selling parts of AI devices and shading systems, communicating with online digital music stores (e.g., iTunes), home security servers, weather servers and/or traffic servers). For example, in embodiments, an AI device or computing device and shading system user may request 1) traffic conditions from a third party traffic server; 2) playing of a playlist from a user's digital music store accounts; 3) ordering a replacement skin and/or spokes/blades arms for a shading system. In these embodiments, additional elements and steps may be added to the previously described method and/or process.
- For example, in embodiments, a user may speak 591 a command or desired action (execute playlist, order replacement spokes/blades, and/or obtain traffic conditions from a traffic server) which is captured as an audio file and received at an AI API 541 stored in one or more memories of an AI device or
computing device housing 570. As discussed above, in embodiments, an AI API 541 may communicate and/or transfer 592 an audio file utilizing a shading system's transceiver to an external AI server 575. In embodiments, an external AI server 575 may receive one or more audio files and a voice recognition engine or module 585 may convert 593 a received audio file to a query request (e.g., a traffic condition request, an e-commerce order, or a request to retrieve and stream a digital music playlist). - In embodiments, an
external AI server 575 may communicate and/or transfer 597 a query request to a third party server (e.g., a traffic conditions server (e.g., SIGALERT or Maze), or an e-commerce server (e.g., a RITE-AID or SHADECRAFT SERVER, or an Apple iTunes SERVER)) to obtain third party goods and/or services. In embodiments, a third party content server 580 (a communication and query engine or module 581) may retrieve 598 services from a database 582. In embodiments, a third party content server 580 may communicate services queried by the user (e.g., traffic conditions or digital music files to be streamed) 599 to an external AI server 575. In embodiments, a third party content server 580 may order requested goods for a user and then retrieve and communicate 599 a transaction status to an external AI server 575. In embodiments, a content communication module 587 may receive communicated services (e.g., traffic conditions or streamed digital music files) or transaction status updates (e.g., e-commerce receipts) and may communicate 601 the requested services (e.g., traffic conditions or streamed digital music files) or the transaction status updates to an AI device or computing device and shading system 570. Traffic services may be converted to an audio signal, and an audio signal may be reproduced utilizing an audio system 583. Digital music files may be communicated and/or streamed directly to an audio system 583 because no conversion is necessary. E-commerce receipts may be converted and communicated to a speaker 583 for reading aloud. E-commerce receipts may also be transferred to a computing device in an AI device or computing device and shading system 570 for storage and utilization later. - In embodiments, computer-readable instructions in a memory module of an AI device or computing device and
shading system 570 may be executed by a processor and may comprise a voice recognition module or engine 542, and in this embodiment, voice recognition may be performed at an AI device or computing device and shading system 570 without utilizing a cloud-based server. In embodiments, an AI device and shading system 570 may receive 603 the communicated command, and communicate and/or transfer 604 the communicated command to a controller/processor 571. In embodiments, the controller/processor 571 may generate and/or communicate 596 a command, message, signal and/or instruction to cause an assembly, component, system or device 572 to perform an action requested in the original voice command. -
FIG. 2 illustrates an apparatus including an AI device or computing device and shading system with adjustable shading supports according to embodiments. In embodiments, an AI and shading system 200 comprises a shading element or a plurality of shading elements 203, one or more shading supports 205 and/or an AI device housing 208. In embodiments, an AI device or computing device housing 208 may comprise an upper body 212 and a base assembly 211. In embodiments, an AI device housing 208 may comprise a microphone and/or LED array 215. In embodiments, an AI device or computing device housing 208 may comprise one or more processors 227, one or more PAN transceivers 230, one or more WiFi or 802.11 transceivers 231, and/or one or more cellular transceivers 232 (the operations of which are described above with respect to FIG. 1). In addition, an AI device housing 208 (and/or an AI device and shading system 200) may also include sensors (similar to directional sensors 122, environmental sensors 121 and/or proximity sensors 123 of FIG. 1), an audio receiver and speaker, and a computing device, although these components and/or assemblies are not shown or illustrated in FIG. 2. - In embodiments, an
AI device housing 208 may comprise one or more audio transceivers 243 and one or more speakers 229. In embodiments, audio files, music files, and/or voice files may be communicated to one or more audio transceivers 243 and/or one or more speakers 229 for audio playback. In embodiments, one or more speakers 229 may be a speaker line array where speakers are located at least on each side of an AI device housing to provide sound coverage on each side of an AI device or computing device housing 208 according to embodiments. - In embodiments, a microphone and/or LED array 215 may provide sound capture and/or lighting on each side or a number of sides of an AI device housing. In embodiments, as is illustrated in
FIG. 2, a microphone and/or LED array may be positioned above a base assembly 211 of an AI device housing. FIG. 4 illustrates a microphone and/or LED array in an AI device housing or computing device housing according to embodiments. In embodiments, a microphone and/or LED array 400 may comprise a plastic housing 405, one or more flexible printed circuit boards (PCBs) or circuit assemblies 410, one or more LEDs or LED arrays 415 and/or one or more microphones and/or microphone arrays 420. In embodiments, a plastic housing 405 may be oval or circular in shape. In embodiments, a plastic housing 405 may be fitted around a shaft, a post and/or a tube in an AI device housing 208. In embodiments, a plastic housing 405 may be adhered to, connected to and/or fastened to a shaft, a post and/or a tube. In embodiments, a flexible PCB or housing 410 may be utilized to mount and/or connect electrical components and/or assemblies such as LEDs 415 and/or microphones 420. In embodiments, a flexible PCB or housing 410 may be mounted, adhered or connected to a plastic housing or ring 405. In embodiments, a flexible PCB or housing 410 may be mounted, adhered or connected to an outer surface of a plastic housing or ring 405. In embodiments, a plastic housing or ring 405 may have one or more waterproof openings 425 for venting heat from one or more microphone arrays 420 and/or one or more LED arrays 415. In embodiments, a plastic housing or ring 405 may have one or more waterproof openings for keeping water away and/or protecting one or more microphone arrays 420 and/or one or more LED arrays 415 from moisture and/or water. In embodiments, one or more LED arrays 415 may be mounted and/or connected on an outer surface of a flexible PCB strip 410 and may be positioned at various locations on the flexible PCB 410 to provide lighting in areas surrounding a shading and AI system. In embodiments, one or more LED arrays may be spaced at uniform distances around a plastic housing 405 (e.g., a ring housing).
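Uniform spacing of LEDs or microphones around a circular ring housing, as described above, amounts to dividing the circle into equal angular steps. A small illustrative sketch (element count and radius are assumptions):

```python
# Illustrative only: place N elements (microphones 420 or LEDs 415) at uniform
# angular spacing around a circular ring housing 405. Count and radius are assumptions.

import math

def ring_positions(count, radius):
    """(x, y) coordinates of `count` elements spaced uniformly around a ring."""
    step = 2 * math.pi / count
    return [(radius * math.cos(i * step), radius * math.sin(i * step))
            for i in range(count)]
```

With four microphones on a unit-radius ring, elements land at 0, 90, 180, and 270 degrees, covering sound arriving from any direction.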
In embodiments, one or more microphones or microphone arrays 420 may be mounted and/or connected to a flexible PCB strip 410. In embodiments, one or more microphones or microphone arrays 420 may be positioned at one or more locations around a housing or ring 405 to be able to capture audible sound and/or voice commands coming from a variety of directions. In embodiments, one or more microphones or microphone arrays 420 may be spaced at set and/or uniform distances around a housing and/or ring 405. - Referring back to
FIG. 2, in embodiments, a base assembly 211 may be stationary and an AI device or computing device housing or body 212 may rotate about a base assembly 211. FIG. 6 illustrates a block diagram of components and assemblies for rotating an AI device housing or computing device housing body about a base assembly. In embodiments, as illustrated in FIG. 6, a base assembly 211 may comprise a motor 610, a motor controller 611, a shaft or driving assembly 612 and/or a gearing assembly 613. In embodiments, an AI device or computing device housing body 212 may comprise a gearing assembly 614 and/or a connector 615. In embodiments, in response to a command and/or instruction being received by a motor controller 611, a motor controller 611 may communicate a command and/or signal to a motor 610. In response, a motor 610 may be activated and may cause rotation of a shaft or driving assembly 612, which is connected to a gearing assembly 613. In embodiments, rotation of a shaft or driving assembly 612 may cause rotation of a gearing assembly 613. In embodiments, a gearing assembly 613 in a base assembly 211 may cause rotation of a gearing assembly 614 in an AI device housing body 212, which is connected and/or coupled to a connector or plate 615 in an AI device or computing device housing 208. In embodiments, rotation of a gearing assembly 614 and/or a connector or plate 615 may cause rotation of the AI device or computing device housing 208 about a base assembly 211. This provides an advantage over other prior art devices because the AI device or computing device housing 208 may move to follow and/or track a sun and thus the shading element or shade 203 may be able to provide protection from the sun and/or heat by moving and/or tracking the sun. Although FIG.
6 illustrates a motor controller 611, a motor 610, a driving assembly or shaft 612 and/or a gearing assembly 613 in a base assembly 211, and a gearing assembly 614 and/or a connector or plate 615 in an AI device body 212, any of the components may be placed in or be resident in the other assembly (e.g., different components (e.g., gearing assembly 614 and/or a connector or plate 615) may be placed and/or positioned in a base assembly 211 and other components (e.g., motor controller 611, a motor 610, a driving assembly or shaft 612 and/or a gearing assembly 613) may be placed and/or positioned in an AI device body 212). In either configuration, an AI device or computing device body 212 may rotate about a base assembly 211, and this may provide additional flexibility in providing protection from the sun and other environmental conditions for the AI device body 212. In embodiments, the description above and the components and assemblies utilized in FIGS. 2 and 6 which allow rotation of a base assembly 211 with respect to an AI device housing 208 may also be utilized in a rotation assembly in FIG. 1 (rotation of a support assembly or shading support 105 with respect to an AI or computing device housing 108). In addition, the rotation of a base assembly 211 with respect to an AI device or computing device housing 208 may also be utilized in the AI device housings 108 (in FIG. 1) and 308 (in FIG. 3). In other words, each of the devices described in FIGS. 1 and 3 may have the ability to rotate about a base assembly. - Referring back to
FIG. 2, a shading support 205 may comprise one or more support arms. For example, as illustrated in FIG. 2, two support arms may be utilized to connect a shading element or shade 203 to an AI device or computing device housing 208 (although one, three, four, five or six support arms may also be utilized). In embodiments, a motor assembly may cause one or more support arms 205 to move to different positions to protect an AI device or computing device housing 208 from heat, sun, rain, hail, snow and/or other environmental elements. In embodiments, movement of one or more shading supports (or support arms) may tilt a shading element or shade 203 towards a sun, such as illustrated by reference number 291 in FIG. 2. In embodiments, a motor controller may receive commands, instructions, messages or signals requesting movement of a shading element or shade 203 and may generate commands and/or signals to cause a motor to turn, a shaft to rotate, and/or a gearing assembly to turn. In embodiments, a gearing assembly may be attached to a shading support 205 and may cause movement of one or more shading supports 205 which in turn moves and/or rotates a shading element or shade 203. In embodiments, a shading element or shade 203 may be expandable. In embodiments, a shading element or shade 203 may have one length and/or width in one position (e.g., a rest position) and may expand and have a larger length and/or width in other positions (e.g., when deployed and protecting an AI device housing 208 from weather or other environmental conditions). This may be referred to as an expanding shade. -
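The motor-to-gearing chains described above (e.g., motor 610 driving gearing assemblies 613 and 614 in FIG. 6) reduce to a simple ratio calculation: housing rotation is motor rotation scaled by the tooth ratio of the meshing gears. The tooth counts and function name below are assumptions for illustration:

```python
# Hedged sketch of a gear-driven rotation chain: a motor shaft turns a drive
# gear, which turns a driven gear attached to the housing. Tooth counts are
# assumptions, not values from the patent.

def housing_rotation_deg(motor_turns, drive_teeth, driven_teeth):
    """Degrees the housing rotates for a given number of motor shaft turns."""
    return motor_turns * (drive_teeth / driven_teeth) * 360.0
```

For example, with a 10-tooth drive gear meshing with a 40-tooth driven gear, two motor turns rotate the housing a quarter of that, i.e. 180 degrees, which also quadruples the available torque.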
FIG. 3 illustrates an apparatus including an AI device or computing device and shading system with a hinging support assembly according to embodiments. The operation of components (transceivers, cameras, sensors, processors, and/or computer-readable instructions) in AI device housing 308 is similar to that described above with respect to FIGS. 1 and 2. FIG. 3's AI device/computing device and shading system 300 comprises a two-hinge shading support 305. In embodiments, a two-hinge shading support 305 may comprise a first shading support 391, a hinging assembly 392 and a second shading support 393. In embodiments, a first shading support 391 may rotate with respect to an AI device housing 308 as is illustrated by reference number 394 (and thus the shading element or shade 303, the hinging assembly 392 and the second shading support 393 may also rotate with respect to the AI device housing). In embodiments, a second shading support 393 may rotate about a first shading support 391 utilizing a hinging assembly 392. In embodiments, a rotation of a second shading support 393 about a first shading support using a hinging assembly 392 is illustrated by reference number 395. In embodiments, the description above and the components and assemblies described therein may be utilized in the hinging assembly 252 illustrated in FIG. 2. In other words, similar components may be utilized in the hinging assembly 252 of FIG. 2. - In embodiments, a first motor assembly comprises a first motor shaft that may rotate in response to activation and/or utilization of a first motor. In embodiments, a first motor shaft may be mechanically coupled (e.g., via a gearing system, a friction-based system, etc.) to a force transfer shaft. In embodiments, a first motor shaft may rotate in a clockwise and/or counterclockwise direction and, in response, a force transfer shaft may rotate in a same and/or opposite direction.
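The two-hinge geometry above (a first shading support, a hinging assembly, and a second shading support) is effectively a two-link arm, so the shade position follows from simple forward kinematics. The link lengths and angle conventions below are assumptions for illustration:

```python
# Illustrative two-link forward kinematics for the two-hinge shading support:
# theta1 is the first support's angle from horizontal at the housing, theta2 is
# the second support's angle at the hinge relative to the first. Lengths and
# angle conventions are assumptions.

import math

def shade_position(l1, l2, theta1_deg, theta2_deg):
    """(x, y) of the shade end, measured from the AI device housing."""
    t1 = math.radians(theta1_deg)
    t2 = math.radians(theta1_deg + theta2_deg)
    return (l1 * math.cos(t1) + l2 * math.cos(t2),
            l1 * math.sin(t1) + l2 * math.sin(t2))
```

With both links one unit long, a vertical first support (90 degrees) and a hinge folded 90 degrees back toward horizontal place the shade one unit up and one unit out from the housing.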
In embodiments, a force transfer shaft may be mechanically coupled to a receptacle in an AI device housing. In response to, or due to, rotation of a force transfer shaft in a receptacle in an AI device or computing device housing 308, a first support assembly 391 (and thus a shade element or
shade 303 plus a hinging assembly 392 and a second support assembly 393) may rotate with respect to the AI device or computing device housing 308. In embodiments, a first motor may be coupled to a gearbox assembly. In embodiments, a gearbox assembly may comprise a planetary gearbox assembly. A planetary gearbox assembly may comprise a central sun gear, a planet carrier with one or more planet gears and an annulus (or outer ring). In embodiments, planet gears may mesh with a sun gear while an outer ring's teeth may mesh with planet gears. In embodiments, a planetary gearbox assembly may comprise a sun gear as an input, an annulus as an output and a planet carrier (one or more planet gears) remaining stationary. In embodiments, an input shaft may rotate a sun gear, planet gears may rotate on their own axes, and may simultaneously apply a torque to a rotating planet carrier that applies torque to an output shaft (which in this case is the annulus). In embodiments, a planetary gearbox assembly and a first motor may be connected and/or adhered to a first support assembly 391 although resident within the AI device housing. In embodiments, a motor and gearbox assembly may be resident within an AI device or computing device housing 208. In embodiments, an output shaft from a gearbox assembly may be connected to an AI device housing (e.g., an opening of an AI device housing) and/or a first support assembly 391. In embodiments, because an AI device or computing device housing 308 is stationary, torque on an output shaft of a gearbox assembly may be initiated by a first motor to cause a first support assembly 391 (and thus a shade element or shade 303) to rotate. In embodiments, other gearbox assemblies and/or hinging assemblies may also be utilized to utilize an output of a motor to cause a first support assembly 391 (and hence a shade element or shade 303) to rotate with respect to an AI or computing device housing 308.
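For the planetary arrangement described above (sun gear as input, annulus as output, planet carrier held stationary), the standard gearing identity gives the output speed directly: the annulus turns opposite the sun gear at the ratio of sun teeth to annulus teeth. The tooth counts in this sketch are assumptions:

```python
# Standard planetary gear relationship for the input/output arrangement in the
# text: sun gear input, annulus output, carrier fixed. The negative sign
# reflects the reversed direction of the annulus. Tooth counts are assumptions.

def annulus_speed_rpm(sun_speed_rpm, sun_teeth, annulus_teeth):
    """Output (annulus) speed when the sun gear is the input and the carrier is fixed."""
    return -sun_speed_rpm * sun_teeth / annulus_teeth
```

A 20-tooth sun driving an 80-tooth annulus thus yields a 4:1 speed reduction (and a corresponding torque increase) at the output shaft that rotates the first support assembly.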
In embodiments, a first motor may comprise a pneumatic motor, a servo motor and/or a stepper motor. Although the rotation of a support assembly with respect to a computing or AI device housing is described above with respect to FIG. 3, similar or the same components and assemblies (e.g., gearbox assemblies described above) may be present in the device of FIG. 1 to allow rotation of a support assembly 105 with respect to an AI or computing device housing 108. - In embodiments, a
first support assembly 391 may be coupled and/or connected to a second support assembly 393 via a hinging assembly. In embodiments, a shading support 305 may comprise a first support assembly 391, a second gearbox assembly (or a linear actuator or hinging assembly) 392, a second support assembly 393, a second motor, and/or a second motor controller. In embodiments, a second motor assembly may comprise a second motor controller, a second motor, and possibly a second gearbox assembly or linear actuator. In embodiments, a shading support 305 may also comprise a motor control, which may have a second motor controller mounted and/or installed thereon. In embodiments, a second support assembly 393 may be coupled or connected to a first support assembly 391 via a hinging assembly 392 (e.g., a second gearbox assembly). In embodiments, a second gearbox assembly, and a second motor connected thereto, may be connected to a first support assembly 391. In embodiments, an output shaft of a second gearbox assembly may be connected to a second support assembly 393. In embodiments, as a second motor operates and/or rotates, a second gearbox assembly rotates an output shaft, which causes a second support assembly 393 to rotate (either upwards or downwards) at a right angle from, or with respect to, a first support assembly 391. In embodiments utilizing a linear actuator as a hinging assembly 392, a steel rod may be coupled to a second support assembly 393 and/or a first support assembly 391, which allows a free hinging between a second support assembly 393 and a first support assembly 391. In embodiments, a linear actuator may be coupled, connected, and/or attached to a second support assembly 393 and/or a first support assembly 391. In embodiments, as a second motor operates and/or rotates a steel rod, a second support assembly 393 moves in an upward or downward direction with respect to a hinged connection (or hinging assembly) 392. - In embodiments, a
first support assembly 391 may comprise an elevation motor, an elevation motor shaft, a worm gear, and/or a speed reducing gear. In embodiments, a speed reducing gear may be connected with a connector to a connection plate. In embodiments, a first support assembly 391 may be mechanically coupled to a second support assembly 393 via a connection plate. In embodiments, a connection plate may be connected to a second support assembly 393 via a connector and/or fastener. In embodiments, an elevation motor may cause rotation (e.g., clockwise or counterclockwise) of an elevation motor shaft, which may be mechanically coupled to a worm gear. In embodiments, rotation of an elevation motor shaft may cause rotation (e.g., clockwise or counterclockwise) of a worm gear. In embodiments, a worm gear may be mechanically coupled to a speed reducing gear. In embodiments, rotation of a worm gear may cause rotation of a speed reducing gear via engagement of channels of a worm gear with teeth of a speed reducing gear. In embodiments, a speed reducing gear may be mechanically coupled, via a connection plate, to a second support assembly by a fastener or connector. In embodiments, rotation of a speed reducing gear may cause a connection plate (and/or a second support assembly 393) to rotate with respect to a first support assembly 391 in a clockwise or counterclockwise direction, as is illustrated by reference number 395. In embodiments, a second support assembly 393 may rotate with respect to a first support assembly 391 approximately 90 degrees via movement of the connection plate. In embodiments, a second support assembly 393 may rotate approximately 0 to 30 degrees with respect to a first support assembly 391 via movement of the connection plate. -
FIG. 7A illustrates an AI Device with Shading System with a movable base assembly according to embodiments. In embodiments, an AI device and shading system (intelligent shading system) 700 may comprise a movable base assembly 710, an AI device or computing device housing 708, a support assembly 730 and/or a shading element or shade 740. In embodiments, a movable base assembly 710 may be integrated as part of an AI device or computing device housing 708. In embodiments, as described in FIGS. 2 and 6, an AI device or computing device housing 708 may rotate about a movable base assembly 710. In embodiments, a movable base assembly 710 may comprise a base motor controller 715, a base motor 716, a drive assembly 717 and/or one or more wheels (or base driving assemblies) 718. In embodiments, a base assembly or movable base assembly 710 may comprise one or more environmental sensors 721 and/or one or more directional sensors 722. In embodiments, a base assembly 710 may also comprise one or more proximity sensors 719. In embodiments, a base assembly or movable base assembly 710 may comprise one or more processors or controllers 711, one or more memory modules or memories 712 and/or computer-readable instructions 713, where the computer-readable instructions are fetched, read and/or accessed from the one or more memory modules or memories 712 and executed by the one or more processors or controllers 711 to perform a number of functions. In embodiments, a base assembly or movable base assembly 710 may comprise one or more separate wireless transceivers 714. In embodiments, a base assembly or movable base assembly 710 may comprise one or more cameras 726.
In embodiments, the one or more cameras 726, one or more wireless transceivers 714, one or more memory modules or memories 712, one or more proximity sensors 719, one or more direction sensors 722 and/or one or more environmental sensors 721 may be in addition to similar or same devices located on the AI device housing 708, the support assembly 205 and/or shade or shading element 203. In embodiments, operation and/or utilization of these sensors and/or devices is similar to that described with respect to FIGS. 1, 2 and 5. - In embodiments, a base assembly or
movable base assembly 710 may move around a surface (e.g., a ground surface, a floor, a patio, a deck, and/or outdoor surface) based at least in part on environmental conditions. In embodiments, a base assembly or movable base assembly 710 may move based on pre-programmed settings or instructions stored in one or more memories 712 of a base assembly 710 and/or one or more memories 228 of an AI device or computing device housing 708. In embodiments, a base assembly 710 may move around a surface in response to commands, instructions, messages or signals communicated from portable computing devices (e.g., mobile phone, smart phone, laptops, mobile communication devices, mobile computing devices and/or tablets). In embodiments, a base assembly or a movable base assembly 710 may move around a surface in response to voice commands. In embodiments, for example, a base assembly or movable base assembly 710 may move to track environmental conditions (e.g., the sun, wind conditions, humidity conditions, temperature conditions) and/or may move in response to an individual's commands. In embodiments, a base assembly or movable base assembly 710 may move around a surface based at least in part on (or in response to) sensor readings. In embodiments, a base assembly 710 may move around a surface based at least in part on images captured and received by cameras located on a base assembly 710, a shading system 700, a portable computing device and/or a server (or computing device) 729. - In embodiments, computer-
readable instructions 713 stored in one or more memories 712 of a base assembly or movable base assembly 710 may be executed by one or more processors 711 and may cause movement of the base assembly based on or according to pre-specified conditions and/or pre-programmed instructions. In embodiments, for example, a base assembly 710 of an AI Device and Shading System 700 may move to specified coordinates at a specific time based on computer-readable instructions 713 stored in one or more memories 712. For example, a base assembly 710 may move 10 feet to the east and 15 feet to the north at 8:00 am based on stored computer-readable instructions 713. Similarly, for example, a base assembly 710 may move to specified coordinates or locations at a specific time based on computer-readable instructions 240 stored in one or more memories 228 of an AI Device or computing device housing 708. - In embodiments, for example, a
base assembly 710 may move to specified coordinates and/or a location based upon other conditions (e.g., specific days, temperatures, humidity, latitude and longitude, and other devices being in proximity) that may match conditions, or be predicted from conditions, stored in the computer-readable instructions 713 stored in the one or more memories 712 of a base assembly. For example, a base assembly 710 may move if it is 9:00 pm and/or if it is a Saturday. Similarly, the computer-readable instructions 240 may be stored in one or more memories 228 of an AI Device or computing device housing, and instructions, commands and/or messages may be communicated to a motor controller 715 in a movable base assembly 710. - In embodiments, a motor controller and/or a
processor 227 in an AI device or computing device housing 708 may communicate instructions, commands, signals and/or messages related to or corresponding to base assembly 710 movement directly to a base motor controller 715, and/or indirectly through a processor or controller 711 to a base motor controller 715. For example, a motor controller and/or processor 227 in an AI device or computing device housing may communicate instructions and/or messages to a base motor controller 715, which may result in a base assembly 710 moving 20 feet sideways. In embodiments, communications may pass through a transceiver 714 to a base motor controller 715. In embodiments, communications may pass through a base assembly controller or processor 711 to a base motor controller 715. In embodiments, computer-readable instructions stored on one or more memory modules or memories 228 of an AI device or computing device housing 208 may cause a processor 227 in an AI device or computing device housing 208 to receive one or more measurements from one or more sensors (including wind, temperature, humidity, air quality, and/or directional sensors (GPS and/or digital compass)) in an AI device or computing device housing 208, one or more shading supports 205, and/or one or more shading elements 203; analyze the one or more received measurements; generate commands, instructions, signals and/or messages; and communicate such commands, instructions, signals and/or messages to a base assembly 710 to cause a base assembly 710 to move. For example, based on wind sensor or temperature sensor measurements, computer-readable instructions executed by a processor 227 of an AI device or computing device housing 708 may communicate messages to a base motor controller 715 in a base assembly 710 to cause the base assembly 710 to move away from a detected wind direction and/or condition.
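The wind-driven behavior just described (receive a measurement, analyze it, generate a movement message for the base motor controller) can be sketched as follows. The wind threshold, retreat distance, and function name are illustrative assumptions, not values from the disclosure.

```python
WIND_LIMIT_MPH = 25.0   # assumed threshold, not specified in the disclosure

def wind_retreat_command(wind_mph, wind_from_deg):
    """Analyze a wind sensor measurement and, if the limit is exceeded,
    generate a movement command pointing away from the wind direction."""
    if wind_mph <= WIND_LIMIT_MPH:
        return None                          # no movement message needed
    away_deg = (wind_from_deg + 180) % 360   # head directly downwind
    return {"direction_deg": away_deg, "distance_ft": 10.0}

print(wind_retreat_command(32.0, 270))  # {'direction_deg': 90, 'distance_ft': 10.0}
print(wind_retreat_command(5.0, 270))   # None
```

The returned dictionary stands in for the commands, instructions, signals and/or messages communicated to a base motor controller 715.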
For example, based on received solar power measurements (from one or more solar panel assemblies) and/or a directional sensor reading (e.g., a digital compass reading or GPS reading), a processor 227 executing computer-readable instructions in an AI device housing 208 may communicate messages and/or instructions to a base motor controller 715 to cause a base assembly 710 to automatically move in a direction where solar panels may capture more solar power. This provides an advantage: not only can an AI device with a shading system rotate towards a light source (e.g., via a motor assembly in an AI Device or computing device Housing 208), but the entire AI device with shading system also has the ability to move to an area where no obstacles, impediments or unfavorable conditions are present, because the base assembly 710 is movable from one location to another. - In embodiments, a portable or mobile computing device 723 (e.g., smart phone, mobile communications device, a laptop, and/or a tablet) and/or a
computing device 729 may transmit commands, instructions, messages and/or signals to a base assembly 710 identifying desired movements of a base assembly 710. In embodiments, a portable or mobile computing device 723 and/or a computing device 729 may comprise computer-readable instructions stored in a memory of a portable computing device 723 or computing device 729 and executed by a processor (e.g., SMARTSHADE software) that communicates with an AI Device with Shading System 700 as is described supra herein. In embodiments, computer-readable instructions executed by a processor of a mobile computing device 723 may be part of a client-server software application that also has computer-readable instructions stored on a server and executed by a processor of a server (e.g., computing device 729). In embodiments, computer-readable instructions executed by a processor of a mobile computing device 723 may be part of a client-server software application that also has computer-readable instructions 240 stored on a memory 228 and executed by a processor 227 of an AI device housing 208 of an AI device and shading system 700. In embodiments, computer-readable instructions executed by a processor of a mobile computing device 723 may communicate instructions, commands and/or messages directly to a base assembly 710 via a wireless transceiver (e.g., a wireless transceiver 724 on a mobile computing device 723 may communicate commands and/or messages to a transceiver 714 on a base assembly 710). - In embodiments, voice commands may be converted on a
mobile computing device 723, and instructions and/or messages based at least in part on the voice commands may be transmitted (e.g., via a wireless transceiver 724) to a base assembly motor controller 715 directly (e.g., through a wireless transceiver 714), or indirectly via a wireless transceiver 714 and/or a base assembly processor 711, to automatically move a base assembly 710 in a specified direction and/or distance or to specified coordinates. In embodiments, a mobile computing device 723 may communicate instructions, messages and/or signals corresponding to voice commands and/or audio files to a base assembly motor controller 715 directly, or indirectly as described above. In embodiments, where audio files are received, computer-readable instructions 713 stored in a base assembly memory 712 may be executed by a base assembly processor 711 to convert the voice commands into instructions, signals and/or messages recognizable by a base assembly motor controller 715. Similarly, if audio files are received by a processor 227 in an AI Device housing 208, computer-readable instructions 240 stored in a memory 228 may be executed by an AI device housing processor 227 to convert voice commands into instructions, signals and/or messages recognizable by a base assembly motor controller 715. In embodiments, computer-readable instructions executed by a processor on a mobile computing device 723 may present a graphical representation of a base assembly 710 on a mobile computing device display.
In embodiments, a mobile computing device 723 may receive commands via a user interface from a user representing directions and/or a distance to move a base assembly (e.g., a user may select a graphic representation of a base assembly on a display of a mobile computing device and indicate that it should move to a left or east direction approximately 15 feet), and computer-readable instructions executed by a processor of a mobile computing device 723 may communicate commands, instructions and/or messages representative of base assembly movement directions and/or distance directly and/or indirectly to a base assembly motor controller 715 to cause movement of a base assembly 710 in the selected direction and/or distance. Similarly, the mobile computing device 723 may communicate commands, instructions and/or messages to a processor in an AI device housing 208, which in turn will communicate commands, instructions and/or messages to a base assembly motor controller 715 to cause movement of a base assembly 710 in the selected direction and/or distance. This feature may provide an advantage of independently moving a base assembly 710 (and thus an AI device or computing device and shading system) from a remote location, without having to be next to or in proximity to a base assembly. - In embodiments, a
transceiver 714 and/or a transceiver may be a WiFi transceiver (e.g., an 802.11 transceiver), a cellular transceiver, and/or a personal area network transceiver (e.g., Bluetooth, Zigbee transceiver), so that a mobile computing device 723 (and its wireless transceiver 724) may communicate with a base assembly 710 via a number of ways and/or protocols. In embodiments, a mobile computing device 723 may utilize an external server (e.g., a computing device or server computer 729) to communicate with a base assembly 710. -
FIG. 7B is a flowchart illustrating base assembly movement according to voice commands according to embodiments. In embodiments, a base assembly 710 may move in response to voice commands. In embodiments, voice-recognition software (e.g., computer-readable instructions) may be stored in a memory 712 of a base assembly and executed by a base assembly processor 711 to convert 771 actual voice commands (spoken by an operator) or received voice audio files into messages, instructions and/or signals, which can then be communicated 772 to a base motor controller 715. In embodiments, a base motor controller 715 may generate commands or messages and communicate 773 commands or messages to cause a base assembly 710 to move in a direction and/or distance based at least in part on received voice commands and/or audio files. In embodiments, a voice recognition application programming interface (API) may be stored in a memory 712 of a base assembly 710. In embodiments, a voice recognition API may be executed by a processor 711, and voice commands and/or voice audio files from a base assembly may be communicated 774 to an external server (e.g., via a wireless transceiver 714) or other network interface. In embodiments, voice recognition software may be present or installed on an external server (e.g., computing device 729) and may process 775 the received voice commands and/or voice audio files and convert the processed voice files into instructions and/or messages, which may then be communicated 776 back to a base assembly 710. In embodiments, the communicated instructions, commands and/or messages from an external voice recognition server (e.g., computing device 729) may be received at a base assembly 710 and transferred and/or communicated 777 (e.g., via a transceiver 714 and/or a processor 711) to a base motor controller 715 to cause a base assembly 710 to move in directions and/or distances based at least in part on the received voice commands.
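The FIG. 7B flow can be sketched as a local conversion with an external fallback: convert the phrase on the base assembly if it is recognized (771-773), otherwise hand it to an external recognizer (774-777). The phrase table and the `external_recognize` stub are assumptions standing in for real voice-recognition software.

```python
# Assumed local phrase table mapping recognized commands to (direction, feet).
LOCAL_PHRASES = {
    "move forward": ("forward", 5.0),
    "move back": ("backward", 5.0),
}

def external_recognize(phrase):
    """Stand-in for the server-side voice recognition (computing device 729)."""
    words = phrase.split()
    return (words[1], float(words[2])) if words[0] == "move" else None

def voice_to_motor_command(phrase):
    command = LOCAL_PHRASES.get(phrase)          # 771: local conversion
    if command is None:
        command = external_recognize(phrase)     # 774-776: external server
    return command                               # 772/777: to motor controller 715

print(voice_to_motor_command("move forward"))    # ('forward', 5.0)
print(voice_to_motor_command("move east 12"))    # ('east', 12.0)
```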
Similarly, voice recognition of received voice commands and/or audio files, as discussed above, may be performed at an AI device or computing device housing 208 (e.g., utilizing computer-readable instructions 240 stored in memories 228) and/or at a mobile computing device 723 (e.g., utilizing computer-readable instructions stored in memories of a mobile computing device 723), or a combination thereof, and converted instructions, commands and/or messages may be communicated to a base motor controller 715 to cause movement of a base assembly in specified directions and/or distances. The ability of a base assembly 710 to move in response to voice commands gives a shading system the advantage of moving quickly (and being communicated with via a variety of interfaces) with specific and customizable instructions, without having a user physically exert themselves to move an umbrella and/or shading system to a proper and/or desired position. -
FIG. 7C illustrates movement of a base assembly according to sensor measurements according to embodiments. In embodiments, a base assembly 710 may comprise one or more sensors (e.g., environmental sensors 721 (wind, temperature, humidity and/or air quality sensors); direction sensors 722 (e.g., compass and/or GPS sensors); and/or proximity sensors 719). In embodiments, in addition or as an alternative, an AI device or computing device housing 208 may comprise one or more environmental sensors, directional sensors and/or proximity sensors mounted thereon and/or installed therein. In embodiments, in addition or as an alternative, an external hardware device (e.g., a portable computing device 723) or other computing devices (e.g., that are part of home security and/or office building computing systems, or computing device 729) may comprise directional sensors, proximity sensors, and/or environmental sensors that communicate with an AI device or computing device and shading system 700 and/or a base assembly 710. In embodiments, sensors 722 located within a base assembly 710 may capture 781 measurements of environmental conditions and/or location information adjacent to and/or surrounding the base assembly 710. In embodiments, one or more sensors 722 may communicate 782 sensor measurements to a processor and/or controller 711. In embodiments, computer-readable instructions 713 stored in a memory 712 of a base assembly may be executed by a processor and/or controller 711 and may analyze 783 sensor measurements. In embodiments, based on the analysis of sensor measurements, computer-readable instructions 713 may generate 784 movement direction values and distance values and/or instructions for a base assembly 710.
In embodiments, computer-readable instructions executed by a processor 711 may communicate 785 the generated direction values and/or distance values and/or instructions to a base assembly motor controller 715, which generates messages, commands, and/or signals to cause 786 a drive assembly (e.g., a motor, shaft and/or wheels, or a motor, shaft and/or treads) to move a base assembly 710 based at least in part on the generated direction values and/or distance values and/or instructions. - In embodiments, environmental sensors and/or directional sensors may be located on an AI device or
computing device housing 208, external hardware devices (e.g., portable computing device 723) and/or external computing devices (e.g., computing device or server 729). In embodiments, intelligent shading system sensors and external device sensors may capture 787 environmental measurements (e.g., wind, temperature, humidity, air quality) and/or location measurements (e.g., latitude and/or longitude; headings, altitudes, etc.) and may communicate captured measurements or values to processors and/or controllers in respective devices (e.g., AI device or computing device housing 208, portable computing device 723 or external computing devices 729). In embodiments, computer-readable instructions executed by processors and/or controllers of an AI device housing 208, portable computing device 723 and/or external computing device 729 may analyze sensor measurements and generate movement values or instructions (e.g., direction values and/or distance values), and/or may communicate 788 sensor measurements (or generated movement values or instructions) to a base assembly 710, utilizing transceivers in intelligent shading systems, portable computing devices (e.g., transceiver 724) and/or external computing devices (e.g., computing device 729) and one or more base assembly transceivers 714. In other words, sensor measurements, analyzed sensor measurements and/or movement instructions may be communicated to a base assembly 710. In embodiments, some or all of steps 783-786 may be repeated for the sensor measurements and/or movement instructions received from AI device housing sensors, external hardware device sensors, portable computing device sensors and/or external computing device sensors, which results in movement of a base assembly 710 based on the received sensor measurements or instructions. -
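Steps 781-784 of the FIG. 7C flow can be sketched, for example, with solar-power readings sampled at several compass headings (per the solar-seeking behavior described earlier). The sample values, the 5-watt uniformity threshold, and the function name are illustrative assumptions.

```python
def generate_movement(samples):
    """781-783: captured solar-power samples (watts), keyed by compass heading,
    are analyzed; 784: a direction/distance value is generated toward the
    heading with the best capture, or None if the light is roughly uniform."""
    best = max(samples, key=samples.get)
    if samples[best] - min(samples.values()) < 5.0:
        return None                     # roughly uniform light: stay put
    return {"direction": best, "distance_ft": 5.0}

readings = {"N": 42.0, "E": 55.5, "S": 81.3, "W": 60.2}
print(generate_movement(readings))  # {'direction': 'S', 'distance_ft': 5.0}
```

The returned value corresponds to the direction and distance values communicated 785 to a base assembly motor controller 715.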
FIG. 7D illustrates movement of a base assembly utilizing a camera and/or pattern recognition and/or image processing according to embodiments. In embodiments, a base assembly or movable base assembly 710 may comprise one or more cameras 726 and may utilize pattern recognition and/or image processing to identify potential base movement. In embodiments, in addition or as an alternative, an AI device or computing device and shading system 700 may comprise one or more cameras 739 located thereon and/or within, and may communicate images, video and/or sound with a base assembly 710. In embodiments, in addition or as an alternative, an external hardware device (e.g., a portable computing device 723) or other computing devices 729 (e.g., that are part of home security and/or office building computing systems) may comprise one or more cameras that communicate images, videos and/or sounds/audio to an AI device or computing device and shading system 700 and/or a base assembly 710. - In embodiments, one or
more cameras 726 located within a base assembly 710, one or more cameras 126 in an AI device and shading system, a portable computing device 723 and/or a remote computing or hardware device (e.g., computing device 729) may capture 791 images, videos and/or sounds adjacent to and/or surrounding a base assembly 710 and/or an AI device or computing device housing or body 207. In embodiments, one or more cameras 726 in a base assembly 710, one or more cameras in an AI device and shading system, one or more cameras in a portable computing device 723 and/or a remote computing device (e.g., computing device 729) may communicate 792 captured images to a processor and/or controller 711 in a base assembly 710. In embodiments, computer-readable instructions 713 stored in a memory 712 of a base assembly 710 may be executed by a processor and/or controller 711 and may analyze 793 captured images to determine if any patterns and/or conditions are recognized as requiring movement of an AI device or computing device and shading system 700 via movement of a base assembly 710. In embodiments, based on the analysis and/or pattern recognition of captured images, video and/or sounds, computer-readable instructions 713 may generate 794 movement direction values and/or distance values and/or instructions for a base assembly 710. In embodiments, computer-readable instructions executed by a processor 711 may communicate 795 generated direction values and/or distance values and/or instructions to a base assembly motor controller 715, which generates messages, commands, and/or signals to cause and/or activate 796 a drive assembly (e.g., a motor, shaft and/or wheels, or a motor, shaft and/or treads) to move a base assembly 710 based at least in part on the generated direction values and/or distance values.
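A toy sketch of the image-analysis step 793-794: a captured frame (here a small grayscale grid) is scanned for a dark obstruction, and if one side is heavily shaded the base is directed toward the brighter side. The grid values, threshold, and function name are illustrative assumptions, not the disclosure's image-processing method.

```python
def suggest_move(image):
    """Compare total brightness of the left and right halves of a grayscale
    frame (lists of pixel rows, 0=dark, 255=bright) and suggest a direction."""
    mid = len(image[0]) // 2
    left = sum(px for row in image for px in row[:mid])
    right = sum(px for row in image for px in row[mid:])
    if abs(left - right) < 50:
        return None                       # scene roughly uniform: stay put
    return "right" if right > left else "left"

frame = [
    [10, 12, 200, 210],   # dark obstruction on the left, open sky on the right
    [ 8, 15, 205, 198],
]
print(suggest_move(frame))  # right
```

Real pattern recognition would of course be far richer; the point is only that analysis 793 reduces pixels to the direction values communicated 795 to a motor controller 715.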
In embodiments, computer-readable instructions executed by a processor of an AI device and shading system, a portable computing device 723 and/or a computing device 729 may receive images, videos and/or sounds from cameras on a base assembly 710, an AI device or computing device and shading system 700, a portable computing device 723 and/or a computing device 729, analyze the received images, videos and/or sounds, and may generate 797 direction values and/or distance values or instructions for base assembly movement. In other words, image recognition or pattern recognition may be performed at any of the discussed assemblies or computing devices (e.g., base assembly 710, portable computing device 723, external computing device 729 and/or AI device or computing device and shading system 700). In embodiments, computer-readable instructions executed by processors of an AI device or computing device and shading system 700, a mobile computing device 723 and/or a computing device 729 may communicate 798 base assembly direction values and distance values to a base assembly 710 via a transceiver. - In embodiments, a base assembly processor/
controller 711 may receive generated direction values and/or distance values and/or instructions and may generate messages, commands, and/or signals to cause 796 a drive assembly (e.g., a motor, shaft and/or wheels, or a motor, shaft and/or treads) to move a base assembly 710 based at least in part on the generated direction values and/or distance values and/or instructions. - In embodiments, one or
more sensors of a base assembly 710 may generate sensor readings or measurements. In embodiments, a controller or processor and/or a transceiver 714 may communicate commands, instructions, signals and/or messages to a base motor controller 715 to identify movements and/or directions for a base assembly 710. In response, a shading system controller may send commands, instructions, and/or signals to a base assembly 710 identifying desired movements of a base assembly. - In embodiments, a
base assembly 710 may comprise a processor/controller 711, a motor controller 715, a motor 716 and/or a drive assembly 717, which physically move a base assembly 710. As described above, many different components, systems and/or assemblies may communicate instructions, commands, messages and/or signals to a processor 711 and/or a base assembly motor controller 715. In embodiments, the instructions, commands, messages and/or signals may correspond to, be related to and/or be indicative of direction values and/or distance values that a base assembly 710 may and/or should move. In embodiments, a base motor controller 715 may receive direction values and distance values or instructions and convert these into signals, commands and/or messages for a motor and/or turbine 716. In embodiments, a motor and/or turbine 716 may be coupled, attached and/or connected to a driving assembly 717. In embodiments, a driving assembly 717 may drive a base assembly 710 to a location based at least in part on direction values and/or distance values. In embodiments, a driving assembly 717 may comprise one or more shafts, one or more axles and/or one or more wheels 718. In embodiments, a motor 716 generates signals to cause shafts to rotate, axles to rotate, and/or wheels to spin and/or rotate, which causes a base assembly 710 to move. In embodiments, a driving assembly 717 may comprise one or more shafts, one or more conveying devices and one or more treads (e.g., tread assemblies). In embodiments, a motor 716 may generate signals, messages and/or commands to cause one or more shafts to rotate, which may cause one or more conveying devices to rotate, which in turn causes treads (and/or tread assemblies) to rotate and travel about a conveying device, where the one or more treads (and/or tread assemblies) cause a base assembly 710 to move. - In embodiments, a motor and drive assembly may be replaced by an air exhaust system and air exhaust vents. In embodiments, a
motor controller 715 may be replaced by an exhaust system controller. In embodiments, an exhaust system controller may receive instructions, commands, messages and/or signals from a controller identifying movement distances and directional measurements for a base assembly 710. In embodiments, an exhaust system controller may convert the commands, messages and/or signals into signals and/or commands understandable by exhaust system components. In embodiments, an exhaust system (or exhaust system components) may control operation of air exhaust vents on a base assembly 710 in order to move a base assembly a desired direction and/or distance. In embodiments, a base assembly 710 may hover and/or glide over a surface when being moved by operation of exhaust vents. - In embodiments, a SMARTSHADE and/or SHADECRAFT application or a desktop computer application may transmit commands, instructions, and/or signals to a
base assembly 710 identifying desired movements of a base assembly 710. In embodiments, a base motor controller 715 may receive commands, instructions, and/or signals and may communicate commands and/or signals to a base motor 716. In embodiments, a base motor 716 may receive commands and/or signals, which may result in rotation of a motor shaft. In embodiments, a motor shaft may be connected, coupled, or indirectly coupled (through gearing assemblies or other similar assemblies) to one or more drive assemblies. In embodiments, a drive assembly may be one or more axles, where one or more axles may be connected to wheels. In embodiments, for example, a base assembly may receive commands, instructions and/or signals to rotate in a counterclockwise direction approximately 15 degrees. In embodiments, for example, a motor output shaft would rotate one or more drive assemblies to rotate a base assembly approximately 15 degrees. In embodiments, a base assembly may comprise more than one motor and/or more than one drive assembly. In this illustrative embodiment, each of the motors may be controlled independently from one another, which may result in a wider range of movements and more complex movements. - A computing device may be a server, a computer, a laptop computer, a mobile computing device, a mobile communications device, and/or a tablet. A computing device may, for example, include a desktop computer or a portable device, such as a cellular telephone, a smart phone, a display pager, a radio frequency (RF) device, an infrared (IR) device, a Personal Digital Assistant (PDA), a handheld computer, a tablet computer, a laptop computer, a set top box, a wearable computer, wearable haptic and touch communication device, a wearable haptic device, a non-wearable computing device having a touch-sensitive display, a remote computing device, a single board computer, and/or an integrated computing device combining various features, such as features of the foregoing devices, or the like.
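- By way of a non-limiting illustration, the approximately 15-degree rotation example above (a base motor controller converting a direction and distance command into motor signals for one or more drive assemblies) may be sketched as follows; the function name, the two-motor differential-drive arrangement, and the steps-per-degree gearing are illustrative assumptions, not details specified by this disclosure.

```python
# Illustrative sketch (assumed names and gearing): a base motor controller
# converting a direction/distance command into signed motor step counts.

STEPS_PER_DEGREE = 4  # assumed gearing ratio: motor steps per degree of base rotation

def rotation_command_to_steps(direction: str, degrees: float) -> dict:
    """Convert a rotation command into step counts for an assumed two-motor drive.

    Rotating a base assembly in place with two drive motors means spinning
    the motors in opposite directions.
    """
    if direction not in ("clockwise", "counterclockwise"):
        raise ValueError("unknown direction: " + direction)
    steps = round(degrees * STEPS_PER_DEGREE)
    sign = 1 if direction == "counterclockwise" else -1
    # Opposite signs rotate the base about its center rather than translating it.
    return {"left_motor": sign * steps, "right_motor": -sign * steps}

# Example: the approximately 15-degree counterclockwise rotation described above.
command = rotation_command_to_steps("counterclockwise", 15)
```

With more than one independently controlled motor, as the paragraph above notes, unequal step counts on each motor would produce the wider range of compound movements described.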
-
FIG. 8 illustrates various components of an example computing device 800 that can be implemented as a mobile computing device, integrated computing device, server, cloud-based server and/or remote computing devices described in FIGS. 1-7 . These devices may include some but not all of the components identified below. The device may be implemented as one or a combination of a fixed or mobile device, in any form of a consumer, computer, portable, user, communication, phone, navigation, gaming, audio, messaging, Web browsing, paging, media playback, and/or other type of computing device. - Electronic or
computing device 800 includes communication transceivers 802 that enable wired and/or wireless communication of device data 804, such as received data over a low power wireless protocol or an Ethernet wired protocol. Other example communication transceivers include NFC transceivers, WPAN radios or transceivers compliant with various IEEE 802.15 (Bluetooth™) standards, WLAN radios or transceivers compliant with any of the various IEEE 802.11 (WiFi™) standards, WWAN (3GPP, 4G or 5G-compliant) radios or transceivers for cellular telephony, wireless metropolitan area network (WMAN) radios or transceivers compliant with various IEEE 802.16 (WiMAX™) standards, and wired local area network (LAN) Ethernet transceivers. -
Electronic device 800 may also include one or more data input ports 806 or interfaces via which any type of data, media content, and/or inputs may be received, such as user-selectable inputs, messages, signals, instructions, music, television content, recorded video content, and any other type of audio, video, and/or image data received from any content and/or data source. Data input ports or interfaces 806 may include USB ports, coaxial cable ports, and other serial or parallel connectors (including internal connectors) for flash memory, SD memory connectors, network (Ethernet) connectors, DVDs, CDs, and the like. These data input ports may be used to couple the electronic device to components, peripherals, or accessories such as keyboards, microphones, flash drives, external hard drives, and/or cameras. - Computing device or
electronic device 800 of this example may include one or more processor systems or processors 808 (e.g., any of application processors, microprocessors, digital-signal-processors, controllers, and the like), or a processor and memory system (e.g., implemented in a SoC), which process (i.e., execute) computer-executable or computer-readable instructions to control operation of the device. Processor system 808 (processor(s) 808) may be implemented as an application processor, embedded controller, single-board computer, microcontroller, and the like. A processing system may be implemented at least partially in hardware, which can include components of an integrated circuit or on-chip system, digital-signal processor (DSP), application-specific integrated circuit (ASIC), field-programmable gate array (FPGA), a complex programmable logic device (CPLD), and other implementations in silicon and/or other hardware. For example, in various embodiments, processors 808 may be general-purpose or embedded processors implementing any of a variety of instruction set architectures (ISAs), such as the x86, PowerPC, SPARC, Pentium, or MIPS ISAs, or any other suitable ISA. In multiprocessor systems, each of processors 808 may commonly, but not necessarily, implement the same ISA. Alternatively or in addition, the electronic device can be implemented with any one or combination of software, hardware, firmware, or fixed logic circuitry that is implemented in connection with processing and control circuits, which are generally identified at 810 (processing and control 810). Although not shown, electronic device or computing device 800 can include a system bus, crossbar, mesh, mesh network, or data transfer system that couples the various components within the device. 
A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a serial bus, other component sub-architectures, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures. - Electronic device or
computing device 800 may also include one or more memory devices 812 that enable data storage, examples of which include random access memory (RAM), non-volatile memory (e.g., read-only memory (ROM), flash memory, EPROM, EEPROM, etc.), and a disk storage device. One or more memory devices 812 may be configured to store instructions and data accessible by processor(s) 808. In embodiments, system memory 812 may be implemented using any suitable memory technology, such as static random access memory (SRAM), dynamic RAM (DRAM), synchronous dynamic RAM (SDRAM), nonvolatile/Flash-type memory, or any other type of memory. One or more memory device(s) 812 provide data storage mechanisms to store the device data 804, other types of information and/or data, and various device applications 814 (e.g., software applications) (which may be implemented in code or computer-executable instructions). For example, operating system software 816 may be maintained as software instructions within one or more memory devices 812 and executed by processors 808. -
Electronic device 800 may also include audio and/or video processing system 818 that processes audio data and/or passes through the audio and video data to audio system 820 and/or to display system 822 (e.g., monitors, displays, screens, wearable computing displays (e.g., on spectacles and/or a display on a wearable computing device), and so on) to output content. Audio system 820 and/or display system 822 may include any devices that process, display, and/or otherwise render audio, video, display, and/or image data. Display data and audio signals can be communicated to an audio component and/or to a display component via an RF (radio frequency) link, S-video link, HDMI (high-definition multimedia interface), composite video link, component video link, DVI (digital video interface), analog audio connection, or other similar communication link. In embodiments, audio system 820 and/or display system 822 may be external components to electronic device or computing device 800 (and/or may be connected or attached directly to electronic devices or computing devices 800). Alternatively or additionally, audio system 820 and/or display system 822 may be an integrated component of the example electronic device or computing device 800, such as part of an integrated touch interface (in the case of a display system 822). In embodiments, electronic device or computing device 800 may further comprise a network interface 850 coupled to I/O interface or input/output port 806 and/or directly to processor 808. - In embodiments, I/O ports or
interface 806 may be configured to coordinate I/O traffic between one or more processors 808, one or more memory devices 812, and any peripheral devices in the device, including network interface 850 or other peripheral interfaces. In some embodiments, I/O interface or port 806 may perform any necessary protocol, timing or other data transformations to convert data signals from one component (e.g., system memory or memory device 812) into a format suitable for use by another component (e.g., processor 808). In embodiments, I/O ports or interfaces 806 may include support for devices attached through various types of peripheral buses, such as a variant of the Peripheral Component Interconnect (PCI) bus standard, a serial component interface, or the Universal Serial Bus (USB) standard, for example. In embodiments, the function of I/O interface 806 may be split into two or more separate components, for example. Also, in some embodiments some or all of the functionality of I/O interface 806, such as an interface to system memory 812, may be incorporated directly into processor 808. - In embodiments,
network interface 850 may be configured to allow data to be exchanged between electronic device or computing device 800 and other devices 860 attached to a network or networks 855, such as other computer devices, remote computing devices, servers, cloud-based devices as illustrated in FIGS. 1 through 7 , for example. In embodiments, network interface 850 may support communication via any suitable wired or wireless general data networks, such as types of Ethernet network, for example. Additionally, network interface 850 may support communication via telecommunications/telephony networks such as analog voice networks or digital fiber communications networks, via storage area networks such as Fibre Channel SANs, or via any other suitable type of network and/or protocol. Further, a computer-accessible medium may include transmission media or signals such as electrical, electromagnetic, or digital signals, conveyed via a communication medium such as a network and/or a wireless link, such as may be implemented via network interface 850. - Memory, in a computing device and/or a modular umbrella shading system, interfaces with a computer bus and/or other communication channels, so as to provide information stored in memory to a processor during execution of software programs such as an operating system, application programs, device drivers, and software modules that comprise program code or logic, and/or computer-executable process steps, incorporating functionality described herein, e.g., one or more of process flows described herein. A CPU first loads and/or accesses computer-executable process or method steps or logic from storage, storage medium/media, removable media drive, and/or other storage device. The CPU can then execute the stored process steps in order to execute the loaded computer-executable process steps. Stored data, e.g., data stored by a storage device, can be accessed by the CPU during the execution of computer-executable process steps.
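- By way of a non-limiting illustration, the data exchange described above (e.g., a computing device 800 communicating a captured audio file to a remote computing device over a network 855, via network interface 850, for voice recognition) may be sketched with a simple length-prefixed message framing; the message layout and field names are illustrative assumptions rather than a protocol defined by this disclosure.

```python
# Illustrative sketch (assumed message layout): frame an audio payload for
# transfer to a remote voice-recognition server, then parse it back.
import struct

def frame_audio_message(device_id: int, audio_bytes: bytes) -> bytes:
    """Prefix the payload with a device id and payload length (big-endian)."""
    return struct.pack(">II", device_id, len(audio_bytes)) + audio_bytes

def parse_audio_message(message: bytes) -> tuple:
    """Recover (device_id, audio_bytes) from a framed message."""
    device_id, length = struct.unpack(">II", message[:8])
    return device_id, message[8:8 + length]

framed = frame_audio_message(800, b"\x01\x02\x03")
recovered = parse_audio_message(framed)
```

In practice such a framed message would be written to a socket obtained through the network interface; the framing simply lets the receiving server know how many payload bytes to expect.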
- Non-volatile storage medium/media is a computer readable storage medium that can be used to store software and data, e.g., an operating system and one or more application programs, in a computing device or one or more memory devices of an intelligent umbrella and/or robotic shading system. Persistent storage medium/media may also be used to store device drivers (such as one or more of a digital camera driver, motor drivers, speaker drivers, scanner driver, or other hardware device drivers), web pages, content files, metadata, playlists, data captured from one or more assemblies or components (e.g., sensors, cameras, motor assemblies, microphones, audio and/or video reproduction systems) and other files. Non-volatile storage medium/media can further include program modules/program logic in accordance with embodiments described herein and data files used to implement one or more embodiments of the present disclosure.
- A computing device or a processor or controller may include or may execute a variety of operating systems, including a personal computer operating system, such as Windows, iOS or Linux, or a mobile operating system, such as iOS, Android, or Windows Mobile, Windows Phone, Google Phone, Amazon Phone, or the like. A computing device, or a processor or controller in an intelligent shading controller, may include or may execute a variety of possible applications, such as software applications enabling communication with other devices, such as communicating one or more messages via email, short message service (SMS), or multimedia message service (MMS), FTP, or other file sharing programs, including via a network, such as a social network, including, for example, Facebook, LinkedIn, Twitter, Flickr, Google+ and/or Instagram, to provide only a few possible examples. A computing device or a processor or controller in an intelligent shading object may also include or execute an application to communicate content, such as, for example, textual content, multimedia content, or the like. A computing device or a processor or controller in an intelligent umbrella or robotic shading system may also include or execute an application to perform a variety of possible tasks, such as browsing, searching, playing various forms of content, including locally stored or streamed content. The foregoing is provided to illustrate that claimed subject matter is intended to include a wide range of possible features or capabilities. A computing device or a processor or controller in an intelligent shading object and/or mobile computing device may also include imaging software applications for capturing, processing, modifying and transmitting image, video and/or sound files utilizing the optical device (e.g., camera, scanner, optical reader) within a mobile computing device and/or an intelligent umbrella or robotic shading system.
- Network link typically provides information communication using transmission media through one or more networks to other devices that use or process the information. For example, network link may provide a connection through a network (LAN, WAN, Internet, packet-based or circuit-switched network) to a server, which may be operated by a third party housing and/or hosting service. For example, the server may be the server described in detail above. The server hosts a process that provides services in response to information received over the network, for example, application, database or storage services. It is contemplated that the components of the system can be deployed in various configurations within other computer systems, e.g., host and server.
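- By way of a non-limiting illustration, the server-side behavior described above (a hosted process that provides a service in response to information received over the network) may be sketched as a request handler that maps recognized speech to command files for the shading apparatus; the command vocabulary and response layout are illustrative assumptions, not defined by this disclosure.

```python
# Illustrative sketch (assumed command vocabulary): a server-side handler that
# maps recognized utterance text to a command structure for the apparatus.
KNOWN_COMMANDS = {
    "open shade": {"assembly": "shading", "action": "open"},
    "close shade": {"assembly": "shading", "action": "close"},
    "rotate base": {"assembly": "base", "action": "rotate"},
}

def handle_recognized_text(text: str) -> dict:
    """Return a command response for recognized text, or an error response."""
    command = KNOWN_COMMANDS.get(text.strip().lower())
    if command is None:
        return {"status": "error", "reason": "unrecognized command"}
    return {"status": "ok", "command": command}

response = handle_recognized_text("Open Shade")
```

A real deployment would perform speech-to-text before this step and return the resulting command file to the apparatus over the network link described above.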
- For the purposes of this disclosure a computer readable medium stores computer data, which data can include computer program code that is executable by a computer, in machine-readable form. By way of example, and not limitation, a computer-readable medium may comprise computer readable storage media, for tangible or fixed storage of data, or communication media for transient interpretation of code-containing signals. Computer readable storage media, as used herein, refers to physical or tangible storage (as opposed to signals) and includes without limitation volatile and non-volatile, removable and non-removable media implemented in any method or technology for the tangible storage of information such as computer-readable instructions, data structures, program modules or other data. Computer readable storage media includes, but is not limited to, DRAM, DDRAM, RAM, ROM, EPROM, EEPROM, flash memory or other solid state memory technology, CD-ROM, DVD, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other physical or material medium which can be used to tangibly store the desired information or data or instructions and which can be accessed by a computer or processor.
- For the purposes of this disclosure a system or module is a software, hardware, or firmware (or combinations thereof), process or functionality, or component thereof, that performs or facilitates the processes, features, and/or functions described herein (with or without human interaction or augmentation). A module can include sub-modules. Software components of a module may be stored on a computer readable medium. Modules may be integral to one or more servers, or be loaded and executed by one or more servers. One or more modules may be grouped into an engine or an application.
- Those skilled in the art will recognize that the methods and systems of the present disclosure may be implemented in many manners and as such are not to be limited by the foregoing exemplary embodiments and examples. In other words, functional elements may be performed by single or multiple components, in various combinations of hardware and software or firmware, and individual functions may be distributed among software applications at either the client or server or both. In this regard, any number of the features of the different embodiments described herein may be combined into single or multiple embodiments, and alternate embodiments having fewer than, or more than, all of the features described herein are possible. Functionality may also be, in whole or in part, distributed among multiple components, in manners now known or to become known. Thus, myriad software/hardware/firmware combinations are possible in achieving the functions, features, interfaces and preferences described herein. Moreover, the scope of the present disclosure covers conventionally known manners for carrying out the described features and functions and interfaces, as well as those variations and modifications that may be made to the hardware or software or firmware components described herein as would be understood by those skilled in the art now and hereafter.
- While certain exemplary techniques have been described and shown herein using various methods and systems, it should be understood by those skilled in the art that various other modifications may be made, and equivalents may be substituted, without departing from claimed subject matter. Additionally, many modifications may be made to adapt a particular situation to the teachings of claimed subject matter without departing from the central concept described herein. Therefore, it is intended that claimed subject matter not be limited to the particular examples disclosed, but that such claimed subject matter may also include all implementations falling within the scope of the appended claims, and equivalents thereof.
Claims (20)
1. An apparatus to provide shade, comprising:
a computing device housing, the computing device housing comprising:
one or more microphones, the one or more microphones to capture audio sounds;
one or more processors;
one or more memory modules; and
computer-readable instructions stored in the one or more memory modules;
a support assembly connected to a top surface of the computing device housing; and
a shading assembly connected to an end of the support assembly, the shading assembly to provide shade from environmental conditions to the computing device housing,
wherein the computer-readable instructions are executable by the one or more processors to convert the captured audio sounds from the one or more microphones to one or more audio files.
2. The apparatus of claim 1 , further comprising one or more wireless transceivers, wherein the computer-readable instructions are executed by the one or more processors to communicate the one or more audio files, via the one or more wireless transceivers, to an external computing device for voice recognition and conversion into command files.
3. The apparatus of claim 2 , wherein the computer-readable instructions are executed by the one or more processors to receive the command files via the one or more wireless transceivers from the external computing device to control operation of one or more assemblies or components of the computing device housing.
4. The apparatus of claim 3 , wherein the one or more assemblies or the components are one or more environmental sensors, and
wherein the computer-readable instructions are executed by the one or more processors to instruct the one or more environmental sensors to capture environmental measurements and communicate the captured environmental measurements.
5. The apparatus of claim 4 , wherein the computer-readable instructions are executed by the one or more processors to analyze the captured environmental measurements, generate commands to control operation of another assembly or component in the computing device housing and to communicate the generated commands to the another assembly or component.
6. The apparatus of claim 3 , wherein the one or more assemblies or the components are one or more directional sensors, wherein the computer-readable instructions are executed by the one or more processors to instruct the one or more directional sensors to capture directional measurements and communicate the captured direction measurements.
7. The apparatus of claim 3 , wherein the one or more assemblies or components are one or more imaging devices, and
wherein the computer-readable instructions are executed by the one or more processors to activate the one or more imaging devices to capture video and/or audio of an environment surrounding the computing device and to communicate the captured video and/or audio to a portable computing device, via one or more of the wireless transceivers.
8. The apparatus of claim 3 , further comprising one or more cameras and one or more motion sensors, the one or more motion sensors to detect movement in an area around the apparatus, to generate and communicate signals to the one or more processors,
wherein the one or more computer-readable instructions are executed by the one or more processors to activate the one or more cameras based, at least in part, on the detected movement.
9. The apparatus of claim 1 , further comprising a rotation assembly, the rotation assembly connected to the computing device housing and the support assembly, the rotation assembly to rotate the support assembly and the shading assembly with respect to the computing device housing.
10. The apparatus of claim 1 , further comprising a first hinging assembly, a second hinging assembly, and an additional support assembly, the additional support assembly connected to the shading assembly and connected to the computing device housing via the second hinging assembly, the support assembly connected to the computing device housing via the first hinging assembly.
11. The apparatus of claim 1 , further comprising a rotation assembly and a hinging assembly, the support assembly further comprising an upper shading section and a lower shading section, the rotation assembly connecting the computing device housing to the lower shading section and the hinging assembly connecting the lower shading section to the upper shading section, the rotation assembly to cause the support assembly and the shading assembly to rotate in an azimuth direction with respect to the computing device housing, the hinging assembly to cause the upper shading section to rotate with respect to the lower shading section.
12. The apparatus of claim 1 , wherein the computer-readable instructions are executed by the one or more processors to receive the one or more audio files, and perform voice recognition on the one or more audio files to generate command files.
13. The apparatus of claim 12 , wherein the computer-readable instructions are further executed by the one or more processors to communicate the command files to the one or more processors to control operation of one or more assemblies or components of the computing device housing.
14. The apparatus of claim 1 , the shading element further comprising one or more photovoltaic cells to convert sunlight into electrical power.
15. The apparatus of claim 1 , the computing device housing further comprising one or more photovoltaic cells to convert sunlight into electrical power.
16. The apparatus of claim 14 , further comprising a charging assembly and a rechargeable power source, the charging assembly to receive the electrical power and to transfer the electrical power to the rechargeable power source, the rechargeable power source to provide voltage or current, or both, to components and/or assemblies in the computing device housing.
17. An apparatus to provide shade, comprising:
a base assembly to contact a surface;
a computing device housing, the computing device housing comprising:
one or more microphones, the one or more microphones to capture audio sounds;
one or more processors;
one or more memory modules; and
computer-readable instructions stored in the one or more memory modules;
one or more support assemblies connected to a top surface of the computing device housing and
a shading element connected to the one or more support assemblies, the shading element to provide protection from environmental conditions to the computing device housing,
a rotation assembly connected to the base assembly and connected to the computing device housing, the rotation assembly to cause rotation of the computing device housing with respect to the base assembly;
wherein the computer-readable instructions are executable by the one or more processors to convert the captured audio sounds from the one or more microphones to one or more audio files.
18. The apparatus of claim 17 , further comprising one or more wireless transceivers, wherein the computer-readable instructions are executed by the one or more processors to communicate the one or more audio files, via the one or more wireless transceivers, to an external computing device for voice recognition and conversion into command files, to receive the converted command files from the external computing device, and wherein the converted command files comprise instructions to cause the rotation assembly to move or rotate the computing device housing with respect to the base assembly.
19. An apparatus to provide shade, comprising:
a computing device housing, the computing device housing comprising:
one or more microphones, the one or more microphones to capture audio sounds;
one or more processors;
one or more memory modules; and
computer-readable instructions stored in the one or more memory modules;
a support assembly connected to a top surface of the computing device housing;
one or more wireless transceivers to communicate with a mobile computing device; and
a shading assembly connected to an end of the support assembly, the shading assembly to provide shade from environmental conditions to the computing device housing,
wherein the computer-readable instructions are executable by the one or more processors to receive audio files captured by one or more microphones of the mobile computing device.
20. The apparatus of claim 19 , wherein the computer-readable instructions are executed by the one or more processors to communicate the one or more audio files, via the one or more wireless transceivers, to an external computing device for voice recognition and conversion into command files.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/823,404 US20180329375A1 (en) | 2017-05-13 | 2017-11-27 | Computing Device or Artificial Intelligence (AI) Device Including Shading Element or Shading System |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201762505910P | 2017-05-13 | 2017-05-13 | |
US15/823,404 US20180329375A1 (en) | 2017-05-13 | 2017-11-27 | Computing Device or Artificial Intelligence (AI) Device Including Shading Element or Shading System |
Publications (1)
Publication Number | Publication Date |
---|---|
US20180329375A1 (en) | 2018-11-15 |
Family
ID=64096088
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/669,964 Active US10488834B2 (en) | 2017-05-13 | 2017-08-07 | Intelligent umbrella or robotic shading system having telephonic communication capabilities |
US15/823,404 Abandoned US20180329375A1 (en) | 2017-05-13 | 2017-11-27 | Computing Device or Artificial Intelligence (AI) Device Including Shading Element or Shading System |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/669,964 Active US10488834B2 (en) | 2017-05-13 | 2017-08-07 | Intelligent umbrella or robotic shading system having telephonic communication capabilities |
Country Status (2)
Country | Link |
---|---|
US (2) | US10488834B2 (en) |
WO (1) | WO2019032475A1 (en) |
Family Cites Families (74)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US138774A (en) | 1873-05-13 | Improvement in tent-frames | ||
US2485118A (en) | 1948-03-29 | 1949-10-18 | Doyle H Simpson | Ventilated metal umbrella |
ES2020210B3 (en) | 1986-03-14 | 1991-08-01 | Roder Gmbh | SELF-MOUNTING STORE. |
US5161561A (en) | 1991-05-30 | 1992-11-10 | Jamieson Bruce W | Outdoor service system |
US5273062A (en) | 1992-12-21 | 1993-12-28 | Peter Mozdzanowski | Umbrella |
US5318055A (en) | 1993-10-04 | 1994-06-07 | Olaniyan Olajide O | Shoulder supported umbrella apparatus |
US6554012B2 (en) | 1999-05-24 | 2003-04-29 | Samuel F. Patarra | Portable cooler apparatus with umbrella mounting means |
US6405742B1 (en) | 1999-07-19 | 2002-06-18 | James J. Driscoll | Portable sun shade |
US6511033B2 (en) | 2000-05-10 | 2003-01-28 | Wanda Ying Li | Rotation locker stand for outdoor umbrellas |
US7753546B2 (en) | 2001-02-07 | 2010-07-13 | World Factory, Inc. | Umbrella apparatus |
US6536721B1 (en) | 2001-05-18 | 2003-03-25 | Boto (Licenses) Limited | Revolving support stand with electrical power outlet |
US20030000557A1 (en) | 2001-06-27 | 2003-01-02 | Jin-Sheng Lai | Structure of lightening umbrella |
US20030000559A1 (en) | 2001-06-29 | 2003-01-02 | Hung-Ming Wu | Parasol with rechargeable battery device |
US6575183B2 (en) | 2001-08-09 | 2003-06-10 | Benson Tung | Tiltable and rotatable canopy frame for a sunshade |
US6837255B2 (en) | 2002-08-13 | 2005-01-04 | Bunch Colette M | Illuminated umbrella assembly having self-contained and replacable lighting |
CA2503528A1 (en) * | 2002-10-31 | 2004-05-13 | Nokia Corporation | A communication apparatus and a method of indicating receipt of an electronic message, and a server, a method and a computer program product for providing a computerized icon ordering service |
US20040103934A1 (en) | 2002-12-01 | 2004-06-03 | Thomas Szumlic | Umbrella and mount assembly for wheelchair |
US6845780B2 (en) | 2002-12-13 | 2005-01-25 | Charles A. Bishirjian | Personal canopy apparatus |
US7407178B2 (en) | 2003-01-09 | 2008-08-05 | Rashell Freedman | Automated canopy positioning system |
US7021598B2 (en) | 2003-02-24 | 2006-04-04 | Boto (Licenses) Limited | Revolving support stand for decorative display |
US6923193B2 (en) | 2003-06-24 | 2005-08-02 | Shiow-Hui Chen | Outdoor used stand frame of an umbrella |
CN2652206Y (en) | 2003-07-24 | 2004-11-03 | 吴伟淡 | Electric sunshade suspending umbrella |
US7003217B2 (en) | 2003-11-19 | 2006-02-21 | Hon Technology Inc. | Infrared heating system for patio umbrella |
EP1731055A1 (en) | 2004-03-11 | 2006-12-13 | Dazalan, S.L. | Folding parasol |
GB0406563D0 (en) | 2004-03-24 | 2004-04-28 | Taylor John H | Improved parasol |
US20050279396A1 (en) | 2004-06-17 | 2005-12-22 | Choi Young S | Shoulder mounted head shade |
NL1026727C2 (en) | 2004-07-26 | 2006-02-02 | Patrick Franciscus J Loosbroek | Beach umbrella. |
US8461958B2 (en) | 2005-08-17 | 2013-06-11 | Wireless Data Solutions, Llc | System for monitoring and control of transport containers |
US7431469B2 (en) | 2005-12-01 | 2008-10-07 | Wanda Ying Li | Power supplying system for outdoor umbrella |
GR1006220B (en) | 2006-04-20 | 2009-01-13 | Soukos Robots Abee | Electronic walking stick for people with eyesight problems. |
US20070283987A1 (en) | 2006-06-07 | 2007-12-13 | Enlightened Innovations | Solar powered umbrella |
US7787697B2 (en) * | 2006-06-09 | 2010-08-31 | Sony Ericsson Mobile Communications Ab | Identification of an object in media and of related media objects |
US9345295B2 (en) | 2006-09-01 | 2016-05-24 | Oliver Joen-An Ma | Outdoor umbrella with built-in electro control panel |
US20080056898A1 (en) | 2006-09-01 | 2008-03-06 | Wanda Ying Li | Outdoor umbrella with ventilation arrangement |
US7778624B2 (en) | 2006-09-01 | 2010-08-17 | Wanda Ying Li | Outdoor umbrella with audio system |
US9565387B2 (en) | 2006-09-11 | 2017-02-07 | Apple Inc. | Perspective scale video with navigation menu |
US7891633B2 (en) | 2007-05-29 | 2011-02-22 | Wanda Ying Li | Adjustable rotation base |
US8672287B2 (en) | 2007-05-29 | 2014-03-18 | Oliver Joen-An Ma | Adjustable rotation base |
US20090058354A1 (en) | 2007-09-04 | 2009-03-05 | Soren David Harrison | Solar-powered media system and apparatus |
US7726326B2 (en) | 2008-01-16 | 2010-06-01 | Paul A. Crabb | Umbrella with repositionable grip |
US8345889B2 (en) | 2008-04-29 | 2013-01-01 | Oliver Joen-An Ma | Wireless transmission-AV system of outdoor furniture |
US7926496B2 (en) | 2008-05-30 | 2011-04-19 | Resort Umbrella Solutions, Llc | Apparatus and method for holding and tilting an umbrella |
ES2351246B1 (en) | 2008-07-11 | 2011-11-21 | Jose Ramon Arenas Garcia | BACKPACK UMBRELLA. |
US8061374B2 (en) | 2008-11-28 | 2011-11-22 | Wanda Ying Li | Intelligent outdoor sun shading device |
US8413671B2 (en) | 2008-11-28 | 2013-04-09 | Oliver Joen-An Ma | Intelligence outdoor shading arrangement |
WO2010098735A1 (en) | 2009-02-27 | 2010-09-02 | Wanda Ying Li | Outdoor shading device |
US20130048829A1 (en) | 2009-11-10 | 2013-02-28 | Edward Herniak | Solar concentrator positioning system and method |
CN201580588U (en) | 2009-12-03 | 2010-09-15 | 郑汝升 | Vacuum workroom clamping bag and bag opening device for vacuum-packing machine |
WO2011140557A1 (en) | 2010-05-07 | 2011-11-10 | Arizona Board Of Regents, A Body Corporate Of The State Of Arizona, Acting For And On Behalf Of Arizona State University | A flexible system for car shading |
EP2623354B1 (en) | 2010-09-29 | 2015-10-21 | Toyota Jidosha Kabushiki Kaisha | Fuel tank system |
US9243747B2 (en) | 2010-10-15 | 2016-01-26 | Charles E. Ramberg | Shade structure |
US8387641B1 (en) | 2011-04-08 | 2013-03-05 | Nily Ilan | Motor operated wheelchair umbrella |
FR2977457B1 (en) | 2011-07-08 | 2014-05-16 | Francois Solari | PARASOL COMPRISING ELEMENTS FOR CREATING SHADOW |
CN102258250B (en) | 2011-07-22 | 2013-07-10 | 临海市美阳伞业有限公司 | Solar electric remote control sunshading suspended umbrella and control method thereof |
US9222693B2 (en) | 2013-04-26 | 2015-12-29 | Google Inc. | Touchscreen device user interface for remote control of a thermostat |
CN202974544U (en) | 2012-12-20 | 2013-06-05 | 南京信息工程大学 | Portable digital barometer |
CN203073199U (en) | 2012-12-31 | 2013-07-24 | 孙春晓 | Cooling sunshade |
US20140317168A1 (en) | 2013-04-17 | 2014-10-23 | Telefonaktiebolaget L M Ericsson (Publ) | System, method, and device for exposing wireless module data storage |
CN103405009B (en) | 2013-07-09 | 2016-08-10 | 雨中鸟(福建)户外用品有限公司 | A kind of intelligent integrated parachute kit |
US20150136944A1 (en) | 2013-11-21 | 2015-05-21 | Avraham Segev | Sunlight tracking sensor and system |
US9289039B2 (en) | 2014-01-06 | 2016-03-22 | Zon | Sunshades with solar power supplies for charging electronic devices |
US20150237975A1 (en) | 2014-02-27 | 2015-08-27 | James Ng | Umbrella-like Device Using Flexible Ribs |
US9526306B2 (en) | 2014-02-28 | 2016-12-27 | Ellen D. Fitzgerald | Umbrella assembly |
US9293806B2 (en) * | 2014-03-07 | 2016-03-22 | Apple Inc. | Electronic device with display frame antenna |
US20160119699A1 (en) * | 2014-10-22 | 2016-04-28 | Rachel CABAN | Umbrella mounted sound system |
CN104469162A (en) | 2014-12-23 | 2015-03-25 | 天津天地伟业数码科技有限公司 | Dome camera voice control method |
CN106163041A (en) | 2015-04-22 | 2016-11-23 | 海洋王(东莞)照明科技有限公司 | Voice light fixture ON-OFF control circuit |
US20160326765A1 (en) | 2015-05-07 | 2016-11-10 | Scott Barbret | Systems and methods for providing a portable weather, hydration, and entertainment shelter |
US10327521B2 (en) | 2015-05-22 | 2019-06-25 | Armen Sevada Gharabegian | Intelligent shading objects |
CN104835334B (en) | 2015-05-26 | 2016-03-09 | 山东鑫宏光电科技有限公司 | A kind of multifuctional solar supply intelligent monitoring traffic signals unmanned commander hilllock |
CN105193034A (en) | 2015-08-16 | 2015-12-30 | 上海电机学院 | Rain-wind-solar complementary generating multifunctional umbrella |
CN204889001U (en) | 2015-08-28 | 2015-12-23 | 浙江永强集团股份有限公司 | Multi -functional electronic sun shade of solar energy |
US10389014B2 (en) * | 2016-01-21 | 2019-08-20 | Geelux Holdings, Ltd. | Antenna configuration for mobile communication device |
US11124970B2 (en) * | 2017-04-07 | 2021-09-21 | Carefree/Scott Fetzer Company | Enhanced awning canopy assembly |
- 2017
  - 2017-08-07 US US15/669,964 patent/US10488834B2/en active Active
  - 2017-11-27 US US15/823,404 patent/US20180329375A1/en not_active Abandoned
- 2018
  - 2018-08-06 WO PCT/US2018/045435 patent/WO2019032475A1/en active Application Filing
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11560754B1 (en) * | 2018-03-22 | 2023-01-24 | AI Incorporated | Artificial neural network based controlling of window shading system and method |
US20210319098A1 (en) * | 2018-12-31 | 2021-10-14 | Intel Corporation | Securing systems employing artificial intelligence |
CN110829963A (en) * | 2019-11-14 | 2020-02-21 | 杭州煜伟科技有限公司 | Photovoltaic panel capable of following illumination angle |
US11676368B2 (en) | 2020-06-30 | 2023-06-13 | Optum Services (Ireland) Limited | Identifying anomalous activity from thermal images |
CN113284492A (en) * | 2021-05-19 | 2021-08-20 | 江苏中信博新能源科技股份有限公司 | Voice-controlled solar tracker debugging method and device and solar tracker |
Also Published As
Publication number | Publication date |
---|---|
US10488834B2 (en) | 2019-11-26 |
WO2019032475A1 (en) | 2019-02-14 |
US20180332154A1 (en) | 2018-11-15 |
Similar Documents
Publication | Title |
---|---|
US10349493B2 (en) | Artificial intelligence (AI) computing device with one or more lighting elements |
US20180329375A1 (en) | Computing Device or Artificial Intelligence (AI) Device Including Shading Element or Shading System |
US10813424B2 (en) | Intelligent shading charging systems |
US20200113297A1 (en) | Automated umbrella |
US10455395B2 (en) | Shading object, intelligent umbrella and intelligent shading charging security system and method of operation |
US10650423B2 (en) | Mobile computing or communications device interacting with an intelligent umbrella and/or intelligent shading charging system |
US20210042802A1 (en) | Mobile Computing Device Application Software Interacting with an Umbrella |
US10542799B2 (en) | Intelligent shading system with movable base assembly |
US10819916B2 (en) | Umbrella including integrated camera |
US10813422B2 (en) | Intelligent shading objects with integrated computing device |
US20170318922A1 (en) | Automatic Operation of Shading Object, Intelligent Umbrella and Intelligent Shading Charging System |
US20180289120A1 (en) | Intelligent Umbrella and Intelligent Shading Charging System Receiving Messages from a Mobile Computing or Communications Device |
US10912357B2 (en) | Remote control of shading object and/or intelligent umbrella |
US20180332935A1 (en) | Methods and apparatus for adjusting shading element and/or moving umbrella assembly to maximize shading area |
US20180268056A1 (en) | Computing Device and/or Intelligent Shading System with Color Sensor |
WO2018195262A1 (en) | Intelligent shading system with movable base assembly |
US20190137978A1 (en) | Intelligent Umbrella and/or Robotic Shading System Mechanical and Tracking Improvements |
Legal Events
Code | Title | Description |
---|---|---|
STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | FINAL REJECTION MAILED |
STCB | Information on status: application discontinuation | ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |