

Title:
AUTOMATED AND INTELLIGENT SHADING SYSTEMS
Document Type and Number:
WIPO Patent Application WO/2017/196758
Kind Code:
A1
Abstract:
An intelligent shading object comprises a base assembly and an umbrella support assembly. The umbrella support assembly is coupled to the base assembly and comprises a first processor and a first motor assembly. The first motor assembly rotates an umbrella support assembly about a vertical axis in response to commands and/or signals generated by the first processor. The umbrella assembly further comprises a stem assembly and a center support assembly, where the center support assembly further comprises a second processor, a second motor assembly, a lower assembly, an upper assembly, and a hinging assembly. The second processor communicates signals and/or commands to the second motor assembly and the second motor assembly drives the upper assembly to rotate with respect to the lower assembly via the hinging assembly.

Inventors:
GHARABEGIAN ARMEN (US)
Application Number:
PCT/US2017/031625
Publication Date:
November 16, 2017
Filing Date:
May 08, 2017
Assignee:
SHADECRAFT LLC (US)
International Classes:
E04H15/26; A45B17/00
Foreign References:
US20090188537A12009-07-30
JP2008212421A2008-09-18
CA2324851A12001-05-07
RU94031133A1996-05-27
Attorney, Agent or Firm:
KENDRICK, Mark (US)
Claims:
IN THE CLAIMS

1. An intelligent shading object, comprising:

a base assembly;

a stem assembly; and

a center support assembly, the center support assembly connected to the stem assembly,

wherein the stem assembly comprises a first processor positioned inside the stem assembly and a first motor assembly positioned within the stem assembly, the first motor assembly to rotate the stem assembly about a vertical axis in response to commands or signals generated by the first processor while the base assembly remains stationary.

2. The intelligent shading object of claim 1, further comprising a wireless charging assembly, the wireless charging assembly attached to the center support assembly and charging a portable electronic device placed adjacent to the wireless charging assembly.

3. The intelligent shading object of claim 1, wherein the first motor assembly comprises a first motor controller, a first motor and a gearbox assembly, and the first motor controller receives commands from the first processor, communicates signals to the first motor, which generates signals to drive the gearbox assembly, which causes rotation of the center support assembly about the vertical axis.

4. The intelligent shading object of claim 1, the center support assembly further comprising a second processor, a second motor assembly, a lower assembly, an upper assembly, and a hinging assembly, wherein the second processor communicates signals and/or commands to the second motor assembly and the second motor assembly drives the upper assembly to rotate with respect to the lower assembly via the hinging assembly.


5. The intelligent shading object of claim 4, wherein the second motor assembly comprises a second motor controller, a second motor, and the hinging assembly is a gearbox assembly, and wherein the second motor controller receives commands from the second processor, communicates signals to the second motor to drive the motor, which is coupled to the gearbox assembly and drives the gearbox assembly to move the upper assembly with respect to the lower assembly.

6. The intelligent shading object of claim 5, wherein the gearbox assembly is a right angle gearbox assembly.

7. The intelligent shading object of claim 4, wherein the umbrella support assembly further comprises a third processor, a third motor assembly, and one or more arm support assemblies, wherein the third processor communicates signals to the third motor assembly and the third motor assembly deploys the one or more arm support assemblies outwardly from the umbrella support assembly.

8. The intelligent shading object of claim 7, wherein the third motor assembly comprises a third motor controller, a third motor and a linear actuator, and the third motor controller receives commands from the third processor, communicates signals to the third motor to drive the motor, the motor being coupled to the linear actuator and drives the linear actuator to deploy the one or more arm support assemblies outwardly from the umbrella support assembly.

9. The intelligent shading object of claim 8, further comprising one or more arms/blades which are detachably attached to the one or more arm support assemblies.

10. The intelligent shading object of claim 9, further comprising a shading fabric, the shading fabric being detachably attached to the one or more arms/blades.


11. The intelligent shading object of claim 10, further comprising one or more solar cells, the solar cells attached to the shading fabric.

12. The intelligent shading object of claim 11, further comprising a solar charging assembly and a rechargeable battery, where the solar cells provide power to the rechargeable battery and the solar charging assembly monitors power generation.

13. The intelligent shading object of claim 10, further comprising one or more solar cells, the solar cells being integrated into the shading fabric.

14. An article comprising:

a non-transitory computer readable medium having stored thereon computer implementable instructions executable by a processor in a computing device to:

receive measurements from one or more movement detection sensors of an intelligent shading device;

analyze and compare received sensor measurements to preset user-defined thresholds; and

communicate messages to intelligent shading device assemblies in response to detected movements.

15. The article of claim 14, wherein the assemblies are at least one of the first motor assembly, the second motor assembly, or the third motor assembly.

16. The article of claim 14, wherein in response to detected movements, further comprising computer-implementable instructions executable by the processor of the computing device to:

activate a camera to capture images, video and/or sound of an area surrounding the intelligent shading device.

17. The article of claim 16, further comprising computer-implementable instructions executable by the processor of the computing device to:

receive images, video and/or sound from the camera for a predetermined period of time;

communicate images, video and/or sound to a display device; and

determine whether to communicate instructions to deactivate the camera or to continue to receive images, video and/or sound from the camera.

18. The article of claim 16, further comprising computer-implementable instructions executable by the processor of the computing device to:

receive messages and/or instructions representing sound from an external computing device; and

communicate sound messages and/or instructions to a speaker for playback at the intelligent shading device.

19. The article of claim 14, further comprising computer-implementable instructions executable by the processor of the computing device to:

receive images, videos or sounds communicated from a camera within the intelligent shading device;

analyze the communicated images, videos and/or sounds to determine if a security response is needed; and

communicate the images to a third party computing device for security purposes or alarm response.

20. The article of claim 14, further comprising computer-implementable instructions executable by the processor of the computing device to:

communicate messages to a third party computing device to identify out-of-tolerance and/or emergency conditions and to activate a security system.
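For illustration only, the computer-implementable instructions recited in claims 14 through 20 might be organized along the following lines. This is a minimal sketch assuming hypothetical names (MovementSample, ShadingDeviceBus, and the command strings); none of these identifiers come from the application itself.

```python
from dataclasses import dataclass

@dataclass
class MovementSample:
    sensor_id: str
    magnitude: float               # e.g., an acceleration or vibration reading

class ShadingDeviceBus:
    """Stand-in for whatever transport reaches the shading-device assemblies."""
    def send(self, target: str, message: dict) -> None:
        print(f"-> {target}: {message}")

def handle_movement(samples: list, thresholds: dict, bus: ShadingDeviceBus) -> None:
    # Claim 14: analyze and compare received measurements against preset,
    # user-defined thresholds, then message the shading-device assemblies.
    exceeded = [s for s in samples
                if s.magnitude > thresholds.get(s.sensor_id, float("inf"))]
    if not exceeded:
        return
    # Claim 15: the messaged assemblies may be the motor assemblies.
    for assembly in ("first_motor_assembly", "second_motor_assembly", "third_motor_assembly"):
        bus.send(assembly, {"command": "hold_position"})
    # Claim 16: in response to detected movement, activate a camera.
    bus.send("camera", {"command": "capture", "media": ["image", "video", "sound"]})
    # Claim 20: notify a third-party computing device of the condition.
    bus.send("third_party_device", {"alert": "movement_threshold_exceeded"})

if __name__ == "__main__":
    bus = ShadingDeviceBus()
    handle_movement([MovementSample("base_accelerometer", 4.2)],
                    {"base_accelerometer": 1.5}, bus)
```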


Description:
AUTOMATED AND INTELLIGENT SHADING SYSTEMS

RELATED APPLICATIONS

[0001] This application claims priority to U.S. non-provisional patent application serial No. 15/273,669, filed September 22, 2016, entitled "Mobile Computing Device Control of Shading Object, Intelligent Umbrella and Intelligent Shading Charging System," which claims priority to U.S. non-provisional patent application serial No. 15/214,471, filed July 20, 2016, entitled "Computer-Readable Instructions Executable by a Processor to Operate a Shading Object, Intelligent Umbrella and an Intelligent Shading Charging System," which claims priority to U.S. non-provisional patent application serial No. 15/212,173, filed July 15, 2016, entitled "Intelligent Charging Shading Systems," which claims priority to U.S. non-provisional patent application serial No. 15/160,856, filed May 20, 2016, entitled "Automated Intelligent Shading Objects and Computer-Readable Instructions for Interfacing With, Communicating With and Controlling a Shading Object," which claims priority to U.S. non-provisional patent application serial No. 15/160,822, filed May 20, 2016, entitled "Intelligent Shading Objects with Integrated Computing Device," as well as claiming priority to U.S. provisional patent application serial No. 62/333,822, entitled "Automated Intelligent Shading Objects and Computer-Readable Instructions for Interfacing With, Communicating With and Controlling a Shading Object," filed May 9, 2016, all of which, in their entirety, are hereby incorporated by reference.

BACKGROUND

1. Field

[0002] The subject matter disclosed herein relates to computer-readable instructions executable by a processor to operate, control and/or communicate with a shading object, intelligent umbrella and an intelligent shading charging system.

2. Information/Background of the Invention

[0003] Conventional sun shading devices usually are comprised of a supporting frame and an awning or fabric mounted on the supporting frame to cover a predefined area. For example, a conventional sun shading device may be an outdoor umbrella or an outdoor awning.

[0004] However, current sun shading devices do not appear to be flexible, modifiable or able to adapt to changing environmental conditions or users' desires. Many of the current sun shading devices appear to require manual operation in order to change an inclination angle of the frame to more fully protect an individual from the environment. Further, the current sun shading devices appear to have one (or a single) awning or fabric piece that is mounted to an interconnected unitary frame. An interconnected unitary frame may not be able to be opened or deployed in many situations. Accordingly, alternative embodiments may be desired.

Further, current sun shading devices may not have automated assemblies to allow a shading object to track movement of the sun and/or adjust to other environmental conditions. In addition, current sun shading devices do not communicate with external shading object related systems. Further, individuals utilizing current sun shading devices are limited in their interactions with such devices. In addition, sun shading devices generally do not have software stored therein which controls and/or operates the sun-shading device.

BRIEF DESCRIPTION OF DRAWINGS

[0005] Non-limiting and non-exhaustive aspects are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various figures unless otherwise specified.

[0006] FIGURES 1A and 1B illustrate a shading object or shading object device according to embodiments;

[0007] FIGURES 1C and 1D illustrate intelligent shading charging systems according to embodiments;

[0008] FIGURE 2 illustrates a block diagram of a stem assembly according to embodiments;

[0009] FIGURE 3A illustrates a base assembly according to embodiments;

[0010] FIGURE 3B illustrates a housing and/or enclosure according to embodiments;

[0011] FIGURE 4A illustrates a block diagram of a center support assembly motor control according to embodiments;

[0012] FIGURE 4B illustrates a lower support motor assembly according to embodiments;

[0013] FIGURE 5A illustrates a block diagram of an actuator or deployment motor in an intelligent umbrella or shading object according to embodiments;

[0014] FIGURE 5B illustrates a block diagram of an actuator or deployment motor in an intelligent shading charging system according to embodiments;

[0015] FIGURE 6A illustrates a shading object or intelligent umbrella with arm support assemblies and arms/blades in an open position and a closed position;

[0016] FIGURE 6B illustrates an intelligent shading charging system with arm support assemblies and arms/blades in an open position and a closed position;

[0017] FIGURE 7 illustrates assemblies to deploy arms and/or blades according to embodiments;

[0018] FIGURES 8A and 8B illustrate a block diagram of a movement control PCB according to embodiments;

[0019] FIGURE 9 illustrates a block diagram with data and command flow of a movement control PCB according to embodiments;

[0020] FIGURE 10 illustrates a shading object or umbrella computing device according to embodiments;

[0021] FIGURE 11 illustrates a lighting subsystem according to embodiments;

[0022] FIGURE 12 illustrates a power subsystem according to embodiments;

[0023] FIGURE 13 illustrates components and assemblies of a shading object umbrella according to embodiments;

[0024] Figures 13A and 13B illustrate placements of intelligent shading charging systems according to embodiments;

[0025] FIGURES 14A and 14B illustrate a block diagram of multiple assemblies and components of a shading object, intelligent umbrella, or intelligent shading charging system according to embodiments;

[0026] Figure 15A illustrates an automated weather process according to embodiments;

[0027] Figure 15B illustrates predicting weather conditions in a weather process according to embodiments;

[0028] Figure 15C illustrates a weather data gathering process on a periodic basis according to embodiments;

[0029] Fig. 15D illustrates execution of a health process by a computing device in an intelligent umbrella or shading charging system according to embodiments;

[0030] Figure 15E illustrates a shading object, intelligent umbrella, and/or intelligent shading charging system implementing an energy process according to embodiments;

[0031] FIG. 15F illustrates energy generation and energy consumption process of an energy process in an intelligent umbrella and/or intelligent shading charging assembly according to embodiments;

[0032] Figure 15G illustrates energy gathering for a plurality of devices according to embodiments;

[0033] Figure 15H illustrates object tracking in an energy process according to embodiments;

[0034] Fig. 15I illustrates a backup process for a shading object, an intelligent umbrella and/or shading charging system according to embodiments;

[0035] Figure 16A is a flowchart of a facial recognition process according to an embodiment;

[0036] Figure 16B illustrates an infrared detection process according to embodiments;

[0037] Fig. 16C illustrates a thermal detection process according to embodiments; and

[0038] FIGURE 16D illustrates a security process for an intelligent umbrella and/or intelligent shading charging systems according to embodiments.

DETAILED DESCRIPTION

[0039] In the following detailed description, numerous specific details are set forth to provide a thorough understanding of claimed subject matter. For purposes of explanation, specific numbers, systems and/or configurations are set forth, for example. However, it should be apparent to one skilled in the relevant art having benefit of this disclosure that claimed subject matter may be practiced without specific details. In other instances, well-known features may be omitted and/or simplified so as not to obscure claimed subject matter. While certain features have been illustrated and/or described herein, many modifications, substitutions, changes and/or equivalents may occur to those skilled in the art. It is, therefore, to be understood that appended claims are intended to cover any and all modifications and/or changes as fall within claimed subject matter.

[0040] References throughout this specification to one implementation, an implementation, one embodiment, an embodiment and/or the like means that a particular feature, structure, and/or characteristic described in connection with a particular implementation and/or embodiment is included in at least one implementation and/or embodiment of claimed subject matter. Thus, appearances of such phrases, for example, in various places throughout this specification are not necessarily intended to refer to the same implementation or to any one particular implementation described. Furthermore, it is to be understood that particular features, structures, and/or characteristics described are capable of being combined in various ways in one or more implementations and, therefore, are within intended claim scope, for example. In general, of course, these and other issues vary with context. Therefore, particular context of description and/or usage provides helpful guidance regarding inferences to be drawn.

[0041] With advances in technology, it has become more typical to employ distributed computing approaches in which portions of a problem, such as signal processing of signal samples, for example, may be allocated among computing devices, including one or more clients and/or one or more servers, via a computing and/or communications network, for example. A network may comprise two or more network devices and/or may couple network devices so that signal communications, such as in the form of signal packets and/or frames (e.g., comprising one or more signal samples), for example, may be exchanged, such as between a server and a client device and/or other types of devices, including between wireless devices coupled via a wireless network, for example.

[0042] A network may comprise two or more network and/or computing devices and/or may couple network and/or computing devices so that signal communications, such as in the form of signal packets, for example, may be exchanged, such as between a server and a client device and/or other types of devices, including between wireless devices coupled via a wireless network, for example.

[0043] In this context, the term network device refers to any device capable of communicating via and/or as part of a network and may comprise a computing device. While network devices may be capable of sending and/or receiving signals (e.g., signal packets and/or frames), such as via a wired and/or wireless network, they may also be capable of performing arithmetic and/or logic operations, processing and/or storing signals (e.g., signal samples), such as in memory as physical memory states, and/or may, for example, operate as a server in various embodiments. Network devices capable of operating as a server, or otherwise, may include, as examples, rack-mounted servers, desktop computers, laptop computers, set top boxes, tablets, netbooks, smart phones, wearable devices, integrated devices combining two or more features of the foregoing devices, the like or any combination thereof. As mentioned, signal packets and/or frames, for example, may be exchanged, such as between a server and a client device and/or other types of network devices, including between wireless devices coupled via a wireless network, for example. It is noted that the terms, server, server device, server computing device, server computing platform and/or similar terms are used interchangeably. Similarly, the terms client, client device, client computing device, client computing platform and/or similar terms are also used interchangeably.

While in some instances, for ease of description, these terms may be used in the singular, such as by referring to a "client device" or a "server device," the description is intended to encompass one or more client devices and/or one or more server devices, as appropriate. Along similar lines, references to a "database" are understood to mean one or more databases, database servers, and/or portions thereof, as appropriate.

[0044] It should be understood that for ease of description a network device (also referred to as a networking device) may be embodied and/or described in terms of a computing device. However, it should further be understood that this description should in no way be construed to mean that claimed subject matter is limited to one embodiment, such as a computing device or a network device, and, instead, may be embodied as a variety of devices or combinations thereof, including, for example, one or more illustrative examples.

[0045] Operations and/or processing, such as in association with networks, such as computing and/or communications networks, for example, may involve physical manipulations of physical quantities. Typically, although not necessarily, these quantities may take the form of electrical and/or magnetic signals capable of, for example, being stored, transferred, combined, processed, compared and/or otherwise manipulated. It has proven convenient, at times, principally for reasons of common usage, to refer to these signals as bits, data, values, elements, symbols, characters, terms, numbers, numerals and/or the like. It should be understood, however, that all of these and/or similar terms are to be associated with appropriate physical quantities and are intended to merely be convenient labels.

[0046] Likewise, in this context, the terms "coupled", "connected," and/or similar terms are used generically. It should be understood that these terms are not intended as synonyms. Rather, "connected" is used generically to indicate that two or more components, for example, are in direct physical, including electrical, contact; while, "coupled" is used generically to mean that two or more components are potentially in direct physical, including electrical, contact; however, "coupled" is also used generically to also mean that two or more components are not necessarily in direct contact, but nonetheless are able to co-operate and/or interact. The term coupled is also understood generically to mean indirectly connected, for example, in an appropriate context. In a context of this application, if signals, instructions, and/or commands are transmitted from one component (e.g., a controller or processor) to another component (or assembly), it is understood that signals, instructions, and/or commands may be transmitted directly to a component, or may pass through a number of other components on a way to a destination component. For example, a signal transmitted from a motor controller or processor to a motor (or other driving assembly) may pass through glue logic, an amplifier, and/or an interface. Similarly, a signal communicated through a misting system may pass through an air conditioning and/or a heating module, and a signal communicated from any one or a number of sensors to a controller and/or processor may pass through a conditioning module, an analog-to-digital controller, and/or a comparison module.

[0047] The terms, "and", "or", "and/or" and/or similar terms, as used herein, include a variety of meanings that also are expected to depend at least in part upon the particular context in which such terms are used. Typically, "or" if used to associate a list, such as A, B or C, is intended to mean A, B, and C, here used in the inclusive sense, as well as A, B or C, here used in the exclusive sense. In addition, the term "one or more" and/or similar terms is used to describe any feature, structure, and/or characteristic in the singular and/or is also used to describe a plurality and/or some other combination of features, structures and/or characteristics. Likewise, the term "based on" and/or similar terms are understood as not necessarily intending to convey an exclusive set of factors, but to allow for existence of additional factors not necessarily expressly described. Of course, for all of the foregoing, particular context of description and/or usage provides helpful guidance regarding inferences to be drawn. It should be noted that the following description merely provides one or more illustrative examples and claimed subject matter is not limited to these one or more illustrative examples; however, again, particular context of description and/or usage provides helpful guidance regarding inferences to be drawn.

[0048] A network may also include now known, and/or to be later developed arrangements, derivatives, and/or improvements, including, for example, past, present and/or future mass storage, such as network attached storage (NAS), cloud storage, a storage area network (SAN), and/or other forms of computing and/or device readable media, for example. A network may include a portion of the Internet, one or more local area networks (LANs), one or more wide area networks (WANs), wire-line type connections, one or more personal area networks (PANs), wireless type connections, other connections, or any combination thereof. Thus, a network may be worldwide in scope and/or extent.

[0049] The Internet and/or a global communications network may refer to a decentralized global network of interoperable networks that comply with the Internet Protocol (IP). It is noted that there are several versions of the Internet Protocol. Here, the term Internet Protocol, IP, and/or similar terms, is intended to refer to any version, now known and/or later developed of the Internet Protocol. The Internet may include local area networks (LANs), wide area networks (WANs), wireless networks, and/or long haul public networks that, for example, may allow signal packets and/or frames to be communicated between LANs. The term World Wide Web (WWW or Web) and/or similar terms may also be used, although it refers to a part of the Internet that complies with the Hypertext Transfer Protocol (HTTP). For example, network devices and/or computing devices may engage in an HTTP session through an exchange of appropriately compatible and/or compliant signal packets and/or frames. Here, the term Hypertext Transfer Protocol, HTTP, and/or similar terms is intended to refer to any version, now known and/or later developed. It is likewise noted that in various places in this document substitution of the term Internet with the term World Wide Web ('Web') may be made without a significant departure in meaning and may, therefore, not be inappropriate in that the statement would remain correct with such a substitution.

[0050] Although claimed subject matter is not in particular limited in scope to the Internet and/or to the Web, the Internet and/or the Web may nonetheless, without limitation, provide a useful example of an embodiment at least for purposes of illustration. As indicated, the Internet and/or the Web may comprise a worldwide system of interoperable networks, including interoperable devices within those networks. A content delivery server and/or the Internet and/or the Web, therefore, in this context, may comprise a service that organizes stored content, such as, for example, text, images, video, etc., through the use of hypermedia, for example. A HyperText Markup Language ("HTML"), for example, may be utilized to specify content and/or to specify a format for hypermedia type content, such as in the form of a file and/or an "electronic document," such as a Web page, for example. An Extensible Markup Language ("XML") may also be utilized to specify content and/or format of hypermedia type content, such as in the form of a file or an "electronic document," such as a Web page, in an embodiment. HTML and/or XML are merely example languages provided as illustrations and intended to refer to any version, now known and/or developed at another time, and claimed subject matter is not intended to be limited to examples provided as illustrations, of course.

[0051] Also as used herein, one or more parameters may be descriptive of a collection of signal samples, such as one or more electronic documents, and exist in the form of physical signals and/or physical states, such as memory states. For example, one or more parameters, such as referring to an electronic document comprising an image, may include parameters, such as time of day at which an image was captured, latitude and longitude of an image capture device, such as a camera, for example, etc. Claimed subject matter is intended to embrace meaningful, descriptive parameters in any format, so long as the one or more parameters comprise physical signals and/or states, which may include, as parameter examples, name of the collection of signals and/or states.
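As a purely illustrative aside, the descriptive parameters mentioned in paragraph [0051] (time of capture, latitude and longitude of an image capture device, and a collection name) could be represented as a simple record; the field names below are assumptions, not terms from the application.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ImageParameters:
    """Hypothetical parameter record for a captured image (see paragraph [0051])."""
    captured_at: datetime     # time of day at which an image was captured
    latitude: float           # latitude of the image capture device (e.g., a camera)
    longitude: float          # longitude of the image capture device
    collection_name: str      # name of the collection of signals and/or states

params = ImageParameters(datetime(2017, 5, 8, 14, 30), 34.05, -118.24, "umbrella_capture_roll_01")
```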

[0052] Some portions of the detailed description which follow are presented in terms of algorithms or symbolic representations of operations on binary digital signals stored within a memory of a specific apparatus or special purpose computing device or platform. In the context of this particular specification, the term specific apparatus or the like includes a general purpose computer once it is programmed to perform particular functions pursuant to instructions from program software. In embodiments, a shading object may comprise a shading object computing device installed within or as part of a shading object. Algorithmic descriptions or symbolic representations are examples of techniques used by those of ordinary skill in the signal processing or related arts to convey the substance of their work to others skilled in the art. An algorithm is here, and generally, considered to be a self-consistent sequence of operations or similar signal processing leading to a desired result. In this context, operations or processing involve physical manipulation of physical quantities. Typically, although not necessarily, such quantities may take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared or otherwise manipulated.

[0053] It has proven convenient at times, principally for reasons of common usage, to refer to such signals as bits, data, values, elements, symbols, characters, terms, numbers, numerals or the like; it should be understood that these are conventional labels. Unless specifically stated otherwise, it is appreciated that throughout this specification discussions utilizing terms such as "processing," "computing," "calculating," "determining" or the like may refer to actions or processes of a specific apparatus, such as a special purpose computer or a similar special purpose electronic computing device (e.g., such as a shading object computing device). In the context of this specification, therefore, a special purpose computer or a similar special purpose electronic computing device (e.g., a shading object computing device) is capable of manipulating or transforming signals (electronic and/or magnetic) in memories (or components thereof), other storage devices, transmission devices, sound reproduction devices, and/or display devices.

[0054] In an embodiment, a controller and/or a processor typically performs a series of instructions resulting in data manipulation. In an embodiment, a microcontroller or microprocessor may be a compact microcomputer designed to govern the operation of embedded systems in electronic devices, e.g., an intelligent, automated shading object, and various other electronic and mechanical devices. Microcontrollers may include processors, microprocessors, and other electronic components. A controller may be a commercially available processor such as an Intel Pentium, Motorola PowerPC, SGI MIPS, Sun UltraSPARC, or Hewlett-Packard PA-RISC processor, but may be any type of application-specific and/or specifically designed processor or controller. In an embodiment, a processor and/or controller may be connected to other system elements, including one or more memory devices, by a bus. Usually, a processor or controller may execute an operating system which may be, for example, a Windows-based operating system (Microsoft), a MAC OS System X operating system (Apple Computer), one of many Linux-based operating system distributions (e.g., an open source operating system), a Solaris operating system (Sun), a portable electronic device operating system (e.g., mobile phone operating systems), and/or a UNIX operating system. Embodiments are not limited to any particular implementation and/or operating system.

[0055] The specification may refer to an automated intelligent shading object as an apparatus that provides shade to a user from weather elements such as sun, wind, rain, and/or hail. The automated intelligent shading object may also be referred to as a parasol, intelligent umbrella, sun shade, outdoor shade furniture, sun screen, sun shelter, awning, sun cover, sun marquee, brolly and other similar names, which may all be utilized interchangeably in this application. Shading objects which also have electric vehicle charging capabilities may also be referred to as intelligent shading charging systems. These terms may be utilized interchangeably throughout the specification. The shading objects, intelligent umbrellas and shading charging systems described herein comprise many novel and non-obvious features.

[0056] Figure 1A illustrates an intelligent shading object according to embodiments. In embodiments, an intelligent shading object may comprise a base assembly 105, a stem assembly 106, a central support assembly 107 (including a lower assembly, a hinge assembly and/or gearbox, and/or an upper assembly), arm support assemblies 108, arms/blades 109, and/or a shading fabric 715. In embodiments, a stem assembly 106 (and a coupled central support assembly, arm support assemblies, and/or blades) may rotate within a base assembly around a vertical axis. In embodiments, an upper assembly of a center support assembly 107 may rotate up to a right angle with respect to a lower assembly of the center support assembly 107 via a gearbox or hinging mechanism, and a second motor. In embodiments, arm support assemblies 108 may deploy and/or extend from a center support assembly 107 to open a shading object. In embodiments, detachable arms/blades 109 may be attached or coupled to arm support assemblies 108. In embodiments, a detachable shading fabric 715 may be attached or coupled to arms/blades 109.

[0057] FIGURES 1A and 1B illustrate a shading object or shading object device according to embodiments. In embodiments, a shading object 100 may comprise a base assembly 105, a stem assembly 106, a center support assembly 107, one or more supporting arm assemblies 108, one or more arms/blades 109, solar panels and/or a shading fabric (not shown). In embodiments, a stem assembly 106, a center support assembly 107, one or more supporting arm assemblies 108, and/or one or more arms/blades 109 may be referred to as an umbrella support assembly, a shading system body and/or a shading subsystem. In embodiments, a central support assembly 107 may comprise an upper assembly 112, a lower assembly 113 and a hinging assembly and/or gearbox 114, where the hinging assembly and/or gearbox assembly 114 may connect and/or couple the upper assembly 112 to the lower assembly 113. In embodiments, a base assembly 105 may rest on a ground surface in an outdoor environment. A ground surface may be a floor, a patio, grass, sand, or other outdoor environment surfaces. In embodiments, a stem assembly 106 may be placed into a top portion of a base assembly 105.
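As an illustration of the assembly hierarchy just described (base assembly 105, stem assembly 106, center support assembly 107 with upper assembly 112, lower assembly 113 and hinging assembly/gearbox 114, arm support assemblies 108, and arms/blades 109), one hypothetical way to model it is sketched below; the class and field names are assumptions and are not part of the application.

```python
from dataclasses import dataclass, field

@dataclass
class CenterSupportAssembly:                  # reference numeral 107
    upper_assembly: str = "upper (112)"       # rotates with respect to the lower assembly
    lower_assembly: str = "lower (113)"
    hinging_assembly: str = "hinge/gearbox (114)"

@dataclass
class ShadingObject:                          # reference numeral 100
    base_assembly: str = "base (105)"         # rests on the ground surface
    stem_assembly: str = "stem (106)"         # rotates about a vertical axis within 105
    center_support: CenterSupportAssembly = field(default_factory=CenterSupportAssembly)
    arm_support_assemblies: list = field(default_factory=lambda: ["arm support (108)"])
    arms_blades: list = field(default_factory=lambda: ["arm/blade (109)"])
    shading_fabric: str = "fabric (715)"      # detachably attached to the arms/blades

umbrella = ShadingObject()
```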

[0058] FIGURE 3A illustrates a base assembly according to embodiments. In embodiments, a base assembly 300 comprises a base enclosure 302, a detachable stem 304, a stem tip 311, and a weight housing unit 306. In embodiments, a weight housing unit 306 may comprise a stem support 310, a plurality of weight housing walls 308 and a bottom housing plate 315. In embodiments, a weight housing unit 306 comprises a plurality of weight wedges 316.

[0059] In embodiments, a plurality of weight housing walls 308 may divide a weight housing plate 315 into a plurality of regions. In embodiments, a weight housing plate 315 may be divided into four, six or eight regions. In embodiments, a weight housing plate 315 may be divided into more than one region (e.g., two or more regions). In embodiments, weight wedges 316 may be placed into regions formed by weight housing walls 308 and a weight housing plate 315. As is illustrated in Fig. 3A, weight wedges may have a triangular-base shape. In embodiments, weight wedges 316 may be formed of steel or stainless steel. In embodiments, weight wedges may be comprised of a solid plastic material. In embodiments, weight wedges 316 may include a space for a handle 317. In embodiments, handles 317 may allow for weight wedges to be carried by a user of a sun shade. In embodiments, a weight housing unit 306 may include resting supports 320. In embodiments, weight housing supports may allow a weight wedge 316 to not damage a surface of a weight housing plate 315.

[0060] In embodiments, a weight housing wall 308 may vary in height. A weight housing wall 308 may have a low height at a first end (e.g., at an outside edge of a weight housing unit 306), as illustrated in Fig. 3A. A second end of a weight housing wall 308 may have a higher height. A second end of a weight housing wall 308 may be connected and/or coupled to a center support post 325. In embodiments, two, four or eight weight housing walls 308 may be connected and/or coupled to a center support post 325. In embodiments, a stem support 310 may be attached, coupled, and/or connected to a plurality of weight housing walls 308 and a center support post 325. In embodiments, a top surface of a second end of a weight housing wall 308 and/or a top surface of a center support post 325 may be connected to a stem support 310 (e.g., a bottom surface of a stem support 310). In embodiments, a stem support 310 may have a hole and/or opening into which a tip 311 may be inserted and/or placed. In embodiments, a base assembly may comprise and/or be made of a metal. In embodiments, a base assembly may comprise a lightweight metal, e.g., aluminum, although other metals may be utilized based on degradation in the environment where a shading object is located. In embodiments, a base assembly 105 may be made of a composite material, a wood material, or a plastic material, or a combination thereof. In embodiments, a base assembly 105 may comprise a biodegradable material so that when a base assembly is thrown away and placed in a landfill or buried, the biodegradable material may decompose over time.

[0061] In embodiments, a base assembly 105 may comprise a battery (e.g., a rechargeable battery). In an embodiment, a base battery may be charged and/or powered by an AC adapter which is connected at one end to a battery (or charging system such as a rectifier and/or a capacitor) and at another end to a wall outlet. In embodiments, a solar charging assembly and/or a rechargeable battery in a center support assembly 107 may provide power (e.g., voltage and/or current) to a battery in a base assembly 105. In embodiments, a base assembly 105 battery may be a backup and/or secondary battery to a main battery in a center support assembly 107 (or stem assembly 106).

[0062] In embodiments, a base assembly 105 may have an opening (e.g., a circular or oval opening) into which a stem assembly 106 may be placed. FIGURE 2 illustrates a block diagram of a stem assembly according to embodiments. In embodiments, a stem assembly may be referred to as an automatic and/or motorized stem assembly. In embodiments, a stem assembly 106 may comprise a stem body 211 and a first motor assembly. In embodiments, a first motor assembly may comprise a first motor 212, a gear box assembly and/or hinging assembly 213, and/or a first motor controller 214. Although a gearbox assembly and/or hinging assembly is discussed, other connecting assemblies, gearing assemblies, actuators, etc., may be utilized. In embodiments, a first motor controller 214 may also be referred to as a motor driver, and within this specification the terms "motor driver" and "motor controller" may be used interchangeably. In embodiments, a first motor controller 214 may receive commands, instructions and/or signals requesting movement of a shading system around an azimuth axis. In embodiments, a shading system body 211 may rotate (e.g., may rotate between 0 and 360 degrees about a vertical axis formed by a base assembly 105, a stem assembly 106, and/or a central support assembly 107). Reference number 140 (Fig. 1B) illustrates a rotation of a shading system body about a vertical axis according to embodiments. In embodiments, a shading object stem assembly 106 may rotate around a vertical axis, such as vertical axis 730 in Fig. 7. In embodiments, a shading object stem assembly may rotate 360 degrees about a vertical axis. In embodiments, a shading object stem assembly 106 may rotate up to 270 degrees and/or 180 degrees about a vertical axis. In embodiments, a shading object stem assembly 106 may be limited by detents, stops and/or limiters in an opening of a base assembly 105. In embodiments, a stem assembly encoder 218 may provide location and/or position feedback to a first motor controller 214. In other words, an encoder 218 may verify that a certain distance and/or position has been moved by a base assembly 105 from an original position. In embodiments, encoders may be utilized in motor systems in order to feed back position and/or distance information to motor controllers and/or motors to verify that a correct position has been reached. In embodiments, encoders may have a number of positions and/or steps and may compare how much an output shaft and/or gearbox assembly has moved in order to feed back information to a motor controller. Encoders may be utilized with any motors and/or motor controllers in this application. This provides a benefit as compared to prior art umbrellas and shading objects because the intelligent shading umbrella, due to its rotation (e.g., 360 degree rotation), can orient itself with respect to any position in a surrounding area.
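To make the encoder's role concrete, a minimal sketch follows of how a first motor controller 214 might use position feedback from encoder 218 to verify an azimuth rotation; the function names, the encoder resolution and the simple step-until-verified loop are assumptions for illustration, not details from the application.

```python
COUNTS_PER_REVOLUTION = 4096          # assumed encoder resolution, not from the application

def degrees_to_counts(degrees: float) -> int:
    return round(degrees / 360.0 * COUNTS_PER_REVOLUTION)

def rotate_to_azimuth(target_deg: float, read_encoder_counts, drive_motor,
                      tolerance_counts: int = 4, max_steps: int = 100_000) -> bool:
    """Drive the stem assembly toward target_deg, verifying position via encoder feedback."""
    target = degrees_to_counts(target_deg % 360.0)
    for _ in range(max_steps):
        error = target - read_encoder_counts()
        if abs(error) <= tolerance_counts:
            drive_motor(0)            # stop: position confirmed by the encoder
            return True
        drive_motor(1 if error > 0 else -1)   # nudge toward the target position
    drive_motor(0)
    return False                      # position could not be verified in time

# Usage with simulated hardware: the "motor" simply nudges a simulated encoder count.
if __name__ == "__main__":
    state = {"counts": 0}
    rotate_to_azimuth(90.0,
                      read_encoder_counts=lambda: state["counts"],
                      drive_motor=lambda step: state.__setitem__("counts", state["counts"] + step))
    print(state["counts"], "counts ~", round(state["counts"] * 360 / COUNTS_PER_REVOLUTION), "degrees")
```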

[0063] In embodiments, a first motor controller 214 may communicate commands and/or signals to a first motor 212 to cause movement of an umbrella support assembly or shading system body (e.g., a stem assembly 106, central support assembly 107, shading arm supports 108, and/or arms/blades 109) about an azimuth axis. In this illustrative embodiment, a base assembly 105 may remain stationary while the shading system body rotates within the base assembly 105. In other words, a shading system body is placed in an opening of a base assembly 105 and rotates while the base assembly remains stationary. In embodiments, a first motor 212 may be coupled to a gearbox assembly 213. In embodiments, a gearbox assembly 213 may comprise a planetary gearbox assembly. A planetary gearbox assembly may comprise a central sun gear, a planet carrier with one or more planet gears and an annulus (or outer ring). In embodiments, planet gears may mesh with a sun gear while outer ring teeth may mesh with planet gears. In embodiments, a planetary gearbox assembly may comprise a sun gear as an input, an annulus as an output and a planet carrier (one or more planet gears) remaining stationary. In embodiments, an input shaft may rotate a sun gear, planet gears may rotate on their own axes, and may simultaneously apply a torque to a rotating planet carrier that applies torque to an output shaft (which in this case is the annulus). In embodiments, a planetary gearbox assembly and a first motor 212 may be connected and/or adhered to a stem assembly 106. In embodiments, an output shaft from a gearbox assembly 213 may be connected to a base assembly 105 (e.g., an opening of a base assembly). In embodiments, because a base assembly 105 is stationary, torque on an output shaft of a gearbox assembly 213 may be initiated by a first motor 212 to cause a stem assembly 106 to rotate. In embodiments, other gearbox assemblies and/or hinging assemblies may also be utilized to utilize an output of a motor to cause a stem assembly 106 (and hence an umbrella support assembly) to rotate within a base assembly 105. In embodiments, a first motor 212 may comprise a pneumatic motor. In other embodiments, a first motor 212 may comprise a servo motor and/or a stepper motor.

[0064] In embodiments, a stem assembly 106 may be coupled and/or connected to a center support assembly 107. In embodiments, as mentioned above, a stem assembly 106 and a center support assembly 107 may both be part of an umbrella support assembly. In embodiments, a center support assembly 107 may comprise an upper assembly 112, a second gearbox assembly (or a linear actuator or hinging assembly) 114, a lower assembly 113, a second motor 121, and/or a second motor controller 122. In embodiments, a second motor assembly may comprise a second motor controller 122 and a second motor 121, and possibly a second gearbox assembly or linear actuator 114. In embodiments, a center support assembly 107 may also comprise a motor control PCB which may have a second motor controller 122 mounted and/or installed thereon. In embodiments, an upper assembly 112 may be coupled or connected to a lower assembly 113 of the center support assembly 107 via a second gearbox assembly 114. In embodiments, a second gearbox assembly 114, and a second motor 121 connected thereto, may be connected to a lower assembly 113. In embodiments, an output shaft of a second gearbox assembly 114 may be connected to an upper assembly 112. In embodiments, as a second motor 121 operates and/or rotates, a second gearbox assembly 114 rotates an output shaft which causes an upper assembly 112 to rotate (either upwards or downwards) at a right angle from, or with respect to, a lower assembly 113. In embodiments utilizing a linear actuator as a hinging assembly 114, a steel rod may be coupled to an upper assembly 112 and/or a lower assembly 113 which causes a free hinging between an upper assembly 112 and a lower assembly 113. In embodiments, a linear actuator 114 may be coupled, connected, and/or attached to an upper assembly 112 and/or a lower assembly 113. In embodiments, as a second motor 121 operates and/or rotates a steel rod, an upper assembly 112 moves in an upward or downward direction with respect to a hinged connection (or hinging assembly) 114. In embodiments, a direction of movement is illustrated by reference number 160 in Fig. 1B. In embodiments, a direction of movement may be limited to approximately a right angle (e.g., approximately 90 degrees). In embodiments, an upper assembly 112 may move from a position where it is an extension of a lower assembly 113 (e.g., forming a vertical center support assembly 107) to a position wherein an upper assembly 112 is at a right angle from a lower assembly 113 (and also approximately parallel to a ground surface). In embodiments, movement may be limited by a right angle gearbox or right angle gearbox assembly 114. In embodiments, an upper assembly 112 and a lower assembly 113 may be perpendicular to a ground surface in one position (as is shown in Fig. 1A), but may move (as is shown by reference number 160) to track the sun (depending on location and time of day) so that an upper assembly 112 moves from a perpendicular position with respect to a ground surface to an angular position with respect to a ground surface and an angular position with respect to a lower assembly 113. In embodiments, an upper assembly's tracking of sun movement between a vertical location (top of sky) and a horizontal location (horizon) may also depend on time and location. This provides a benefit, as compared to prior art umbrellas, of automatically orienting a shading object or umbrella to positions of the sun in the sky (e.g., directly overhead, on a horizon as during sunrise and/or sunset).
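The sun-tracking tilt described in paragraph [0064] above — keeping the upper assembly 112 vertical when the sun is overhead and tilting it toward a right angle as the sun nears the horizon, depending on time and location — might look roughly like the following. The solar-elevation formula is a standard textbook approximation, and every name and constant here is an assumption, not material from the application.

```python
import math

def solar_elevation_deg(latitude_deg: float, day_of_year: int, solar_hour: float) -> float:
    """Approximate solar elevation angle (degrees above the horizon)."""
    declination = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    hour_angle = 15.0 * (solar_hour - 12.0)
    lat, dec, ha = map(math.radians, (latitude_deg, declination, hour_angle))
    return math.degrees(math.asin(math.sin(lat) * math.sin(dec)
                                  + math.cos(lat) * math.cos(dec) * math.cos(ha)))

def upper_assembly_tilt_deg(latitude_deg: float, day_of_year: int, solar_hour: float) -> float:
    """Tilt command for upper assembly 112: 0 degrees when the sun is overhead,
    approaching 90 degrees (the limit of hinging assembly 114) near the horizon."""
    elevation = solar_elevation_deg(latitude_deg, day_of_year, solar_hour)
    return max(0.0, min(90.0, 90.0 - elevation))

# Example: roughly the Los Angeles area (latitude ~34 N), early May, 3 p.m. solar time.
print(round(upper_assembly_tilt_deg(34.0, 128, 15.0), 1))
```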
[0065] Fig. 1C illustrates an intelligent shading charging system according to embodiments. In embodiments, an intelligent shading charging system provides shade to a surrounding area, converts solar energy to solar power, and charges a rechargeable battery in an electric vehicle. In embodiments, an intelligent shading charging system 175 comprises a rechargeable battery connection interface (not shown), a housing and/or enclosure 182 including a rechargeable battery 184 and/or a transceiver 179, a lower support assembly 187, a hinging assembly or mechanism 190, and an upper support assembly 191. In embodiments, an intelligent shading charging system 175 further comprises a base assembly (not shown). In embodiments, an intelligent shading charging system 175 may comprise one or more arm support assemblies 193, one or more arms and/or blades 194 and a shading fabric 195. In embodiments, a shading fabric 195, arms 194, and/or arm support assemblies 193 may have one or more solar cells and/or arrays 196 attached thereto, integrated therein, and/or placed thereon. In embodiments, many movements of an intelligent shading charging system may be automated. In embodiments, an intelligent shading charging system 175 may be connected and/or coupled to a power delivery system (e.g., a power grid or a power mains) 181.

[0066] In embodiments, an automated intelligent shading charging assembly or system may comprise an interface assembly, a rechargeable apparatus (e.g., a rechargeable battery), a base assembly (not shown), a charging port and/or interface 183 for an electric vehicle, a lower support assembly 187, an upper support assembly 191, a hinging assembly and/or gearbox assembly 190, one or more arm support assemblies 193, one or more arms/blades 194, and/or a shading fabric 195. In embodiments, a lower support assembly 187 (and a coupled and/or connected hinging assembly 190, upper support assembly 191, one or more arm support assemblies 193, and/or arms/blades 194) may also rotate with respect to a housing and/or enclosure 182 around a vertical axis, as is illustrated by reference number 188 in Figure 1C. In embodiments, an upper support assembly 191 may rotate up to a right angle (e.g., 90 degrees) with respect to a lower support assembly 187 of the center via a gearbox or hinging mechanism 190. In embodiments, one or more arm support assemblies 193 may deploy and/or extend from an upper support assembly 191 to open an intelligent shading charging system 175. In embodiments, one or more detachable arms/blades 194 may be attached or coupled to one or more arm support assemblies 193. In embodiments, a detachable shading fabric 195 may be attached or coupled to one or more arms/blades 194.

[0067] In embodiments, a housing and/or enclosure 182 including a rechargeable battery 184, an electric vehicle charging port 183, a transceiver 179, and/or a charging interface may rest or be inserted into a ground surface in an outdoor environment. In embodiments, a ground surface may be a floor, a patio, grass, sand, cement, an outdoor plaza, a parking garage surface, or other outdoor environment surfaces. In embodiments, a rechargeable battery interface may be integrated into a ground surface and a rechargeable battery 184 (or an enclosure or housing including a rechargeable battery) may rest on a ground surface.

[0068] In embodiments, an intelligent shading charging system 175 may comprise a housing 182 and/or enclosure. In embodiments, a housing and/or enclosure 182 may comprise a rechargeable battery 184, a charging port 183, a wireless transceiver 179 and/or a base assembly. In embodiments, a rechargeable battery may be enclosed in a housing and/or enclosure 182. In embodiments, a base assembly may be enclosed in a housing and/or enclosure 182. In embodiments, a housing and/or enclosure 182 may be comprised of a cement, wood, metal, stainless steel, and/or hard plastic material.

[0069] In embodiments, a lower support assembly 187 may comprise one or more first lighting assemblies 199. In embodiments, one or more first light assemblies 199 may be integrated into a lower support assembly 187. In embodiments, one or more first light assemblies 199 may be connected to, adhered to, coupled to, and/or attached to a lower support assembly 187. In embodiments, one or more light assemblies 199 may direct light downward to a housing and/or enclosure 182 including a rechargeable battery 184 and/or a charging port 183, as well as an area surrounding an intelligent shading charging system 175. This allows an intelligent shading charging system to be utilized even at night or in a dark public environment without utilizing power from an electrical grid. In alternate embodiments, one or more first lighting assemblies 199 may be installed in an upper support assembly 191 and/or a shading fabric 195.

[0070] In embodiments, an intelligent shading charging system may comprise a second lighting subsystem 198. In embodiments, an intelligent shading charging system upper support assembly 191 may comprise a second lighting subsystem 198 integrated therein and/or installed and/or mounted thereon. In embodiments, a second lighting subsystem 198 may be integrated into an upper support assembly 191. In embodiments, a second lighting subsystem 198 may be connected to, adhered to, coupled to, and/or attached to an upper support assembly 191. In embodiments, a second lighting subsystem 198 may comprise a plurality of LED lights. In embodiments, a second lighting subsystem 198 may be integrated into and/or attached to arm support assemblies 193. In embodiments, a second lighting subsystem 198 may direct light in a downward manner directly towards or at a certain angle to a ground surface and/or where a charging electric vehicle is located. In embodiments, a second lighting subsystem 198 may direct light beams outward (e.g., in a horizontal direction) from an upper support assembly 191. In embodiments, for example, a second lighting subsystem 198 may direct light at a 90 degree angle from an upper support assembly 191 vertical axis. In embodiments, a second lighting subsystem 198 (e.g., one or more LED lights) may be installed in a swiveling assembly and the second lighting subsystem 198 may transmit and/or direct light (or light beams) at an angle of 5 to 185 degrees from an intelligent upper support vertical axis. In embodiments, one or more LED lights in a second lighting subsystem 198 may be directed to shine light in an upward direction (e.g., a more vertical direction) towards arms/blades 194 and/or a shading fabric 195 of an intelligent shading charging system. In embodiments, a bottom surface of a shading fabric 195, arms/blades 194 and/or arm support assemblies 193 may reflect light beams from one or more LED lights of a second lighting subsystem 198 back to a surrounding area of an intelligent shading charging system. In an embodiment, a shading fabric 195, arms/blades 194 and/or arm support assemblies 193 may have a reflective bottom surface to assist in reflecting light from the LED lights back to a shading area. In alternate embodiments, a second lighting subsystem 198 may be installed in or attached to a lower support assembly 187 and/or in a shading fabric 195.
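As a small illustration of the swivel range just mentioned (light directed at roughly 5 to 185 degrees from the upper support assembly's vertical axis), a clamping helper might look like this; the function name and behavior are assumptions for illustration only.

```python
def clamp_beam_angle(requested_deg: float, low: float = 5.0, high: float = 185.0) -> float:
    """Return the nearest beam angle the swiveling lighting subsystem 198 could achieve,
    measured from the upper support assembly's vertical axis."""
    return max(low, min(high, requested_deg))

assert clamp_beam_angle(90.0) == 90.0     # horizontal beam, within the swivel range
assert clamp_beam_angle(0.0) == 5.0       # straight down is outside the range; clamp to 5
assert clamp_beam_angle(200.0) == 185.0   # beyond the far limit; clamp to 185
```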

[0071] FIGURE 1D illustrates a power charging station 250 comprising one or more automated intelligent shading charging systems installed in an outdoor or indoor environment according to embodiments. In embodiments, a power charging station 250 may comprise one or more intelligent shading charging systems 260 (or electric vehicle supply equipment (EVSE)) and one or more reservoir batteries 262 connected, attached and/or coupled to a power supply system 265 (e.g., a power mains grid). In embodiments, one or more intelligent shading charging systems 260 may comprise a rechargeable apparatus 270 (e.g., a rechargeable battery), an intelligent shading charging assembly or system 275 and a solar power system 285 (e.g., a photovoltaic (PV) array or a solar power array). In embodiments, an intelligent shading charging assembly or system 275 may be portable and/or detachable from an enclosure and/or housing 182 including a rechargeable apparatus 270 (e.g., a rechargeable battery). In embodiments, an intelligent shading charging assembly or system 275 may be portable and/or detachable from a base assembly, which is coupled, connected, and/or attached in a housing 182, which may also include a rechargeable apparatus 270 (battery).

[0072] As shown in FIGURE 1D, an intelligent shading charging system 260 may be coupled, connected and/or interfaced with a power supply system 265, such as an electricity mains grid 265. In embodiments, a power supply company may transfer, transmit or communicate power to an electricity mains grid 265. In embodiments, an intelligent shading charging system 260 may include a car charging interface 280. In embodiments, an electric vehicle charging interface 280 may be coupled and/or connected to a vehicle battery (e.g., a vehicle rechargeable battery).

[0073] In embodiments, a plaza, a parking garage, an open-air parking lot, an outdoor sports complex, a mall parking lot, a store parking lot, a school, a university, and/or other large outdoor facilities may include an electric vehicle charging station 250 which comprises a plurality of electric vehicle charging systems 260. Figure 1 D illustrates a station with four electrical vehicle charging systems connected to one another. In embodiments, an electric vehicle charging system may be referred to as an EVSE (electric vehicle supply equipment) and also may be referred to as an intelligent shading charging system. In embodiments, a computing device or a plurality of computing devices may control operation of one or more intelligent shading charging systems at an electric vehicle charging station, such as a station at a parking lot at a shopping mall. In embodiments, the electric vehicle charging station also provides shade for electric vehicles, wireless communication capabilities (via wireless transceivers 179) in intelligent shading charging system, as well as interfaces to computing devices located in intelligent shading charging systems 260 and/or external computing devices. In
embodiments, for example, an operator of one or more intelligent shading charging systems 175 may charge users, electric vehicle users, or third parties for global communications network access (e.g., Internet usage access) as well as for electric vehicle charging. In outdoor environments, e.g., a shopping plaza, a parking lot, an outdoor sporting location or an outdoor event location, this may provide an additional revenue source. In addition, an operator and/or user may also charge for providing images, videos, and/or sounds to third parties. These capabilities, installed on shading objects, intelligent umbrellas, and intelligent shading charging systems, are a marked improvement on existing outdoor locations such as shopping mall parking lots, parking lots, outdoor sporting locations and event locations, which generally do not provide wireless communication capabilities and/or electric vehicle recharging capabilities.

[0074] In embodiments, an intelligent shading charging system 260, when offline (e.g., not providing power to an electric vehicle) may feed and/or transfer power to a power supply system, such as a mains power grid 265. In embodiments, an intelligent shading charging system may transfer up to 2, 4, 6 or 8 kilowatt hours of power back to a mains power grid. In embodiments, an electric vehicle charging station 250 may generate revenue by selling excess power back to the power company. In embodiments, current owners of parking lots, building plazas, athletic and/or event fields having EVSE have to pay a power company for power utilized to charge an electric vehicle (e.g., $100 a month / $1,200 a year, or $200 a month / $2,400 a year). However, because an intelligent shading charging system 260 obtains power from the sun (e.g., converts solar energy into solar power), recharging an electric vehicle's battery may cost an owner of an intelligent shading charging system 260 and/or station 250 nothing or only a minimal amount because the power is self-generated and there is little or no need to obtain power from a mains power grid 265. Thus, the intelligent shading charging system 260 (and/or power station 250) may be a revenue generator for an owner, which may be multiplied if an electric vehicle charging station owner has a plurality of intelligent shading charging systems at a location (any of the outdoor locations listed above).
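By way of a non-limiting illustration only, the short Python sketch below works through the kind of feed-in arithmetic described above; the buy-back rate, daily export figure and station size used here are assumed example values and are not part of this disclosure.

```python
# Non-limiting illustration only: rough monthly numbers for surplus solar power
# fed back to a mains grid by a small station. The tariff, daily export figure
# and station size are assumed example values, not figures from this disclosure.

FEED_IN_RATE_PER_KWH = 0.10   # assumed buy-back rate, $/kWh
EXPORT_KWH_PER_DAY = 6        # one of the 2/4/6/8 kWh figures mentioned above
SYSTEMS_AT_STATION = 4        # e.g., a station like the one shown in Fig. 1D
DAYS_PER_MONTH = 30

monthly_kwh = EXPORT_KWH_PER_DAY * SYSTEMS_AT_STATION * DAYS_PER_MONTH
monthly_revenue = monthly_kwh * FEED_IN_RATE_PER_KWH
print(f"{monthly_kwh} kWh exported per month -> about ${monthly_revenue:.0f} in feed-in revenue")
# Compare with the $100-$200 per month an owner might otherwise pay a power
# company for EVSE charging drawn from the grid, as noted above.
```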

[0075] In embodiments, an intelligent shading charging system may charge an electric vehicle in two, four and/or eight hours if an electric vehicle arrives with little or no charge/power in its rechargeable battery. In embodiments, if one intelligent shading charging system does not have enough power in its rechargeable battery 184 to charge an electric vehicle connected to its charging port 183, a rechargeable battery in another intelligent shading charging system 260 at the electric vehicle charging station 250 (such as the one illustrated in Figure 1 D) may provide power to the rechargeable battery in the initial intelligent shading charging system. In embodiments, in an electric vehicle charging station, one or more intelligent shading charging systems 260 (and thus one or more rechargeable batteries) may be connected in series with a capability of providing backup power for other intelligent shading charging systems to power electric vehicles connected to the intelligent shading charging systems. In embodiments, a reservoir battery (and/or reservoir charging assembly) 262 may be charged by and/or provide power to connected and/or coupled shading charging systems 260. In embodiments, a reservoir battery may be a rechargeable battery, a capacitor or similar rechargeable assemblies.

[0076] In embodiments, an intelligent shading charging system 260 may comprise a power conversion subsystem or power converter. In embodiments, a power conversion subsystem may receive power from a power supply system 265 and may output DC power to a rechargeable battery 270. In embodiments, a power conversion subsystem may comprise an AC-to-DC converter, a DC-to-DC converter and/or regulator and a digital control system. In embodiments, an AC-to-DC converter may convert AC power from an electrical grid to DC power. In embodiments, converted power from the AC-to-DC converter may be regulated by a DC-to-DC converter. The power output from the DC-to-DC converter may be transferred or transmitted to a rechargeable battery 270. In embodiments, a digital control system may control operations of a DC-to-DC converter and an AC-to-DC converter.
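By way of a non-limiting illustration only, the following Python sketch models the AC-to-DC / DC-to-DC conversion chain with a simple digital control loop trimming the regulator toward a battery charging setpoint; the mains voltage, setpoint and control gain are assumed example values, not a disclosed implementation.

```python
# Minimal sketch of the conversion chain of paragraph [0076]: rectification,
# regulation, and a toy digital control loop. All component values are assumed.

GRID_VRMS = 240.0          # assumed mains voltage
BATTERY_SETPOINT_V = 48.0  # assumed rechargeable-battery charging voltage

def ac_to_dc(v_rms: float) -> float:
    """Idealized rectifier: peak DC bus voltage derived from the AC mains."""
    return v_rms * 2 ** 0.5

def dc_to_dc(v_bus: float, duty: float) -> float:
    """Idealized buck regulator: output follows duty cycle times bus voltage."""
    return v_bus * duty

def digital_control(v_out: float, duty: float, gain: float = 0.001) -> float:
    """Digital control system: nudge the duty cycle toward the battery setpoint."""
    error = BATTERY_SETPOINT_V - v_out
    return min(max(duty + gain * error, 0.0), 1.0)

duty = 0.1
v_bus = ac_to_dc(GRID_VRMS)
for _ in range(200):                      # a few control iterations
    duty = digital_control(dc_to_dc(v_bus, duty), duty)
print(f"DC bus {v_bus:.1f} V -> regulated output {dc_to_dc(v_bus, duty):.1f} V")
```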

[0077] FIGURE 3B illustrates a housing and/or enclosure of an intelligent shading charging system according to embodiments. . In embodiments, a housing and/or enclosure 350 may comprise a base assembly 351 , a power supply system interface 355, a rechargeable apparatus (e.g. rechargeable battery) 365, a power converter 360, and/or a vehicle charging port 370. In embodiments, a rechargeable battery 365 may be coupled and/or electrically connected to a recharging port 370 via or utilizing a connector cable 371 or other electrically conductive wire and/or cable apparatus. In embodiments, a converter may also be coupled and/or electrically connected to a recharging port 370 to provide power (from a power supply system such as mains power 265 (see Fig. 1 D) to an electric and/or hybrid vehicle if a rechargeable battery 365 does not supply enough power (e.g. , a solar array has not charged a rechargeable battery 365 to a sufficient level). In embodiments, a power supply system 357 may provide additional power to a rechargeable battery 365. In embodiments, a power supply system 357 may be coupled and/or connected to a rechargeable battery 365 via a power interface 355 and/or a converter 360. In embodiments, a rechargeable battery 365 may provide, transfer or deliver excess power back to a power supply system 357. In
embodiments, a rechargeable battery 365 may provide power to assemblies, components, and/or other assemblies in an intelligent shading charging system 160 (in some embodiments, e.g. , via a power charging assembly 378). In
embodiments, a power supply system 357 may provide power to assemblies, components, and/or other assemblies in an intelligent shading charging system. In embodiments, a solar power array 377 may provide power to assemblies, components, and/or other assemblies in an intelligent shading charging system. In embodiments, a solar power array 377 may provide power to a rechargeable battery 365 via a power charging assembly 378 in an intelligent shading charging system.

[0078] In embodiments, a base assembly 351 may have an opening (e.g. , a circular or oval or other shaped opening) into which a lower support assembly 187 (or a shaft of a lower support assembly 187) may be placed, inserted, and/or connected. In embodiments, a base assembly 351 may be stationary and a lower support assembly 187 (and a remainder of an intelligent shading charging system) may rotate about a base assembly 351 .

[0079] In embodiments, a housing and/or enclosure 350 may comprise and/or be made of a metal. In embodiments, a housing and/or enclosure 350 may comprise a lightweight metal, e.g., aluminum, although other metals may be utilized based on degradation in the environment where the shading object is located. In
embodiments, a housing and/or enclosure 350 may be made of a composite material, a wood material, a cement material, or a plastic material, or a combination thereof. In embodiments, a housing and/or enclosure 350 may comprise a biodegradable material so that when a base assembly is thrown away and placed in a landfill or buried, the biodegradable material may decompose over time.

[0080] Figure 4A illustrates a block diagram of a center support assembly motor control according to embodiments. A center support assembly 107 further comprises a second motor assembly, where a second motor assembly comprises a second motor controller 410 (which may or may not be installed on a shading object movement control PCB) and a second motor 412. In embodiments, a second controller 410 may receive commands, signals, and/or instructions from a shading object movement control PCB 895 (see Figures 8A and 8B), and/or a computing device, to control operation of a second motor 412. In embodiments, a second controller 410 may generate commands and/or signals causing a second motor 412 to drive its output shaft and engage a gearbox assembly 414 (or linear actuator), which in turn causes an upper assembly 1 12 of a center support assembly 107 to rotate and/or move with respect to a lower assembly 1 13. In embodiments, a second motor or actuator 412 may drive a hinging assembly 414 to move with respect to a stem assembly 106. In embodiments, an upper assembly 1 12 may move up to 90 degrees (or at a right angle) with respect to a lower assembly 1 13. In embodiments, a second motor or actuator 412 may comprise a pneumatic motor. In other embodiments, a second motor or actuator 412 may comprise a servo motor and/or a stepper motor. In embodiments, an encoder may be utilized for feedback of position information to a second motor controller or driver 410. In embodiments, an upper assembly 1 12 of a center support assembly 107 may further comprise and/or house an arm support assembly 108. In embodiments, arms and/or blades 109 may be connected, coupled and/or attached to a center support assembly 107.

[0081] Figure 4B illustrates a lower support motor assembly according to embodiments. In embodiments, such as illustrated in Figure 1C, a lower assembly 187 may comprise a first motor assembly and/or a second motor assembly. In embodiments, a first motor assembly may comprise a first motor 212, a gear box assembly and/or hinging assembly 213, and/or a first motor controller 214. In embodiments, a second motor assembly may comprise a second motor 412, a second motor driver controller 410, an upper assembly 191, and a gearbox assembly 414. In embodiments, a first motor controller 214 and/or a second motor controller 410 may also be referred to as motor drivers, and within this specification the terms "motor driver" and "motor controller" may be used interchangeably. Although descriptions herein may describe a first motor controller 214 and a second motor controller 410, one motor controller may be able to perform functions and/or operations of both a first motor controller 214 and a second motor controller 410. In addition, one motor may perform functions and/or operations of both a first motor 212 and a second motor 412.

[0082] In embodiments, a first motor controller 214 may receive commands, instructions, messages and/or signals requesting movement of a shading charging system around an azimuth axis. In embodiments, a lower support assembly 187 may rotate about a vertical axis (e.g. , may rotate between 0 and 360 degrees about a vertical axis). Reference number 188 (Fig. 1 C) illustrates a rotation of a lower support assembly 187 about a vertical axis according to embodiments. In embodiments, a lower support assembly 1 87 may rotate 360 degrees about a vertical axis with respect to a housing and/or enclosure 182. In embodiments, a lower support assembly 187 may rotate up to 270 degrees and/or 180 degrees about a vertical axis with respect to a housing and/or enclosure 182. In
embodiments, a lower assembly 187 may have its rotation limited by detents, stops and/or limiters in an opening of a housing and/or assembly 182 (e.g., an opening in a base assembly). In embodiments, an encoder 218 may provide location and/or position feedback of a lower support assembly 187 to a first motor controller 214. In embodiments, encoders may be utilized in motor systems in order to feed back position and/or distance information to motor controllers and/or motors to verify a correct position has been reached or moved to.
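By way of a non-limiting illustration only, the following Python sketch shows one way encoder feedback and a software rotation limit could be combined to position the lower support assembly about the vertical axis; the counts-per-degree scaling and the 270 degree limit are assumed example values, and motor hardware is replaced with a toy model.

```python
# Minimal sketch of closed-loop azimuth positioning with encoder feedback and a
# detent/stop-limited rotation range; values and the axis model are assumptions.

COUNTS_PER_DEGREE = 10          # assumed encoder resolution
ROTATION_LIMIT_DEG = 270        # e.g., an enclosure limited by stops

class AzimuthAxis:
    """Toy model of the lower support assembly rotating within the base opening."""
    def __init__(self):
        self.encoder_counts = 0   # position feedback, as an encoder would report

    def step(self, direction: int):
        self.encoder_counts += direction  # one count per motor step in this model

def rotate_to(axis: AzimuthAxis, target_deg: float) -> float:
    # Clamp the request to the stop-limited range of the base opening.
    target_deg = max(0.0, min(target_deg, ROTATION_LIMIT_DEG))
    target_counts = round(target_deg * COUNTS_PER_DEGREE)
    # Drive the motor until the encoder confirms the commanded position.
    while axis.encoder_counts != target_counts:
        axis.step(1 if axis.encoder_counts < target_counts else -1)
    return axis.encoder_counts / COUNTS_PER_DEGREE

axis = AzimuthAxis()
print("reached", rotate_to(axis, 200.0), "degrees")   # within limits
print("reached", rotate_to(axis, 330.0), "degrees")   # clamped to 270
```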

[0083] In embodiments, a first motor controller 214 may communicate
commands, messages and/or signals to a first motor 212 to cause movement of a shading charging system (e.g. , a lower support assembly 1 87, an upper support assembly 199, one or more shading arm supports 193, and/or one or more arms/blades 194) about an azimuth axis. In embodiments, for example, an enclosure and/or housing 182 may remain stationary while a shading charging system rotates within a base assembly within a housing and/or enclosure 182. In other words, in an illustrative embodiment, a shading charging system may be placed in an opening of a base assembly of an enclosure and/or housing 182 and may rotate while the base assembly (and thus the enclosure and/or housing 182) remains stationary.

[0084] In embodiments, a first motor 212 may be coupled to a gearbox assembly 213. In embodiments, a gearbox assembly 213 may comprise a planetary gearbox assembly. In embodiments, a planetary gearbox assembly may comprise a central sun gear, a planet carrier with one or more planet gears and an annulus (or outer ring). In embodiments, planet gears may mesh with a sun gear while outer ring teeth may mesh with planet gears. In embodiments, a planetary gearbox assembly may comprise a sun gear as an input, an annulus as an output and a planet carrier (one or more planet gears) remaining stationary. In embodiments, an input shaft may rotate a sun gear, planet gears may rotate on their own axes, and may simultaneously apply a torque to a rotating planet carrier that applies torque to an output shaft (which in this case is the annulus). In embodiments, a planetary gearbox assembly and a first motor 212 may be connected, attached and/or adhered to a lower support assembly 187. In embodiments, an output shaft from a gearbox assembly 213 may be connected and/or coupled to a housing and/or enclosure 182 (e.g., a base assembly (e.g., an opening of a base assembly)). In embodiments, because a housing and/or enclosure 182 is stationary (e.g., a base assembly may be stationary), torque on an output shaft of a gearbox assembly 213 may be initiated by a first motor 212 to cause a lower support assembly 187 to rotate. In embodiments, other gearbox assemblies and/or hinging assemblies may also utilize an output of a motor to cause a lower support assembly (and hence an intelligent shading charging system) to rotate within an enclosure and/or housing 182 (e.g., a base assembly). In embodiments, a first motor 212 may comprise a pneumatic motor. In other embodiments, a first motor 212 may comprise a servo motor and/or a stepper motor. In embodiments, as is illustrated in Fig. 1C, a lower support assembly 187 may comprise a second gearbox assembly (or a linear actuator or hinging assembly) 190 or 414, a second motor 412, and/or a second motor controller 410. In embodiments, a second motor assembly may comprise a second motor controller 410 and a second motor 412, and possibly a second gearbox assembly or linear actuator 414. In embodiments, a lower assembly 187 may also comprise a motor control PCB which may have a second motor controller 410 mounted and/or installed thereon. In embodiments, an upper support assembly 191 may be coupled or connected to a lower assembly 187 via a second gearbox assembly or linear actuator 190 or 414. In embodiments, a second gearbox assembly and/or linear actuator 414, and a second motor 412 connected thereto, may be connected to a lower assembly 187. In embodiments, an output shaft of a second gearbox assembly or linear actuator 414 may be connected to an upper support assembly 191. In embodiments, as a second motor 412 operates and/or rotates, a second gearbox assembly 190 or 414 rotates an output shaft which causes an upper assembly 191 to rotate (either upwards or downwards) at a right angle from, or with respect to, a lower assembly 187. In embodiments utilizing a linear actuator as a hinging assembly 190, a steel rod may be coupled to an upper support assembly 191 and/or a lower assembly 187, which causes a free hinging between an upper support assembly 191 and a lower support assembly 187. In embodiments, a linear actuator 190 or 414 may be coupled, connected, and/or attached to an upper support assembly 191 and/or a lower support assembly 187.
In embodiments, as a second motor 412 operates and/or rotates a steel rod, an upper support assembly 191 moves in an upward or downward direction with respect to a hinged connection (or hinging assembly) 190 or 414. In embodiments, a direction of movement is illustrated by reference number 192 in Fig. 1 C. In embodiments, a direction of movement may be limited to approximately a right angle (e.g. , approximately 90 degrees). In embodiments, an upper assembly 191 may move from a position where it is a vertical extension of a lower assembly 187 (e.g. , forming a vertical support assembly) to a position wherein an upper assembly 191 is at a right angle from a lower assembly 187 (and also approximately parallel to a ground surface). In embodiments, movement may be limited by a right angle gearbox or right angle gearbox assembly 190 and/or 414. In embodiments, an upper assembly 191 and a lower assembly 187 may be perpendicular to a ground surface in one position, but may move (as is shown by reference number 192) to track the sun (depending on location and time of day) so that an upper support assembly 192 moves from a perpendicular position with respect to a ground surface to an angular position with respect to a ground surface and an angular position with respect to a lower support assembly 187.
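By way of a non-limiting illustration only, the following Python sketch works through the speed and torque relation of a planetary stage driven at the sun gear with the planet carrier held and the annulus as the output; the tooth counts, motor speed and torque are assumed example values.

```python
# Worked example of the planetary stage of paragraph [0084]: sun input, carrier
# fixed, annulus (ring) output. All numbers below are illustrative assumptions.

SUN_TEETH = 12
RING_TEETH = 72          # annulus
MOTOR_SPEED_RPM = 3000.0 # first motor, assumed
MOTOR_TORQUE_NM = 0.4    # assumed

# With the carrier fixed, the ring turns (sun/ring) times the sun speed, in the
# opposite direction; torque scales by the inverse ratio (losses ignored).
reduction = RING_TEETH / SUN_TEETH
output_speed_rpm = -MOTOR_SPEED_RPM / reduction
output_torque_nm = MOTOR_TORQUE_NM * reduction

print(f"reduction ratio : {reduction:.0f}:1")
print(f"output speed    : {output_speed_rpm:.0f} rpm (sign = reversed direction)")
print(f"output torque   : {output_torque_nm:.1f} N*m (ideal, no losses)")
```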

[0085] In embodiments, a lower support assembly 187 may comprise a second motor assembly. In embodiments, a second motor assembly comprises a second motor controller 410 (which may or may not be installed on a shading object movement control PCB) and a second motor 412. In embodiments, a second controller 410 may receive commands, signals, messages, and/or instructions from a shading object movement controller (which may or may not be located on a shading object movement PCB 895 (see Figures 8A and 8B), and/or a computing device, to control operation of a second motor 412. In embodiments, a second controller 410 may generate commands, messages and/or signals causing a second motor 412 to drive its output shaft and engage a gearbox assembly 414 (or linear actuator), which in turn causes an upper assembly 191 to rotate and/or move with respect to a lower assembly 187. In embodiments, a second motor or actuator 412 may drive a hinging assembly 190 or 414 to move with respect to a lower support assembly 187. In embodiments, an upper assembly 191 may move up to 90 degrees (or at a right angle) with respect to a lower assembly 187. In embodiments, a second motor or actuator 412 may comprise a pneumatic motor. In other embodiments, a second motor or actuator 412 may comprise a servo motor and/or a stepper motor. In embodiments, an encoder and/or other feedback assembly may be utilized for feedback of position information to a second motor controller or driver 410.

[0086] In embodiments, such as shading objects and/or intelligent umbrellas illustrated in Figures 1 A and 1 B, an upper assembly 1 12 of a center support assembly 107 may further comprise and/or house an arm support assembly 108. In embodiments, arms and/or blades 109 may be connected, coupled and/or attached to a center support assembly 107. In embodiments, such as
embodiments of an intelligent shading charging system illustrated in Fig. 1 D, an upper support assembly 191 may further comprise and/or house an arm support assembly 193 and in some cases arms/blades 194.

[0087] Figure 5A illustrates a block diagram of shading object actuator or deployment assembly according to embodiments. In embodiments, an upper assembly 1 12 of a center support assembly 107 may further comprise a third motor assembly, an arm driving assembly, an arm support assembly 108 and/or arms/blades 109. In embodiments, a third motor assembly may comprise a third motor controller 605, a third motor 610, and an arm driving assembly 615. Figure 6A illustrates a shading object with arm support assemblies 108 and arms/blades in an open position 171 and a closed position 172. Figure 6B illustrates an intelligent shading charging system with arm support assemblies 193 and arms/blades 194 in an open position 171 and a closed position 172.

[0088] In embodiments, some of these assemblies may be housed in a lower assembly 1 13. In embodiments, a third motor controller 605, a third motor 610, an arm supporting assembly 108 and/or arm/blade 109 may be housed and/or positioned in an upper assembly 1 12, which may be positioned above a lower assembly 1 13 of a center support assembly 107. In embodiments, a third motor 610 may comprise a stator, rotor and shaft / spindle. In embodiments, a shaft / spindle may be coupled and/or connected to a rod (e.g. , a threaded rod). In embodiments, an arm driving assembly 615 may comprise at least a threaded rod and a collar. In embodiments, a threaded rod may be coupled and/or connected to a collar, and a collar may have a number of slots to which linked arm supports (e.g. , arm support assembly 108) are linked, coupled or connected. In embodiments, a linear actuator may be coupled in between a third motor controller 605 and arm supporting assembly 108.
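By way of a non-limiting illustration only, the following Python sketch shows how rotation of a threaded rod could translate into vertical travel of a collar driving the linked arm supports; the thread pitch, gearing ratio and deployment stroke are assumed example values, not figures from this disclosure.

```python
# Illustrative sketch of the threaded-rod / collar arm driving assembly of
# paragraph [0088]. All dimensional values below are assumptions.

THREAD_PITCH_MM = 2.0       # collar travel per rod revolution, assumed
GEARING_RATIO = 5.0         # motor turns per rod turn, assumed
DEPLOY_STROKE_MM = 120.0    # collar travel for full arm deployment, assumed

def collar_travel_mm(motor_revolutions: float) -> float:
    """Vertical collar travel produced by a number of third-motor revolutions."""
    rod_revolutions = motor_revolutions / GEARING_RATIO
    return rod_revolutions * THREAD_PITCH_MM

def motor_revs_for_full_deploy() -> float:
    """Motor revolutions required to move the collar through the full stroke."""
    return DEPLOY_STROKE_MM / THREAD_PITCH_MM * GEARING_RATIO

print(f"100 motor revs move the collar {collar_travel_mm(100):.0f} mm")
print(f"Full deployment needs {motor_revs_for_full_deploy():.0f} motor revs")
```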

[0089] Figure 5B illustrates a block diagram of a shading object actuator or deployment assembly according to embodiments. In embodiments, such as illustrated in Fig. 1C, an upper support assembly 191 of an intelligent shading charging system may further comprise a third motor assembly, one or more arm support assemblies 193 and/or one or more arms/blades 194. In embodiments, a third motor assembly may comprise a third motor controller 605, a third motor 610, and an arm driving assembly 615. In alternative embodiments, some of these assemblies may be housed in a lower support assembly 187. In embodiments, a third motor controller 605, a third motor 610, one or more arm supporting
assemblies 193 and/or one or more arm/blades 194 may be housed and/or positioned in an upper support assembly 191 , which may be positioned above a lower assembly 187 of an intelligent shading charging system. In embodiments, a third motor 610 may comprise a stator, rotor and shaft / spindle. In embodiments, a shaft / spindle may be coupled and/or connected to a rod (e.g. , a threaded rod). In embodiments, an arm driving assembly 615 may comprise at least a threaded rod and a collar. In embodiments, a threaded rod may be coupled and/or connected to a collar, and a collar may have a number of slots to which linked arm supports (e.g. , arm support assembly 193) are linked, coupled or connected. In embodiments, a linear actuator may be coupled in between a third motor controller 605 and arm supporting assembly 193.

[0090] Figure 7 illustrates assemblies to deploy arms and/or blades according to embodiments. In embodiments, arm supports and/or arm supporting assemblies 108 may be coupled to arms, blades or other similar devices. In embodiments, arms, blades, and/or other similar devices 109 may be detachably attached to an arm driving assembly. A shading fabric may be detachably attached to arms/blades 109 and/or arm supporting assembly 108. In an embodiment illustrated in Fig. 7, a shading object and/or umbrella may comprise an upper assembly 112 of a center support assembly 107, one or more arm support assemblies 108, one or more arms/blades 109, one or more solar panels (not shown), and a shading fabric cover 705. In embodiments, a shading fabric cover 705 may include fabric fused with flexible solar cells. In embodiments, arm support assemblies 108 (or articulating blades) may be in a rest position inside an upper assembly 112 of a center support assembly 107 (e.g., a recess or a recessed channel in an upper assembly 112). As is illustrated in Fig. 7, arm support assemblies 108 may be connected to a top end 710 (or upper section) of an upper assembly 112 of a center support assembly 107. In embodiments, arms/blades 109 may be attached, adhered, fastened, and/or connected to arm support assemblies 108 (e.g., articulation blades). As illustrated in Fig. 7, arm support assemblies 108 and/or arms/blades 109 may have holes or openings and a fastener or connector may be used to attach and/or fasten the arm support assemblies 108 to the arms / blades 109. In embodiments, arm support assemblies 108 and/or arms/blades 109 may be adhered together, fastened together, welded together, or may be snapped together to fit. In embodiments, a fabric cover 715 may be connected, adhered, and/or fastened to arms / blades 109. In embodiments, a fabric cover 715 may include integrated flexible solar panels. In embodiments, integrated flexible solar panels may be placed in a weather proof housing 735 in a fabric cover 715 and/or in a weather proof housing in arms / blades 109. In embodiments, when arm support assemblies are deployed to an open position, the arm support assemblies 108 may move in a direction as shown by reference numbers 720 and 725. In other words, arm support assemblies 108 (and thus the attached arms / blades 109) move at an angle with respect to a vertical axis coming out of a center support assembly 107. In embodiments, as illustrated in Fig. 7, a vertical axis is represented by reference number 730.

[0091] The following discussions apply with equal force to embodiments illustrated in Fig. 6A (shading objects and/or intelligent umbrellas) and Fig. 6B (intelligent shading charging system). Reference numbers for both embodiments may be disclosed below. In embodiments, a third motor controller (or motor driver) 605 may receive commands, signals, and/or instructions from a shading object movement control PCB 895 (and/or a shading object computing device 860) to control operation of a third motor 610. In embodiments, a third motor controller 605 may generate commands and/or signals causing a third motor 610 to drive its output shaft and engage a gearbox assembly 615 (or linear actuator or gearing assembly), which in turn causes movement of linked arm support assemblies 108 (Figs. 1A, 1B, 6A) and arm support assemblies 193 (Figs. 1C and 6B). In embodiments, a linear actuator may incorporate a motor. In embodiments, a potentiometer may act as a control device to limit how much arm support
assemblies 108 (or 193) deploy or move away from an upper assembly 112 of a center support assembly 107 (Figs. 1A and 1B) (or arm support assemblies 193 move away from an upper support assembly 191). In embodiments, for example, a potentiometer may feed back how far and/or how much arm support assemblies 108 (or arm support assemblies 193) have been deployed and/or may stop deployment or movement when a predefined distance is reached (or when a full deployment distance is reached). In embodiments, a third motor 610 may comprise a pneumatic motor. In other embodiments, a third motor 610 may comprise a servo motor and/or a stepper motor.
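By way of a non-limiting illustration only, the following Python sketch shows how potentiometer feedback could be compared against a predefined deployment limit to stop the third motor; the voltage scaling and the stop threshold are assumed example values.

```python
# Minimal sketch of the potentiometer-limited deployment of paragraph [0091].
# The voltage-to-travel mapping and threshold are assumptions; motor commands
# are represented only by printed decisions.

FULL_DEPLOY_VOLTS = 3.3        # pot reading at full extension, assumed
TARGET_DEPLOY_FRACTION = 0.85  # stop at a predefined partial deployment

def should_stop(pot_volts: float) -> bool:
    """Compare potentiometer feedback against the predefined deployment limit."""
    deployed_fraction = pot_volts / FULL_DEPLOY_VOLTS
    return deployed_fraction >= TARGET_DEPLOY_FRACTION

# Simulated readings as the arm support assemblies move outward.
for reading in (0.5, 1.5, 2.4, 2.85, 3.0):
    if should_stop(reading):
        print(f"{reading:.2f} V -> stop third motor, limit reached")
        break
    print(f"{reading:.2f} V -> keep deploying")
```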

[0092] In embodiments, a rotation of a shaft/spindle of a third motor 610 may be transmitted to a threaded rod. In embodiments, a rotation may be transmitted according to a gearing ratio in order to reduce speed transferred to a threaded rod. In embodiments, a gearbox assembly (or linear actuator) may be located between an output shaft of a third motor 610 and a threaded rod and may establish a gearing ratio in order to generate necessary torque. In embodiments, a threaded rod is rotated about its own axis and is coupled to a collar via a nut or other fastener. In embodiments, a threaded rod's rotation may cause a collar (or vertical movement assembly) to rotate and/or move in a vertical direction (e.g., in an up or down direction). A movement of a collar in a vertical direction may cause one or more linked arm support assemblies 108 (Figs. 1A and 1B) or 193 (Fig. 1C) to be moved outward from a shaft of a center support assembly 107 (Figs. 1A and 1B) or upper support assembly 191 (Fig. 1C) in order to deploy one or more linked arm support assemblies 108 or 193 in an open position (extended outwards and away from an upper assembly 112 or upper support assembly 191). Once one or more linked arm supports are extended and/or deployed, a shading fabric may be attached to arms/blades 109 (Figs. 1A and 1B) and 194 (Fig. 1C) and/or linked arm supports 108 or 193 of the shading system (or the shading charging system). In
embodiments, a shading fabric 715 (Figs. 1A and 1B) or 195 (Fig. 1C) may be connected to arms/blades 109 or 194 and/or linked arm supports (support assemblies 108 (Figs. 1A and 1B) and 193 (Fig. 1C)) before linked arm supports are deployed and/or extended. In this illustrative embodiment, deployment of one or more linked arm assemblies 108 or 193 results in stretching of a shading fabric 715 or 195 between one or more arms/blades 109 or 194 and/or linked arm support assemblies 108 or 193. In embodiments, a shading object and/or a shading charging system may then be ready for use to provide shade for a user in an outdoor environment.

[0093] Again, the following descriptions apply with equal force to intelligent shading objects and/or umbrellas (Figs. 1A and 1B) and intelligent shading charging systems (Fig. 1C). Similarly, if a shading object is to be placed in a rest or closed position, in embodiments, a third motor 610 output shaft may be rotated in an opposite direction. In embodiments, a threaded rod may be rotated in an opposite direction (e.g., counterclockwise), and a collar (or vertical movement assembly), which may be coupled and/or connected to the threaded rod, may move in a downward direction. One or more linked arm support assemblies 108 (Figs. 1A and 1B) or 193 (Fig. 1C), which are linked, coupled, and/or connected to the collar on one end and blades/arms 109 (Figs. 1A and 1B) and 194 (Fig. 1C) on another end, will drop and/or move from an extended position to an at-rest position. In embodiments, a shading object or umbrella may have a plurality of vertical movement assemblies (e.g., collars) linked separately to a plurality of arm support assemblies. In embodiments, linked arm supports may not need to be extended and/or moved together. In embodiments, a separate motor may be connected to one or more linear actuators, which in turn may be linked to a corresponding arm support assembly, so that each of the corresponding arm support assemblies may be moved independently.

[0094] In embodiments, the linked arm supports (or support assemblies) 108 (Figs. 1A and 1B) or 193 (Fig. 1C) may come to rest inside a center support assembly 107 of a shading object or an upper support assembly 191 of an intelligent shading charging system. In other words, the linked arm supports or linked arm support assemblies 108 (Figs. 1A and 1B) or 193 (Fig. 1C) may contract or come to rest into and/or inside channels of a center support assembly 107 of the shading object or an upper support assembly 191 of an intelligent shading charging system. In embodiments, one or more linked arm support assemblies 108 or 193 may rest or be housed in channels of an upper assembly 112 of a center support assembly 107 (or upper support assembly 191), which provides for convenient storage of arm support assemblies. This is more aesthetically pleasing and/or allows for easier movement and/or carrying of the shading object, intelligent umbrella, and/or intelligent shading charging system. In embodiments, a shading object, intelligent umbrella, and/or intelligent shading charging system may comprise between one and 20 linked arm support assemblies 108 and/or arms/blades 109 (or arm support assemblies 193 and/or arms/blades 194).

[0095] Again, the following descriptions apply with equal force to intelligent shading objects and/or umbrellas (Figs. 1A and 1B) and intelligent shading charging systems (Fig. 1C). In embodiments, a shading object central support may comprise one or more shading object arm support assemblies 108 (Figs. 1A and 1B) or 193 (Fig. 1C) that are deployed outward from a shading object central support 107 or an upper support assembly 191 (Fig. 1C). In embodiments, shading object blades/arms 109 (Figs. 1A and 1B) or 194 (Fig. 1C) may be attached and/or connected, respectively, to a shading object central support arm assembly 107 or upper support assembly 191. In embodiments, shading object arms/blades 109 or 194 may be detachably connected to a shading object arm support assembly 108 or 193. In embodiments, a detachable coupling may occur after shading object arm support assemblies 108 or 193 have been opened and/or deployed. In embodiments, shading object arms or blades 109 or 194 may be coupled and/or connected to shading arm support assemblies 108 or 193 and rest inside a shading object central support assembly 107 or 191 until deployment.

[0096] In embodiments, shading object arms/blades may comprise any shape (e.g. , rectangular, triangular, circular). In embodiments, shading object
arms/blades may have openings in a shape (e.g., a rectangle and/or triangle) rather than being a solid piece of material. In embodiments, shading arms/blades may form a frame onto which a shading object is attached. In embodiments, a shading object central support may comprise between 1-10 shading object arm supports and 1-20 shading object blades and/or arms. In embodiments, one or more shading object arms may comprise fasteners and/or connectors. In embodiments, a shading fabric may be connected and/or attached to shading arm connectors and/or fasteners. In embodiments, a shading fabric 715 (Figs. 1A or 1B) or 195 (Fig. 1C) may be connected and/or attached to shading arms after deployment of shading arm supports. In embodiments, a shading fabric 715 or 195 may be connected and/or attached to shading arms before shading arms are connected to the shading arm support assemblies.

[0097] Figures 8A and 8B illustrate a block diagram of hardware and/or combined hardware/software assemblies in a shading object according to embodiments. In embodiments, a shading object or umbrella may not have one, more than one, or all of the listed shading object components and/or assemblies. In embodiments, as is illustrated in Figures 8A and 8B, a shading object, intelligent umbrella, and/or intelligent shading charging system may comprise a telemetry printed circuit board (PCB) comprising a processor 805, a weather variable PCB comprising a processor 810, a voice recognition PCB and/or engine 815, a rechargeable battery 820, and one or more solar panels and/or solar panel arrays 825. In embodiments, a shading object, umbrella and/or shading charging system may comprise a power tracking solar charger 830, a power input or power source (e.g., AC adapter assembly) 835, a lighting assembly 870, an audio system 875 and/or a shading object computing device 860. In embodiments, a shading object, umbrella and/or shading charging system may include an obstacle detection module 850, a motion sensor 845, a proximity sensor 840, a tilt sensor 855, a personal area network communications module or transceiver 865, a first motor controller and motor (azimuth motor and controller) 880, a second motor controller and motor (elevation motor and controller) 885, and a third motor controller and motor (an actuator motor and controller) 890. In embodiments, a weather variable PCB 810 may be coupled and/or connected to one or more air quality sensors 811, UV radiation sensors 812, a digital barometer sensor 813, a temperature sensor 814, a humidity sensor 816, and/or a wind speed sensor 817. In embodiments, a wind sensor 817 may be a thermistor. In embodiments, a telemetry PCB 805 may be coupled and/or connected to a GPS/GNSS sensor 807 and/or a digital compass 808. Although at times a shading object, intelligent umbrella and/or shading charging system may singularly be mentioned, the disclosure herein may be implemented in any of the above-mentioned devices and/or apparatus.

[0098] In embodiments, a shading object, intelligent umbrella and/or shading charging system may comprise a telemetry printed circuit board (PCB) comprising a processor 805 and the telemetry PCB may provide potential location and
orientation information. In embodiments, a weather variable PCB comprising a processor 810 may provide sensed weather variables surrounding a shading object and/or umbrella. In embodiments, a wind sensor 817 may detect high wind conditions and generate signals, and an umbrella movement control PCB 895 may generate signals and/or commands causing arm support assemblies to close or move to a closed position. In embodiments, a voice recognition engine or module 815 may enable control of a shading object via voice commands and/or a microphone. In embodiments, a voice recognition engine or module 815 may generate voice responses and/or audible warnings. In embodiments, a
rechargeable battery 820 may be charged or powered by an AC adapter, a DC adapter, and/or an array of solar cells 825, which provide power (e.g. , current and/or voltage) to a power tracking solar charger 830 and other assemblies and/or components. In embodiments, a power tracking solar charger 830 may regulate and balance a charging process. In addition, a power tracking solar charger 830 may provide data and/or information regarding a charging state. In embodiments, an AC adapter 835 and/or a DC adapter may plug into a power source (e.g. , a wall outlet and/or a generator).

[0099] In embodiments, a proximity sensor 840 may identify location of a person relative to moving components of a shading object or umbrella. In embodiments, a motion sensor 845 may detect and/or identify a presence of an individual in an area around a shading object or umbrella. In embodiments, an obstacle detector 850 may detect presence of a person and/or object in a shading object's path of travel. In embodiments, a tilt sensor 855 may detect movement and/or relocation of a shading object and/or umbrella with respect to a correct position. In embodiments, a personal area network (PAN) (e.g. , Bluetooth) module 865 and/or transceiver may provide short distance communication for application based control, audio sound transmission and/or data processing and/or retrieval. In embodiments, a lighting assembly 870 may provide and/or project light for a shading object and/or an area around a shading object. In embodiments, an audio system 875 may provide or generate audio playback through a mobile application of WiFi stream or through a PAN (e.g., Bluetooth) transceiver 865.

[00100] In embodiments, a shading object, umbrella and/or shading charging system may comprise one or more printed circuit boards. Although a description may reference a specific printed circuit board, many of features or functions of a shading object or umbrella may be implemented utilizing components mounted on a single, two or three circuit boards. In addition, one or more components may be mounted on printed circuit boards, which results in a large number of circuit boards within a shading object or umbrella. In other words, a number of circuit boards may be utilized to provide features and/or functions of a shading object and/or umbrella although embodiments described herein may only describe a specific number.

Although the term "circuit board" or "printed circuit board" is utilized, any electronic device allowing installation of and communication with components may be utilized along with a circuit board. As used in this specification, the terms "printed circuit board" and "PCB" are intended to refer generally to any structure used to
mechanically support and electrically connect electronic components using conductive pathways, tracks, or signal traces etched from (e.g. , copper) sheets laminated onto a non-conductive substrate. Synonyms for printed circuit boards include printed wiring boards and etched wiring boards.

[00101] In embodiments, a shading object and/or umbrella may comprise one or more printed circuit boards. In embodiments, a shading object or umbrella 800 may comprise a movement control PCB 895, a shading object computing device or computing device PCB 860, a first motor PCB (azimuth control) 880, a second motor PCB (elevation control) 885, a third motor PCB (actuation / deployment control) 890, a telemetry PCB (location and orientation data/information collection) 805, and/or a weather variable PCB (environmental sensor data/information collection) 810.

[00102] In embodiments, a telemetry PCB 805 comprises a processor, a memory, a GPS receiver and/or transceiver and/or a compass (e.g., a digital compass). The GPS receiver and/or compass provides location and orientation information and/or measurements which may be transferred to a memory utilizing a processor. In embodiments, a telemetry PCB processes and conditions the communicated information and/or measurements. In embodiments, a telemetry PCB 805 communicates measurements and/or additional information (e.g., in some cases, measurements are conditioned and processed and in some cases, measurements are raw data) to a shading object movement control PCB 895, which analyzes the received location and/or orientation information and measurements.

[00103] In embodiments, a weather variable PCB 810 comprises a processor, a memory, an air quality sensor, a UV radiation sensor, a barometer, a temperature sensor, a humidity sensor, and/or a wind speed sensor. One or more of the listed sensors may generate environmental and/or weather measurements and/or information, which may be transferred to a memory utilizing a processor. In embodiments, a weather variable PCB 810 processes and conditions information and measurements from the one or more sensors. In embodiments, a weather variable PCB 810 communicates received environmental and/or weather sensor measurements (e.g., in some cases conditioned and processed and in some cases raw data) to a shading object movement control PCB 895, which analyzes the received environmental and/or weather information and measurements.
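By way of a non-limiting illustration only, the following Python sketch shows weather sensor readings being gathered into a record on a weather variable PCB and handed to movement control logic for analysis; the field names, placeholder readings and wind limit are assumed example values.

```python
# Minimal sketch of the data flow of paragraph [00103]: sensor readings packaged
# on the weather variable PCB and analyzed by movement control logic. Values and
# names below are assumptions, not part of this disclosure.

from dataclasses import dataclass

@dataclass
class WeatherReport:
    temperature_c: float
    humidity_pct: float
    wind_speed_mps: float
    uv_index: float

def collect_weather_report() -> WeatherReport:
    # Placeholder values standing in for the air quality / UV / barometer /
    # temperature / humidity / wind sensors attached to the weather PCB.
    return WeatherReport(temperature_c=31.0, humidity_pct=40.0,
                         wind_speed_mps=4.2, uv_index=7.5)

def movement_control_analyze(report: WeatherReport, wind_limit_mps: float = 9.0) -> str:
    """Movement control decision based on forwarded weather measurements."""
    if report.wind_speed_mps >= wind_limit_mps:
        return "retract arms and move to rest position"
    return "no movement required"

print(movement_control_analyze(collect_weather_report()))
```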

[00104] In embodiments, a center support assembly 107 may comprise an umbrella movement control PCB 895, as well as an integrated computing device PCB 860. In embodiments, a movement control PCB 895 may also be located in a stem assembly 106 and/or a base assembly 105. In embodiments, other terms may be utilized in place of circuit board, such as printed circuit board, a flexible circuit board, and/or an integrated circuit. In embodiments, an umbrella movement control PCB 895 may consume a low amount of power and may be referred to as a low-power PCB. In embodiments, this may prove to be a benefit as compared to prior-art umbrellas which utilized a large amount of power and thus needed to have power from a power source and could not be powered by an array of solar cells providing power to a solar power charger 830. In embodiments, a solar array may provide enough power to power components on an umbrella movement control PCB 895. In this case, for example, components and associated activities controlled by an umbrella movement control PCB 895 may not consume large amounts of power because these activities do not require continuous operation and may only receive information or measurements on a periodic basis. As an example, an intelligent shading object 800 may not be rotating and/or tilting frequently. Thus, in embodiments, sensors providing these
measurements (e.g., a tilt sensor or sunlight sensor), and a movement control PCB communicating these measurements, may not need to be in an active state at all times, which results in significant power usage savings for a shading object and/or controller. In embodiments, a motion control PCB 895 may comprise a processor 896, a non-volatile memory 897, a volatile memory 898, and many other components described above and below. In embodiments, for example, computer-readable instructions may be fetched from a non-volatile memory 897, loaded into a volatile memory 898, and executed by a processor 896 to perform actions assigned to, controlled, and/or commanded by a motion control PCB 895. In embodiments, non-volatile memory may be flash memory, ASIC, ROMs, PROMs, EEPROMs, solid state memory, CD, DVD, persistent optical storage or magnetic storage media.

[00105] In embodiments, as a further example, shading object motors, e.g., a first motor (azimuth movement motor), a second motor (elevation movement motor), and/or a third motor (articulation or actuator movement motor) may not be utilized frequently, so there does not need to be a large amount of power utilized by these motors within a shading object. In embodiments, when motors and/or motor assemblies are operating, the motors may require 2 to 3 amps. If the system is idle and, for example, the shading computer is not operating, an intelligent shading object may only require 180 milliamps. If an audio system is operating, e.g., music is playing and the amplifier and speakers are being utilized, only 400 to 500 milliamps may be required. In addition, motor controllers may not be utilized frequently since the motor controllers may not be driving and/or sending commands, instructions, and/or signals to motors frequently. Thus, a low-power movement control PCB 895 may provide a shading object owner with power usage savings and efficiency.
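By way of a non-limiting illustration only, the following Python sketch turns the current-draw figures above (2 to 3 amps while motors run, roughly 180 milliamps idle, 400 to 500 milliamps with audio) into a rough daily energy budget; the supply voltage, duty assumptions and solar figures are assumed example values.

```python
# Worked arithmetic based on the current-draw figures in paragraph [00105].
# The supply voltage, daily duty assumptions and solar sizing are assumptions.

SUPPLY_VOLTS = 12.0          # assumed system rail
IDLE_MA = 180
AUDIO_MA = 450               # midpoint of the 400-500 mA figure
MOTOR_A = 2.5                # midpoint of the 2-3 A figure

# Assumed daily usage: motors run 5 minutes, audio 3 hours, idle the rest.
motor_hours = 5 / 60
audio_hours = 3
idle_hours = 24 - motor_hours - audio_hours

daily_wh = SUPPLY_VOLTS * (MOTOR_A * motor_hours
                           + (AUDIO_MA / 1000) * audio_hours
                           + (IDLE_MA / 1000) * idle_hours)
print(f"Estimated daily consumption: {daily_wh:.1f} Wh")
# A modest solar array (e.g., 50 W for 5 sun-hours = 250 Wh/day, assumed)
# comfortably covers this budget, which is the point of the low-power PCB.
```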

[00106] In embodiments, readings and/or measurements from sensors may cause a movement control PCB 895 to transmit commands, instructions, and/or signals to either a first motor control PCB 880 (azimuth movement), a second motor control PCB 885 (elevation movement), and/or a third motor control PCB 890 (actuation movement), in order to cause specific movements of different assemblies of a shading object or umbrella. For example, in embodiments, a GPS transceiver 806 may receive GPS signals and provide GPS measurements (e.g., values representative of a longitude, latitude, and/or an altitude reading) to a movement control PCB 895. In embodiments, a movement control PCB 895 may analyze the GPS measurements and determine that a shading object should be moved to a specific elevation. In other words, in embodiments, a movement control PCB 895 may utilize GPS-generated measurements to direct a second motor assembly to move to a proper elevation. In embodiments, GPS measurements (coordinates and time) identify a proper elevation of the sun based on a geographic location. In embodiments, after a center support assembly 107 is moved to a position identified by GPS measurements, arm support assemblies 108 may be extended and the arms and/or blades 109 may be fully deployed. In embodiments, a movement control PCB 895 may communicate commands, instructions, and/or signals to a second motor control PCB 885 to cause an upper assembly 112 of a center support assembly 107 to rotate or move approximately 45 degrees in a downward direction with respect to a lower assembly 113 of the center support assembly. In
embodiments, a movement control PCB 895 may communicate commands, instructions, and/or signals to a third motor control PCB to fully extend arm support assemblies 108 (e.g. articulating blades/assemblies) and also arms/blades 109.
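By way of a non-limiting illustration only, the following Python sketch estimates the sun's elevation from a GPS-derived latitude, longitude and time using the standard declination/hour-angle approximation and maps it to a tilt command for the second (elevation) motor; the mapping to a motor angle and the example coordinates are assumptions, not the disclosed method.

```python
# Illustrative sketch of GPS-based sun-elevation estimation (paragraph [00106]).
# The solar-position formula is the common declination/hour-angle approximation.

import math
from datetime import datetime, timezone

def solar_elevation_deg(lat_deg: float, lon_deg: float, when_utc: datetime) -> float:
    day_of_year = when_utc.timetuple().tm_yday
    # Approximate solar declination (degrees).
    decl = 23.45 * math.sin(math.radians(360.0 / 365.0 * (284 + day_of_year)))
    # Approximate solar time from UTC and longitude (ignores equation of time).
    solar_hours = when_utc.hour + when_utc.minute / 60.0 + lon_deg / 15.0
    hour_angle = 15.0 * (solar_hours - 12.0)
    lat, d, h = map(math.radians, (lat_deg, decl, hour_angle))
    return math.degrees(math.asin(math.sin(lat) * math.sin(d)
                                  + math.cos(lat) * math.cos(d) * math.cos(h)))

def elevation_motor_command(sun_elevation_deg: float) -> float:
    """Tilt of the upper assembly from vertical (0-90 deg); an assumed mapping."""
    return max(0.0, min(90.0, 90.0 - sun_elevation_deg))

now = datetime(2017, 5, 8, 20, 0, tzinfo=timezone.utc)   # arbitrary example time
elev = solar_elevation_deg(34.05, -118.24, now)          # example GPS fix
print(f"sun elevation ~{elev:.0f} deg -> tilt upper assembly ~{elevation_motor_command(elev):.0f} deg")
```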

[00107] In embodiments, a digital compass 807 may generate a heading and/or orientation measurement and a telemetry PCB 805 may communicate a heading and/or orientation measurement to a movement control PCB 895. In embodiments, a movement control PCB 895 may analyze a heading measurement and generate and/or communicate commands, instructions, and/or signals to a first control PCB 880 to rotate a stem assembly 106 and a center support assembly 107 (e.g. , an umbrella support assembly) to face or move the shading object towards a light source (e.g. , a sun). In embodiments, digital compass measurements may be utilized as directional input for an azimuth (or first motor). In embodiments, a movement control PCB 895 may calculate counts and/or limits for motors to properly orient an intelligent shading object based on GPS measurements and/or digital compass measurements. Continuing with this embodiment, a movement control PCB 895 may generate and/or communicate commands, instructions, and/or signals to a third motor controller PCB 890 to cause arm support assemblies 108 to be extended or deployed along with arms/blades 109.
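By way of a non-limiting illustration only, the following Python sketch computes the signed heading error between a digital compass reading and a desired sun azimuth and converts it to motor counts for the first (azimuth) motor; the counts-per-degree scaling and example headings are assumed values.

```python
# Minimal sketch of the compass-driven azimuth correction of paragraph [00107].
# The scaling and headings below are assumptions.

COUNTS_PER_DEGREE = 10   # assumed first-motor/encoder scaling

def heading_error_deg(current_heading: float, target_azimuth: float) -> float:
    """Shortest signed rotation (degrees) from the current heading to the target."""
    return (target_azimuth - current_heading + 180.0) % 360.0 - 180.0

def azimuth_motor_counts(current_heading: float, target_azimuth: float) -> int:
    return round(heading_error_deg(current_heading, target_azimuth) * COUNTS_PER_DEGREE)

# Example: compass reports 350 deg, sun azimuth (from GPS/time) is 20 deg.
print(azimuth_motor_counts(350.0, 20.0))   # -> 300 counts (rotate +30 deg)
```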

[00108] In embodiments, a wind speed sensor 817 may generate measurements and a weather variable PCB 810 may communicate measurements to a shading object movement control PCB 895. In embodiments, a movement control PCB 895 may analyze and/or compare communicated measurements to a threshold in order to determine if unsafe conditions are present. In embodiments, for example, if a wind speed threshold is reached or exceeded, identifying an unsafe condition, a movement control PCB 895 may communicate commands, instructions, and/or signals to move shading object assemblies to a rest position. Continuing with this illustrative example, a movement control PCB 895 may communicate commands or instructions or signals to a second motor control PCB 885 to cause an upper assembly 112 to move to an original position (e.g., an at-rest position), which may be where an upper assembly 112 is a vertical extension of a lower assembly 113. In embodiments, a movement control PCB 895 may communicate instructions, commands and/or signals to a third motor control PCB 890 to move arm support assemblies 108 back into an upper assembly and/or retract arm support
assemblies 108 into channels of an upper assembly 1 12. In embodiments, a movement control PCB 895 may communicate commands, instructions and/or signals to a sound reproduction system 875 and/or a display device to warn a user of unsafe wind conditions.
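By way of a non-limiting illustration only, the following Python sketch shows a wind-speed reading being compared against a threshold and, on an unsafe reading, a list of retract/rest commands and a user warning being issued; the threshold and the command strings are assumed example values.

```python
# Illustrative sketch of the unsafe-wind sequence of paragraph [00108].
# The threshold and the command text are assumptions.

WIND_LIMIT_MPS = 10.0   # assumed unsafe-wind threshold

def wind_safety_actions(wind_speed_mps: float) -> list[str]:
    if wind_speed_mps < WIND_LIMIT_MPS:
        return []
    return [
        "second motor: return upper assembly to vertical rest position",
        "third motor: retract arm support assemblies into channels",
        "audio system: announce unsafe wind warning",
    ]

for reading in (4.0, 12.5):
    actions = wind_safety_actions(reading)
    print(f"{reading} m/s ->", actions or "no action")
```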

[00109] In embodiments, a first motor control PCB 880, a second motor control PCB 885, a third motor control PCB 890 and a movement control PCB 895 may be connected to each other via wires and/or traces, and instructions, commands and/or signals may be communicated via wires and/or traces. In embodiments, the motor control PCBs 880, 885 and 890 may communicate with a movement control PCB 895 via a personal area network communications protocol, e.g., Bluetooth. In embodiments, a weather variable PCB 810 and/or a telemetry PCB 805 may communicate with a movement control PCB 895 via wires, traces, integrated circuits, and/or interfaces and communicate instructions, commands or signals. In embodiments, a weather variable PCB 810 and a telemetry PCB 805 may communicate with a movement control PCB 895 via personal area network protocols (utilizing a PAN transceiver, e.g., a Bluetooth transceiver). In
embodiments, motor control PCBs 880, 885 and 890 may communicate directly (either via wires or a wireless communication protocol) with a weather variable PCB 810 and/or a telemetry PCB 805 without utilizing a computing device 860 and/or a movement control PCB 895.

[00110] In embodiments, as described above, a shading object, intelligent umbrella and/or shading charging system may comprise a computing device PCB, which may comprise a shading object computing device 860. In embodiments, a shading object may comprise a shading object computing device 860 which is not installed and/or mounted on a computing device PCB. In embodiments, a shading object computing device 860 and/or a computing device PCB may consume a larger amount of power (with respect to a movement control PCB 895) because the activities it is responsible for executing are performed more frequently and/or with a higher data throughput. In embodiments, an integrated computing device 860 may be responsible for camera control, video and/or image processing, and external Wi-Fi communication, e.g., operating as a hot spot, as well as running various software applications associated with the intelligent shading object and/or umbrella. The computing device 860, because it operates and is responsible for more data-intensive features and/or functions, may require more processing power due to extended operation and continuous data throughput. In embodiments, a computing device may be integrated into a center support assembly 107. In embodiments, a computing device may be integrated into a base assembly and/or a stem assembly.

[00111] FIGURE 9 illustrates a block diagram of a movement control PCB according to embodiments. Returning to the discussion of a movement control PCB, in embodiments, a movement control PCB 895 may comprise a
processor/controller 905, a proximity sensor 910, a motion sensor 915, a tilt sensor 920, a personal area network transceiver 930, an audio receiver 935 (optional), one or more speakers 940, and/or a memory 950 having umbrella or shading object control software (e.g., executable instructions stored in a non-volatile memory 951 and executable by a processor 905). In embodiments, an umbrella movement control PCB 895 may comprise a USB transceiver 960. In embodiments, an umbrella movement control PCB 895 may comprise a sensor interface subsystem 955 for communicating sensor measurements to an umbrella movement control PCB 895 and communicating commands and/or signals to and from external sensors. In embodiments, a sensor interface subsystem 955 may also be located on a telemetry PCB 805, a weather variable PCB 810, and/or first, second, or third motor control PCBs 880, 885, and 890. For example, in embodiments, an intelligent shading object and/or umbrella may also include a signal conditioning subsystem, which may also be referred to as a sensor interface system, and the terms may be utilized interchangeably throughout the specification. In embodiments, an intelligent shading object and/or umbrella (and the signal conditioning subsystem) may further comprise one or more reference signal modules, one or more signal conditioning modules, and one or more analog-to-digital converters. In an embodiment, one or more sensors (e.g., air quality sensor 811, UV radiation sensor 812, wind speed sensor 817, motion sensor 845, tilt sensor 855) may receive communicated analog signals and may transmit analog signals to signal conditioning modules 955. In embodiments, a signal conditioning module 955 may process and/or condition communicated analog sensor signals. Although signals are described as being analog, the description herein equally applies to digital signals. In embodiments, one or more signal conditioning modules may communicate and/or transfer processed and/or conditioned signals to one or more A-to-D converters. In embodiments, one or more signal reference modules may be a non-volatile memory, or other storage device, that stores and/or retrieves signal values to which the communicated signal values may be compared in order to determine if threshold conditions may be met. In embodiments, a comparison of communicated signal values to reference signal values may allow the shading object signal conditioning system to understand if normal conditions are being experienced by an intelligent shading object and/or umbrella, or if an intelligent shading object subsystem may be experiencing abnormal conditions (e.g., high humidity, high movement, high wind, and/or bad air quality).

[00112] In embodiments, an umbrella movement control PCB 895 may comprise a proximity sensor 840. In embodiments, a proximity sensor 840 may be able to detect a presence of nearby objects (e.g., people or other physical objects) without any physical contact between a sensor and an object. In embodiments, a proximity sensor 840 may be located on and/or mounted on a movement control PCB 895. In embodiments, a proximity sensor 840 may be located on and/or mounted on other printed circuit boards or may be a standalone component in a shading object system. In embodiments, a proximity sensor 840 may be located within a center support assembly 107. In embodiments, a proximity sensor 840 may generate measurements and/or signals, which may be communicated to a

processor/controller 905 in a movement control PCB 895. In embodiments, an umbrella movement control board 895 may store communicated measurements and/or signals in a memory 950, which also has instructions stored thereon. In embodiments, proximity sensor software instructions, which are fetched from memory 950 and executed by a processor 905, may perform and/or execute a proximity process or method. In embodiments, for example, a proximity process may comprise receiving

measurements and/or signals from a proximity sensor 840 indicating an object and/or person may be located in an area where a shading object is deployed, going to be deployed and/or extended, and/or towards where a component of a shading object may be moving. For example, if an individual is located in an area where arm support assemblies may be deployed and/or extended, a proximity sensor 840 may transmit a signal or measurement indicating an object may be an obstruction to, for example, a movement control PCB 895. In embodiments, a

processor/controller 905 in a movement control PCB may receive and/or analyze a proximity measurement and determine an object may be an obstacle. In

embodiments, a proximity signal and/or command may also identify a location of an object (e.g. , obstacle) in relation to a proximity sensor 840 and/or some reference location. In embodiments, a processor of a movement control PCB may generate and/or communicate a driving signal, command, and/or instruction that instructs a shading object not to deploy and/or open. In embodiments, for example, a processor/controller 905 in a movement control PCB 895 may communicate a signal and/or commands to a third motor controller to cause the third motor to stop moving the arm support assembly 108 due to an obstacle detection. In

embodiments, for example, a movement control PCB 895 may communicate a signal and/or commands to a second motor controller and a second motor (articulating and/or elevation motor) to cause a second motor to stop moving a gearbox assembly and/or actuator and prevent an upper assembly 112 of a center support assembly from moving into an area where an obstacle is detected. In

embodiments, this may also work in the opposite direction, where if a proximity sensor 840 does not determine that an object is within a shading object area, then a proximity sensor signal may not be communicated to the processor/controller 905 in a movement control PCB 895. [00113] In embodiments, an umbrella movement control PCB 895 may comprise a motion sensor 845. In embodiments, a motion sensor 845 may generate a signal and/or measurement indicating that an individual, a living organism, or an object is within an area covered by a motion sensor 845. For example, a motion sensor 845 may generate a signal if an individual and/or object is approaching a shading object and/or umbrella, is within 5 or 10 feet of an umbrella, or is moving within a shading area. In embodiments, a motion sensor 845 may be located on and/or mounted on a movement control PCB 895. In embodiments, a motion sensor 845 may be located on and/or mounted on other printed circuit boards or may be a standalone component in a shading object system. In embodiments, a motion sensor 845 may be located within a center support assembly 107. In embodiments, a motion sensor 845 may generate measurements and/or signals, which may be communicated to a processor/controller 905 in a movement control PCB 895. In embodiments, an umbrella movement control board 895 may store communicated measurements and/or signals in a memory 950. In embodiments, motion sensor software instructions may be fetched from memory 950 and executed by a processor 905, and may cause a processor 905 to perform and/or execute a motion detection process or method. In embodiments, for example, a motion detection process may comprise receiving measurements and/or signals from a motion sensor 845 indicating an object and/or person may be moving in an area where a shading object and/or umbrella is deployed, near where a shading object is located, and/or where a component of a shading object may be moving. In embodiments, if an individual's or object's movement is detected by a motion sensor, a controller/processor 905 may generate a signal instructing or commanding certain shading object components to be activated, deployed, and/or retracted. For example, if an individual's movement is detected during a night or darkness period, a processor may generate signals, instructions, or commands to shading object components in reaction to the movement. In reaction, commands, instructions, and/or signals may be transmitted to a lighting system or assembly 870 to turn on lights of a shading object lighting system; commands may be transmitted to an audio system 875 to activate and/or turn on an audio receiver and/or audio system and transmit a warning that an individual is near a shading object; and/or commands may be transmitted to a third motor controller 890 to cause a motor to open one or more of the arm support assemblies 109 and open a shading object to startle an intruder. Further, in embodiments, one or more commands may be communicated to one or more cameras 857 to activate one or more cameras to capture images of an area around a shading object. In embodiments, if a motion sensor 845 detects movement away from a shading object, a motion sensor 845 may communicate commands, signals, and/or instructions to a controller/processor 905 in a movement control PCB 895, which in turn may turn off components and/or retract assemblies of a shading object.
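
By way of a non-limiting illustration only, the following sketch, expressed in Python, shows one possible way such a motion-reaction process might be organized. The component interfaces (lights, audio, cameras, motor_controller) and the darkness window are hypothetical placeholders rather than elements of the embodiments described above.

    # Illustrative sketch only; the component objects and the darkness window
    # are hypothetical placeholders, not actual interfaces of the embodiments.
    import datetime

    def handle_motion_event(motion_detected, approaching, lights, audio, cameras, motor_controller):
        """React to a motion-sensor signal roughly as described above."""
        now = datetime.datetime.now()
        is_dark = now.hour >= 20 or now.hour < 6      # hypothetical darkness period
        if motion_detected and approaching:
            if is_dark:
                lights.turn_on()                      # lighting system or assembly 870
            audio.play_warning("individual detected near shading object")   # audio system 875
            cameras.start_capture()                   # one or more cameras 857
            motor_controller.open_arm_supports()      # third motor controller 890
        elif motion_detected and not approaching:
            # movement away from the shading object: turn off components and retract
            lights.turn_off()
            cameras.stop_capture()
            motor_controller.retract_arm_supports()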

[00114] In embodiments, an umbrella movement control PCB 895 may comprise one or more tilt sensors 855. In embodiments, a tilt sensor 855 can measure a tilting in one or more axes of a reference plane. In embodiments, for example, a tilt sensor 855 may comprise an accelerometer to measure tilt angle with reference to an earth's ground plane. In embodiments, a tilt sensor 855 may be placed on a center support assembly 107 of a shading object (either an upper assembly 112 and/or a lower assembly 113), may be placed on a shading fabric 715, or may be placed on arms/blades 109. In embodiments, a tilt sensor 855 may measure an angle of incidence from a reference axis. In embodiments, a reference axis may be an axis of a shading object at rest or a deployment angle (e.g., if a shading object is deployed at a 45 degree angle, and thus the shading cover/object is deployed at approximately 45 degrees also, a tilt sensor 855 may be configured to determine if a tilt sensor 855 moves past a specific angle). In embodiments, a tilt sensor 855 may generate measurements indicating an angle with reference to a ground plane. In embodiments, a tilt sensor 855 may communicate measurements to a processor/controller 905 in a movement control PCB 895. In embodiments, an umbrella movement control board 895 may store communicated measurements and/or signals in a memory 950. In embodiments, tilt sensor software instructions may be fetched from memory 950 and executed by a processor 905, and may cause a processor 905 to perform and/or execute a tilt detection process or method. In embodiments, for example, a tilt detection process may comprise receiving tilt sensor measurements and comparing the communicated tilt sensor measurements to reference measurements. If received tilt sensor measurements are greater than a reference measurement threshold, then a controller/processor 905 may generate a signal, command or instruction and communicate the signal, command or instruction to an assembly and/or component to adjust an orientation and/or deployment of a shading object. For example, if a tilt sensor 855 indicates that a shading fabric is deployed at a 50 degree angle from a reference axis, but a maximum shading fabric deployment is 45 degrees from a reference axis, then a controller/processor 905 may generate an adjustment signal, command and/or instruction to cause an elevation and/or second motor or an actuator and/or third motor to move a shading object to a correct position.
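
As a non-limiting illustration, the sketch below, in Python, shows one way the threshold comparison in such a tilt detection process might be expressed. The 45-degree limit and the elevation-motor interface are hypothetical examples, not values or interfaces taken from the embodiments above.

    # Illustrative sketch only; the limit and motor interface are hypothetical.
    MAX_DEPLOYMENT_ANGLE_DEG = 45.0

    def check_tilt(tilt_angle_deg, elevation_motor):
        """Compare a tilt-sensor reading to a reference threshold and correct if exceeded."""
        if tilt_angle_deg > MAX_DEPLOYMENT_ANGLE_DEG:
            correction = tilt_angle_deg - MAX_DEPLOYMENT_ANGLE_DEG
            # command the elevation (second) motor back toward the permitted angle
            elevation_motor.adjust_elevation(-correction)
            return False    # out of range; adjustment issued
        return True         # within the permitted deployment range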

[00115] In embodiments, an umbrella movement control PCB 895 may comprise an audio / video transceiver 865, a stereo amplifier 875, and/or one or more sound reproducers (e.g., speakers) 875. In embodiments, an audio / video transceiver 865 may be a Bluetooth Audio MP3 transceiver. In embodiments, an audio / video transceiver 865 may receive wirelessly transmitted audio and/or video signals from an audio source such as a portable electronic device, a cellular phone, an iPod, an audio player, and/or a personal computer. In embodiments, a wireless

transmission protocol may be a Bluetooth protocol for transmitting audio/video, although other protocols may be utilized. In embodiments, a portable electronic device may establish a communications channel by pairing with an audio / video transceiver, e.g., utilizing a Bluetooth protocol, in a shading object. In

embodiments, an audio/video transceiver 865 may be located in a central support assembly 107. In embodiments, an audio/video transceiver 865 may be placed on or mounted on a motion control PCB 895, although it may be placed anywhere within a shading object and/or as a separate unit. In embodiments, a shading object computing device 860 may comprise an audio/video transceiver 865. In embodiments, an audio/video transceiver 865 may be located on a shading object computing device 860 because this activity and/or feature requires more energy and/or use than other components located on a movement control PCB. In embodiments, where users may be streaming audio/video for a long period of time, locating an audio/video transceiver on a computing device 860 may be more efficient. In embodiments, an audio / video transceiver 865 may communicate a received audio signal to one or more speakers 875 for reproduction of sound. In embodiments, an audio / video transceiver 865 may communicate a received audio signal to a stereo amplifier 875 and an amplified audio signal may be transmitted to one or more speakers 875 for reproduction of sound. [00116] FIGURE 11 illustrates a lighting subsystem according to embodiments. In embodiments, a shading object may comprise a lighting subsystem 870.

FIGURE 11A illustrates a lighting subsystem according to embodiments. In embodiments, a shading object may comprise a lighting subsystem 870. In embodiments, a shading object central support assembly 107 comprises a lighting subsystem 870 installed therein and/or thereon. In embodiments, a solar panel charging system 830 may transfer and/or provide power (e.g., voltage and/or current) to a lighting subsystem 870. In embodiments, a rechargeable battery 820 or 1135 may provide power (e.g., voltage and/or current) to a lighting subsystem 870. In embodiments, a plurality of LED lights 1110 may be integrated into one or more of the shading arm support assemblies 108. In embodiments, one or more LED lights 1110 may be activated and/or turned on via a manual switch. In embodiments, a shading object motion control PCB 895 may communicate commands, instructions, and/or signals to a lighting subsystem 870 based on different environmental conditions and/or situations (e.g., it is a certain time of a day, there is no light due to cloud cover and/or darkness, or a shading object detects movement in an area). In embodiments, LED lights 1110 integrated into or attached to shading object arm support assemblies 108 may direct light in a downward manner directly towards or at a certain angle to a ground surface. In embodiments, a lighting subsystem 870 (e.g., one or more LED lights) may be integrated into or installed within a shading object central support assembly 107. In embodiments, a lighting system may include a lighting ballast (or lighting driver) 1115 and one or more LED lights 1110. In embodiments, when LED lights 1110 are turned on (and/or activated), one or more LED lights 1110 may direct light beams outward (e.g., in a horizontal direction) from a central support assembly 107. In embodiments, one or more LED lights may be directed at a 90 degree angle from a center support assembly 107 vertical axis. In embodiments, LED lights may be installed in a swiveling assembly 1120 and LED lights 1110 may transmit light (or light beams) at an angle of 20 to 90 degrees from a shading object central support 107 vertical axis. In embodiments, a lighting subsystem 870 may comprise a processor or microcontroller 1125, a lighting driver 1115, a memory 1130, and/or one or more LED lights 1110. In embodiments, a processor or microcontroller 1125, a lighting driver 1115, and/or a memory 1130 may be installed on a lighting PCB. In embodiments, a shading object motion control PCB 895 may transmit commands, instructions and/or signals to a processor or microcontroller 1125 to activate or deactivate one or more LED lights 1110. In embodiments, a

microcontroller or processor 1125 may communicate pulses of voltage and/or current to a lighting driver 1115, which in turn communicates pulses of voltage and/or current to one or more LED lights 1110. In embodiments, communicating pulses of voltage and/or current to a lighting driver 1115 and/or one or more LED lights 1110 saves a shading object power because power is provided for only a fraction of the time. This method allows a shading object to conserve a large amount of energy when an object is used in a dark environment (e.g., night time). Between 20 and 70 percent of power may be saved utilizing this method. In embodiments, one or more LED lights 1110 may be mounted on or installed within a center support assembly 107. In embodiments, one or more LED lights 1110 may be directed to shine light in an upward direction (e.g., a more vertical direction) towards arms/blades 109 and/or a shading fabric of a shading object. In embodiments, a bottom surface of a shading fabric may reflect light beams from one or more LED lights 1110 back to a surrounding area of a shading object (e.g., a shading area). In an embodiment, a shading fabric may have a reflective bottom surface to assist in reflecting light from the LED lights 1110 back to the shading area. In

embodiments, a shading object central support assembly 107 may also comprise a cooler assembly (not shown). In embodiments, a cooler assembly may be attached to a central support assembly 107. In embodiments, a can and/or a bottle may be placed in a cooler assembly and a cooler may maintain a temperature of a beverage in a can and/or bottle (e.g. , either a hot temperature or a cold

temperature). In embodiments, a cooler assembly may use magnetic principles to cool and/or heat a beverage. In embodiments, an AC adapter 835 or a DC-to-DC converter 1295 may be coupled to a cooler assembly and may provide power to a cooler assembly to cause the cooler assembly to cool and/or heat a beverage in a can and/or bottle. In embodiments, a rechargeable battery 820 may transfer and/or provide power (e.g., voltage and/or current) to a cooler assembly. [00117] Figure 11 also illustrates one or more first lighting assemblies 199 and/or a second lighting subsystem 198 according to embodiments. In embodiments, one or more first lighting assemblies 199 and/or a second lighting subsystem 198 may comprise a processor or microcontroller 1125, a lighting driver 1115, a memory 1130, and/or one or more LED lights 1110 (or other lighting elements).

[00118] FIGURE 11B illustrates a wireless charging assembly according to embodiments. Figure 11B also illustrates one or more first lighting assemblies 199 and/or a second lighting subsystem 198 according to embodiments. In

embodiments, one or more first lighting assemblies 199 and/or a second lighting subsystem 198 may comprise a processor or microcontroller 1125, a lighting driver 1115, a memory 1130, and/or one or more LED lights 1110 (or other lighting elements). In embodiments, a processor or microcontroller 1125, a lighting driver 1115, and/or a memory 1130 may be installed on a lighting PCB or other similar structure.

[00119] In embodiments, a motion controller, processor, and/or microcontroller 895 may communicate and/or transmit commands, instructions and/or signals to a processor or microcontroller 1125 to activate or deactivate one or more LED lights 1110 (or other lighting elements). In embodiments, a microcontroller or processor 1125 may communicate and/or transmit pulses of voltage and/or current to a lighting driver 1115, which in turn communicates pulses of voltage and/or current to one or more LED lights 1110.

[00120] In embodiments, communicating pulses of voltage and/or current to a lighting driver 1115 and/or one or more LED lights 1110 saves a shading object, intelligent umbrella, or intelligent shading charging system power because power is provided for only a fraction of the time. This method allows a shading object, intelligent umbrella, or intelligent shading charging system to conserve a large amount of energy when an object is used in a dark environment (e.g., night time). In embodiments, between 10 and 80 percent of power may be saved utilizing this method.
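
By way of a non-limiting illustration, the short Python sketch below shows how the average power of a pulsed LED driver relates to its duty cycle; the 10 W figure and 40 percent duty cycle are hypothetical example values chosen to fall within the 10 to 80 percent savings range noted above.

    # Illustrative sketch only; wattage and duty cycle are hypothetical examples.
    def pulsed_led_power(full_power_watts, duty_cycle):
        """Average power drawn when an LED driver is pulsed rather than driven continuously."""
        return full_power_watts * duty_cycle

    continuous = pulsed_led_power(10.0, 1.0)   # 10 W LED string driven continuously
    pulsed = pulsed_led_power(10.0, 0.4)       # same string pulsed at a 40 percent duty cycle
    savings_pct = 100.0 * (continuous - pulsed) / continuous
    print(f"approximate power savings: {savings_pct:.0f} percent")   # about 60 percent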

[00121] In embodiments, a solar panel charging system 830 and/or solar panel arrays 196 may transfer and/or provide power (e.g. , voltage and/or current) to one or more first lighting assemblies 199 and/or a second lighting subsystem 198. In embodiments, a rechargeable battery in a battery charging assembly 184 may provide power (e.g. , voltage and/or current) to one or more first lighting assemblies 199 and/or a second lighting subsystem 198. In embodiments, one or more first lighting assemblies 199 and/or a second lighting subsystem 198 may be activated and/or turned on via a manual switch.

[00122] In embodiments, a controller, processor, and/or microcontroller may communicate commands, instructions, and/or signals to one or more first lighting assemblies 199 and/or a second lighting subsystem 198 based on different environmental conditions and/or situations (e.g. , it is a certain time of a day, there is no light due to cloud cover and/or darkness, or movement is detected in an area surrounding an intelligent shading charging system).
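
As a non-limiting illustration, the Python sketch below shows one way the environmental conditions mentioned above (time of day, darkness, detected movement) might be combined into a single lighting decision. The lux threshold, hour boundaries, and lighting interface are hypothetical placeholders.

    # Illustrative sketch only; thresholds and interfaces are hypothetical.
    def lighting_should_activate(hour_of_day, ambient_lux, motion_detected):
        """Return True when the lighting assemblies should be turned on."""
        night_time = hour_of_day >= 19 or hour_of_day < 6
        too_dark = ambient_lux < 50          # e.g., heavy cloud cover
        return motion_detected or night_time or too_dark

    if lighting_should_activate(hour_of_day=21, ambient_lux=5, motion_detected=False):
        print("activate first lighting assemblies 199 and/or second lighting subsystem 198")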

[00123] In embodiments, an umbrella movement control PCB 895 may comprise a USB transceiver 877. In embodiments, a USB transceiver 877 may be a chip that implements hardware portions of a USB protocol for a device, e.g. , a shading object. In embodiments, a USB transceiver may not be utilized because USB ports and/or functionality may be built into and/or integrated into a

microcontroller/processor 905 of a movement control PCB 895. Continuing with this illustrative embodiment, a tight coupling between firmware in a processor / controller 905 and a USB peripheral allows for higher speed, easier interactions, and thus no need for an external bus. In embodiments, a USB transceiver 877 may comprise an integrated USB port 878. In embodiments, a USB transceiver 877 may be coupled and/or connected to a USB port 878 via wires, traces, and/or other conductive paths. In embodiments, a USB port 878 may be utilized as a power source charger (e.g., a Universal Serial Bus (USB) charger) for portable electronic devices. In embodiments, a power source charger (e.g., USB port 878) may be connected to and thus receive power from a shading object power charging system. In

embodiments, a shading object charging system may receive power from a renewable power charger (e.g. , a solar power charger). In embodiments, a shelf and/or container may be attached to a shading object, so a portable electronic device may be placed on a shelf or within a container, while charging. In

embodiments, a USB port 878 may be located in a shelf and/or container and accessible to be utilized by a portable electronic device for charging. [00124] In embodiments, a shading object may comprise a shading fabric 715 and solar cells 110. In embodiments, one or more strips or arrays of solar cells 110 may be placed on a top surface of a shading fabric 715. In embodiments, one or more strips of solar cells may be integrated into (or woven into or be a part of) a shading fabric 715. In embodiments, solar cells 110 may be comprised of a flexible material. In embodiments, a shading fabric 715 may be coupled, connected and/or attached to a frame or support assembly (e.g., arm support assemblies 108) and one or more strips of solar cells 110 may be placed on and/or attached to a top surface of an arm support assembly 108. In embodiments, arm support assemblies 108 may comprise a recess or a channel where solar cells 110 may be placed and/or inserted. In embodiments, a shading fabric 715 may have one or more strips of solar cells 110 integrated therein or woven thereinto. In embodiments, one or more strips of solar cells 110 may form a pattern on a top surface of a shading object. Continuing with this illustrative embodiment, by forming solar cells in specific patterns, one or more strips of solar cells 110 may capture sunlight and/or other ultraviolet light from a number of directions and, thus, at a number of intensities.

[00125] FIGURE 12 illustrates a power subsystem according to embodiments. In embodiments, a shading object may comprise a power tracking solar charger 830. In embodiments, a center support assembly 107 of a shading object may comprise and/or house a power tracking solar charger 830. Continuing with this illustrative embodiment, a power tracking solar charger 830 may be located in and/or on an upper assembly 112 of a center support assembly 107, or alternatively in or on a bottom assembly 114 of a center support assembly 107. In embodiments, a power tracking solar charger 830 may be connected to one or more solar cells 1210, a rechargeable battery 820, and/or an AC adapter 835 or 1220. In embodiments, a photovoltaic (PV) cell, or "solar cell," may be the smallest semiconductor element that converts sunlight into electricity. In embodiments, semiconductor silicon may be treated so that it generates a flow of electricity when light shines on it. In embodiments, a PV array or cells may be an interconnected system of PV cells that may function as a single electricity-producing unit. In embodiments, a PV array

1210 or 110 may comprise one or more of the strips of solar cells. In embodiments, a PV array 1210 or 110 may comprise one solar cell strip. In embodiments, one or more solar cells 1210 (e.g., a PV array 1210) may provide power directly to a power tracking solar charger 830 and/or a rechargeable battery 820. In embodiments, one or more solar cells 1210 (or solar arrays) may provide power to motor assemblies, components, printed circuit boards, and/or other assemblies 1297 in an intelligent shading object.

[00126] In embodiments, a power tracking solar charger 830 may be coupled and/or connected to a rechargeable battery 820. In embodiments, a power tracking solar charger 830 may be coupled and/or connected to an AC adapter 835 (or DC power adapter), which is coupled and/or connected to a power source. In embodiments, a charging assembly 830 may be coupled to one or more solar cells 1210 or solar arrays. In embodiments, a power tracking solar charger 830 may include a control panel 1275, a controller 1280, a non-volatile memory 1285 and a volatile memory 1290, the non-volatile memory 1285 comprising computer-readable and computer-executable instructions, which are fetched and loaded into volatile memory 1290 for execution by a controller or processor 1280 to perform a power monitoring, tracking and distribution process. In embodiments, a power monitoring, tracking and/or distribution process may monitor power levels and/or power conditions of different components of a shading object (e.g., a motion control PCB 895, arrays of solar cells 110 or 1210, and/or a rechargeable battery 820). In embodiments, a power tracking and monitoring process may communicate information regarding power levels and/or power conditions of a solar charger 830 (and other shading object components) to a control panel 1275 and/or to a portable electronic device to display to a user and/or owner.
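
As a non-limiting illustration, the Python sketch below shows one way such a power monitoring, tracking and distribution process might summarize power levels and select a supply. The voltage cutoff, wattage figures, and report layout are hypothetical; they are not taken from the embodiments above.

    # Illustrative sketch only; readings, cutoff, and report format are hypothetical.
    def monitor_power(battery_voltage, solar_output_watts, load_watts):
        """Summarize power levels and choose a power source for the current load."""
        status = {
            "battery_voltage": battery_voltage,
            "solar_output_watts": solar_output_watts,
            "load_watts": load_watts,
        }
        if solar_output_watts >= load_watts:
            status["source"] = "solar"
        elif battery_voltage > 11.5:                 # hypothetical low-battery cutoff
            status["source"] = "solar+battery"       # both supply power under high load (e.g., motors running)
        else:
            status["source"] = "solar"
            status["alert"] = "battery low"
        return status

    print(monitor_power(battery_voltage=12.4, solar_output_watts=18.0, load_watts=30.0))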

[00127] In embodiments, a power tracking solar charger 830 may transfer incoming power (e.g., voltage and/or current) generated by the solar cells to one or more converters (e.g., DC-to-DC converters) 1295. In embodiments, a

rechargeable battery 820 may provide power (e.g., voltage and/or current) to a DC-to-DC converter 1295. In embodiments, one or more DC-to-DC converters 1295 may transfer voltage and/or current to one or more PCBs, components, motor assemblies, and/or other assemblies of a shading object. In embodiments, a DC-to-DC converter 1295 may be utilized to provide lower operating voltages, e.g., 3.3 VDC or 5.0 VDC or other voltages, to components, boards and/or assemblies 1297 operating on a lower DC voltage. In embodiments, a rechargeable battery 820 may transfer incoming power (e.g., voltage and/or current) to one or more converters 1295, and a power charger 830 may monitor power distribution and power levels. In embodiments, a rechargeable battery 820 may provide power to shading object or umbrella motor assemblies, PCBs, components, and/or assemblies 1297. If high power requirements exist due to operating conditions (e.g., motors running), a rechargeable battery 820 and solar cells or solar cell arrays may both provide power to one or more PCBs, components, motor assemblies, and/or other assemblies of a shading object. [00128] In embodiments, a shading object may comprise a voice recognition engine 815. In embodiments, a shading object motion control PCB 895 may have a voice recognition engine 815 mounted and/or located thereon. In embodiments, an integrated computing device 860 may have a voice recognition engine mounted on and/or located thereon. In embodiments, a voice recognition engine 815 may be mounted separately from a motion control PCB 895 and/or a computing device 860. In embodiments, a voice recognition PCB may comprise a processor, non-volatile and/or volatile memory components, and a voice recognition engine. In embodiments, a voice recognition engine or module 815 allows users and/or individuals to speak to a shading object in order to communicate with the shading object and/or external devices (e.g., portable electronic devices and/or cameras) coupled to a shading object or umbrella. In addition, a voice recognition engine 815 may synthesize speech from received commands (e.g., text commands) and may talk to an individual. In embodiments, computer-readable and computer-executable instructions may be fetched from a non-volatile memory, stored in a volatile memory, and executed by a processor to recognize an individual's voice and/or implement a voice recognition process. In embodiments, a voice recognition engine 815 may recognize preprogrammed voice commands. In embodiments, a voice recognition engine 815 may allow training and/or incorporating of new commands. In embodiments, if commands are successfully translated and/or converted, a voice recognition engine 815 may communicate a confirmation audio signal to an audio amplifier 875 and/or one or more speakers 940. In

embodiments, commands may be open, close, deploy, retract, rotate, move, turn on lights, turn on music, activate solar power, etc. In embodiments, a voice recognition engine 815 may also have a speech synthesizer. In embodiments, a voice synthesizer in a voice recognition engine 815 may allow an intelligent shading object or umbrella to talk to individuals within an area surrounding and/or adjacent to a shading object or umbrella. In embodiments, a voice recognition engine 815 may generate voice prompts, responses or alerts and convert these prompts, responses, and/or alerts to audio signals. In embodiments, generated audio signals may be communicated to an audio amplifier and/or one or more speakers 940. [00129] In embodiments, a shading object central support assembly 107 may also comprise one or more microphones. In embodiments, one or more

microphones may also be attached to and/or integrated into a stem assembly 106, a base assembly 105, shading fabric 715, arms/blades 109, and/or arm support assemblies 108. In embodiments, for example, a voice recognition process and/or method may be initiated when a user in a physical vicinity of a shading object speaks. In embodiments, a microphone, located on or within the shading object, may capture a user's voice and generate an analog voice signal. In embodiments, an analog-to-digital converter (ADC) may convert an analog voice signal to a digital voice signal and may transfer and/or communicate the digital voice signal to a voice recognition engine 815. In embodiments, a voice recognition engine 815 may analyze the received digital voice signal, extract commands and/or information, and communicate the extracted commands and/or information to a motion control PCB 895 and a shading object computing device to cause actions requested verbally by a user or individual to be implemented and/or completed. In embodiments, commands, instructions, or signals generated by the voice recognition engine 815 may be

communicated to other PCBs, subsystems, assemblies and/or components of the shading object in order to comply with and/or react to voice instructions. For example, a voice recognition engine 815 may extract from a received voice signal, a command to obtain sensor measurements, (e.g. , sunlight intensity, ozone, and/or wind measurements or reading) from a sensor module 750. The voice recognition engine 815 may communicate the extracted command to a motion control PCB 895 (or alternatively a shading object computing device 860) to communicate with a weather variable PCB and/or a sensor module 750 (e.g. , through wireless transceivers, wires, and/or circuit traces). In embodiments, a movement control PCB 895 may communicate to a weather variables PCB to obtain sensor measurements from sensors in a sensor module 750 coupled to and/or connected to a weather variables PCB. In embodiments, sensors (e.g. , sensors 81 1 - 814, 816, 817) may obtain measurements and may communicate these measurements to a weather variables PCB 810, a shading object movement control PCB 895, and/or to a shading object computing device 860. In embodiments, obtained measurements may be stored (for later use and/or analyzation), may be

communicated via a sound system to a user, or may be displayed on a monitor, and/or utilized in a mobile software application.
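
By way of a non-limiting illustration, the Python sketch below shows one possible way recognized voice commands might be dispatched to other subsystems, as described above. The command set, method names, and rotation increment are hypothetical placeholders rather than the actual interfaces of the embodiments.

    # Illustrative sketch only; command names and interfaces are hypothetical.
    COMMANDS = {"open", "close", "deploy", "retract", "rotate", "lights on", "music on"}

    def dispatch_voice_command(text, motion_control, audio):
        """Route a recognized phrase to the movement control PCB or the audio system."""
        phrase = text.strip().lower()
        if phrase not in COMMANDS:
            audio.speak("command not recognized")        # synthesized response
            return False
        if phrase in {"open", "deploy"}:
            motion_control.deploy_arm_supports()
        elif phrase in {"close", "retract"}:
            motion_control.retract_arm_supports()
        elif phrase == "rotate":
            motion_control.rotate_azimuth(degrees=15)    # hypothetical increment
        elif phrase == "lights on":
            motion_control.activate_lighting()
        elif phrase == "music on":
            audio.start_playback()
        audio.speak("command confirmed")                 # confirmation audio signal
        return True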

[00130] In embodiments, a movement control PCB 895 or other PCB or circuit may comprise a personal area network transceiver 865. In embodiments, a PAN transceiver 865 may be located on a separate PCB or on other PCBs within a shading object. In embodiments, a PAN transceiver 865 located on a shading object movement control PCB 895 may be a master transceiver. In embodiments, which are illustrative, but not limiting, PAN transceivers, may be an INSTEON transceiver, an IrDA transceiver, a Wireless USB transceiver, a Bluetooth transceiver, a Z-Wave transceiver, a ZigBee transceiver, and/or a body area network transceiver. In embodiments, additional PCBs and/or components may also comprise PAN transceivers. In embodiments, a transceiver 865 on a movement control PCB 895 may communicate instructions, commands, and/or signals to one or more PAN transceivers located in other areas of the intelligent shading object (e.g. , PAN transceivers in a first PCB (e.g. , a PCB controlling azimuth movement), a computing device (e.g. , a Linux computer), a second PCB (e.g. , a PCB controlling elevation movement), a third PCB (e.g. , a PCB controlling extension or linear actuation movement), a telemetry PCB, and/or a weather variable PCB). By utilizing PAN transceivers and PAN communication protocols in an intelligent shading object, use of wires, flexible circuit boards, and/or other interfaces may be minimized and more physical space may be present in an intelligent shading object. This is a benefit in that a shading object may be able to house and/or incorporate more features and/or components. In addition, potential hindrances to movements of a shading object (e.g., rotation about a vertical axis of a central support assembly and/or deployment of a shading object support arms and/or blades) may be minimized allowing free movement of these assemblies. In embodiments, PAN transceivers may be utilized in all communications between PCBs and/or between PCBs and/or components of a shading object. In

embodiments, PAN transceivers may be utilized for communications of shorter durations and/or lower data throughput. In embodiments, for example,

communications from a movement control PCB 895 to a first motor controller 880 may utilize a PAN communication protocol (e.g., PAN transceivers in each device) due to short duration and/or a low data throughput. In embodiments, for example, communications from a movement control PCB 895 to a weather variable PCB 810 may utilize a PAN communication protocol.

[00131] In embodiments, wearable computing devices (e.g., watches, glasses, other clothing articles) may also incorporate PAN technology to communicate with nearby computing devices and exchange digital data and information using an electrical conductivity of a human body as a data network. In embodiments, a person wearing a smart watch may exchange information with an intelligent shading object by coming into contact with a shading object, especially next to a PAN transceiver. A human body's natural salinity makes it a good conductor of electricity. In embodiments, an electric field passes tiny currents, known as picoamps, through contact with another conductive device. In embodiments, a transference of data and/or commands through intra-body contact, such as handshakes, may be known as a linkup. In embodiments, contact between a wearable computing device and another conductive surface (e.g., on a shading object) may complete an electric circuit, and each person's data and/or control information, such as e-mail addresses and phone numbers, may be transferred from a wearable computing device to a shading object and in the other direction (e.g., from a shading object to a wearable computing device). In embodiments, a person's clothing also could act as a mechanism for transferring this data back and forth between a shading object and a wearable computing device.

[00132] In embodiments, an intelligent shading object or umbrella may further comprise a sensor module 750. In embodiments, a sensor module 750 may be connected to a top end of a center support assembly 107. As illustrated in Fig. 7, a sensor module 750 may connect, couple or fasten to a post or other structure on top of an upper assembly 112 of a center support assembly 107. In embodiments, a sensor module 750 may be located on other portions of a shading object, e.g., integrated into a shading fabric 715, attached to or integrated into arms and/or blades, or connected, coupled or attached to a center support assembly 107, a stem assembly 106, and/or a base assembly 105. In embodiments, a sensor module 750 may screw into a recess on a top of a center support assembly, or alternatively may snap onto a top of a center support assembly 107.

[00133] In embodiments, a sensor module 750 may comprise a telemetry PCB 705 and a weather-related PCB. A telemetry PCB 705 may also be referred to as a GPS solar tracking module. In embodiments, a telemetry PCB may comprise a GPS/GNSS sensor 706 and/or a digital compass 707. In embodiments, a telemetry PCB 705 may be powered by a rechargeable battery 820 and/or DC-to-DC converters, or by a battery located on a telemetry PCB. In embodiments, a GPS receiver 706 may communicate with GPS/GNSS satellites, receive positioning signals from satellites, and calculate a latitude and/or longitude of a shading object. In embodiments, a GPS receiver 706 may receive latitude, longitude and/or altitude readings from GPS/GNSS satellites. In embodiments, a GPS receiver 706 may also determine an altitude of a shading object from signals communicated from GPS/GNSS satellites. In embodiments, GPS receiver measurements and/or calculations may be utilized by a shading object to determine movements

necessary by different electromechanical assemblies of a shading object. For example, a movement control PCB 895 may receive GPS receiver measurements (e.g., longitude, latitude, and/or altitude measurements), analyze and/or process these measurements, and determine necessary movements by a stem assembly 106, a center support assembly 107, and/or arm support assembly 108. In embodiments, a movement control PCB 895 may communicate commands, signals, and/or instructions to a first motor controller PCB 880 (azimuth), a second motor controller 885 (elevation), and/or a third motor controller PCB 890 (actuation) to cause movements of a stem assembly 106, a center support assembly 107, and/or arm support assembly 108. [00134] In embodiments, a sensor module 750 may comprise a digital compass 707, which may measure magnetic fields surrounding a shading object and may generate a directional reading and/or an angle of a direction heading (e.g., a degree heading from true north). In embodiments, these directional and/or angular readings may be communicated to a motion control PCB 895. For example, a movement control PCB may receive digital compass 807 measurements or values, analyze and/or process these measurements or values, and determine necessary movements in response to heading or directional information by a stem assembly 106, a center support assembly 107, and/or arm support assembly 108. In embodiments, a movement control PCB may communicate commands, signals, and/or instructions to a first motor controller PCB 880 (azimuth), a second motor controller 885

(elevation), and/or a third motor controller PCB 890 (actuation) to cause

movements of a stem assembly 106, a center support assembly 107, and/or arm support assembly 108. In embodiments, a telemetry PCB may be utilized infrequently because a shading object may not be moved from one geographical location to another. Thus, GPS information (latitude, longitude, and/or altitude) and/or heading information (from a digital compass) may not change frequently. Accordingly, a telemetry circuit PCB 805 may comprise a low-power processor. In embodiments, a telemetry PCB 805 (and a GPS receiver 806 and/or digital compass 807) may only be utilized during configuration and/or calibration of a shading object. During configuration and/or calibration of a shading object (or after a shading object or umbrella has been moved), GPS and digital compass

measurements may be requested and after communication of these

measurements, a movement control PCB 895 may analyze measurements, calculate elevation and azimuth movements for an intelligent shading object, and communicate instructions, commands and/or signals to respective motor assemblies. In embodiments, a digital compass 807 may be utilized more frequently than a GPS receiver 806. [00135] In embodiments, a sensor module 750 may comprise a weather variable PCB 810. In embodiments, a weather variable PCB may be located in another assembly of a shading object (e.g., a stem assembly 106, a center support assembly 107, and/or arm support assembly 108). In embodiments, a weather variable PCB 810 may also be referred to as a microclimate data module. In embodiments, a weather variable PCB 810 may comprise a processor/controller, a memory, one or more air quality sensors 811, one or more UV radiation sensors 812, one or more digital and/or analog barometers 813, one or more temperature sensors 814, one or more humidity sensors 816, and/or one or more wind speed sensors 817. In embodiments, a solar power charging assembly 830 may provide power (e.g., voltage and/or current) to a weather variable PCB 810 and/or components located thereon. In embodiments, a battery (e.g., rechargeable battery) 820 may provide power to a weather variable PCB and components located thereon.
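
As a non-limiting illustration, the Python sketch below shows one way a required azimuth rotation could be derived from a digital compass heading and a target solar azimuth (which would itself be calculated from GPS-derived position and time). The function and example values are hypothetical and do not reproduce the actual calculations of the embodiments above.

    # Illustrative sketch only; inputs and example values are hypothetical.
    def azimuth_rotation_needed(current_heading_deg, target_solar_azimuth_deg):
        """Smallest signed rotation, in degrees, to align the shading object with the sun."""
        delta = (target_solar_azimuth_deg - current_heading_deg + 180.0) % 360.0 - 180.0
        return delta

    # e.g., compass reads 90 degrees and the sun sits at 250 degrees: rotate +160 degrees
    print(azimuth_rotation_needed(90.0, 250.0))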

[00136] In embodiments, sensor readings, measurements, and values

communicated by sensors to a weather variable PCB in a sensor module 750 may be communicated directly or indirectly to a movement control PCB 895 and then directly or indirectly to an integrated computing device 860. In embodiments, sensor readings, measurements, and values communicated by a sensor module 750 may be communicated directly or indirectly to an integrated computing device 860. In embodiments, sensor readings, measurements and/or values may be stored in a memory of a shading object computing device and/or a memory coupled thereto. In embodiments, a memory storing sensor reading measurements may be non-volatile and/or volatile. In embodiments, a shading object computing device 860 may communicate sensor readings to external computing devices via wireless communication protocols (e.g. , WiFi) in order to minimize usage of storage on a shading object computing device. In embodiments, external devices storing sensor information may include application servers and/or databases, cloud servers and/or databases, and other offsite storage devices. In embodiments, storing of sensor readings on either a shading object computing device and/or external computing devices allows a shading object sensor reading history to be created and/or maintained. [00137] In embodiments, a sensor module 750 may comprise an air quality sensor 81 1 . In embodiments, an air quality sensor 81 1 may provide ozone measurements, particulate matter measurements, carbon monoxide

measurements, sulfur dioxide measurements and/or nitrous oxide measurements. In embodiments, an air quality sensor 811 may provide allergen measurements. In embodiments, ozone readings, for example, may be utilized to intelligently advise an individual to go inside. In

embodiments, a weather variable PCB 810 may receive measurements and/or readings from an air quality sensor 811 and may communicate these

measurements to a movement control PCB 895. In embodiments, a movement control PCB 895 may receive air quality sensor measurements, analyze the measurements, and cause shading object assemblies and/or components to react to air quality measurements. In embodiments, for example, if an air quality is too low, e.g. , as compared to an existing threshold, a movement control PCB 895 may communicate commands, instructions and/or signals to an audio system to alert a user of unsafe conditions. In embodiments, for example, ozone measurements received by a movement control PCB 895 from an air quality sensor may be utilized to determine an amount of time an individual should be outside, and this amount of time may be communicated to an individual via a sound system (communicated audibly), via a display and/or monitor, and/or wirelessly to an external computing device.
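
By way of a non-limiting illustration, the Python sketch below shows one way an ozone reading could be mapped to a rough outdoor-time recommendation of the kind described above. The ozone scale, thresholds, and messages are hypothetical example values.

    # Illustrative sketch only; thresholds and messages are hypothetical.
    def outdoor_time_recommendation(ozone_ppb):
        """Map an ozone reading to a rough recommended outdoor exposure time."""
        if ozone_ppb < 55:
            return "air quality good; no time limit suggested"
        if ozone_ppb < 85:
            return "moderate ozone; consider limiting time outside to about two hours"
        return "high ozone; alert user and suggest going inside"

    print(outdoor_time_recommendation(ozone_ppb=92))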

[00138] In embodiments, a sensor module 750 may comprise an ultraviolet (UV) radiation sensor 812. In embodiments, a UV radiation sensor may provide discrete radiation band measurements, including, but not limited to, UVB radiation, UVA radiation, infrared light, or a combination of any and all of these radiation measurements. In embodiments, a weather variable PCB 810 may receive measurements and/or readings from a UV sensor 812 and may communicate these measurements to a movement control PCB 895. In embodiments, for example, UV radiation measurements received by a movement control PCB 895 from a UV sensor 812 may be utilized to determine and/or calculate an amount of time an individual should be outside, and this amount of time may be communicated to an individual via a sound system (communicated audibly), via a display and/or monitor, and/or wirelessly to an external computing device. [00139] In embodiments, a sensor module 750 may comprise a digital barometer 813. In embodiments, a digital barometer may provide, measure, and/or

display complex atmospheric data more accurately and quickly than prior barometers. Many digital barometers display both current barometric readings and previous 1 -, 3-, 6-, and 12-hour readings in a bar chart format, much like a barograph. They also account for other atmospheric readings such as wind and humidity to make accurate weather forecasts. In embodiments, a weather variable PCB 810 may receive measurements and/or readings from a digital barometer 813 and may communicate these measurements to a movement control PCB 895. In embodiments, for example, a movement control PCB 895 may receive digital barometer measurements (e.g. , altitude measurements), analyze and/or process these measurements, and determine necessary movements by a stem assembly 106, a center support assembly 107, and/or arm support assembly 108. In embodiments, a movement control PCB 895 may communicate commands, signals, and/or instructions to a first motor controller PCB 880 (azimuth), a second motor controller 885 (elevation), and/or a third motor controller PCB 890 (actuation) to cause movements of a stem assembly 106, a center support assembly 107, and/or arm support assembly 108. In embodiments, for example, a movement control PCB 895 and/or an integrated computing device 860 may receive digital barometer measurements and generate a weather forecast for an area being served by a shading object and/or umbrella.

[00140] In embodiments, a sensor module 750 may comprise a temperature sensor 814. In embodiments, a temperature sensor 814 may generate and provide a temperature reading for a shading object environment. In embodiments, a weather variable PCB 810 may receive measurements and/or readings from a temperature sensor 814 and may communicate these measurements to a movement control PCB 895. In embodiments, for example, temperature

measurements received by a movement control PCB 895 from a temperature sensor 814 may be utilized to determine and/or calculate an amount of time an individual should be outside, and this amount of time may be communicated to an individual via a sound system (communicated audibly), via a display and/or monitor, and/or wirelessly to an external computing device. [00141] In embodiments, a sensor module 750 may comprise a humidity sensor 816. In embodiments, a humidity sensor 816 may provide humidity measurements in an environment where a shading object is located. In embodiments, a weather variable PCB 810 may receive measurements and/or readings from a humidity sensor 816 and may communicate these measurements to a movement control PCB 895. In embodiments, for example, humidity measurements received by a movement control PCB 895 from a humidity sensor 816 may be utilized to determine and/or calculate an amount of time an individual should be outside, and this amount of time may be communicated to an individual via a sound system (communicated audibly), via a display and/or monitor, and/or wirelessly to an external computing device. In embodiments, a movement control PCB 895 may receive humidity sensor readings and/or temperature sensor readings and determine that a misting system and/or cooling system should be activated. In embodiments, a movement control PCB 895 may generate commands, instructions and/or signals and communicate the same to a misting system 1420 and/or a misting system controller to activate a misting and/or cooling system to deal with high humidity and/or high temperature environments and/or situations.
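
As a non-limiting illustration, the Python sketch below shows one way humidity and temperature readings might be combined to decide whether the misting and/or cooling system should be activated, as described above. The threshold values and misting-system interface are hypothetical placeholders.

    # Illustrative sketch only; thresholds and interface are hypothetical.
    TEMP_THRESHOLD_F = 95.0
    HUMIDITY_THRESHOLD_PCT = 70.0

    def maybe_activate_misting(temperature_f, humidity_pct, misting_system):
        """Activate the misting/cooling system when readings exceed the thresholds."""
        if temperature_f > TEMP_THRESHOLD_F or humidity_pct > HUMIDITY_THRESHOLD_PCT:
            misting_system.activate()
            return True
        return False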

[00142] In embodiments, a sensor module 750 may comprise a wind sensor 817. In embodiments, a wind speed sensor 817 may provide wind speed and/or wind direction information at a top of a shading object or umbrella, and/or at a middle of a shading object and/or umbrella. In embodiments, a weather variable PCB 810 may receive measurements and/or readings from a wind sensor 817 and may communicate these measurements to a movement control PCB 895. In

embodiments, for example, a movement control PCB 895 may receive wind speed measurements, analyze and/or process these measurements, and determine necessary movements by a stem assembly 106, a center support assembly 107, and/or arm support assembly 108. In embodiments, a movement control PCB 895 may communicate commands, signals, and/or instructions to a first motor controller PCB 880 (azimuth), a second motor controller 885 (elevation), and/or a third motor controller PCB 890 (actuation) to cause movements of a stem assembly 106, a center support assembly 107, and/or arm support assembly 108. In embodiments, if a wind speed is higher than a predetermined threshold, a movement control PCB 895 may communicate commands, instructions, and/or signals to motor controllers to cause a shading object to be retracted and moved to a rest position. In embodiments, a wind sensor 817 may also be mounted on or integrated into a center support assembly 107 and utilized in the same fashion as described above. [00143] In embodiments, a shading object may comprise one or more digital cameras 857. In embodiments, one or more digital cameras 857 may comprise an optical system and/or an image generation system. In embodiments, digital cameras 857 may display images on a screen immediately after being captured. In embodiments, one or more digital cameras 857 may store and/or delete images from a memory associated with a digital camera. In embodiments, one or more digital cameras 857 may capture and/or record moving videos with or without sound. In embodiments, digital cameras 857 may also incorporate computer-readable and computer-executable instructions which, when retrieved from a non-volatile memory, loaded into a memory, and executed by a processor, may crop and/or stitch pictures, and/or potentially perform other image editing on captured images. In embodiments, a central support assembly 107 may comprise one or more cameras 857. In embodiments, a center support assembly 107 may comprise four cameras, with each camera installed approximately 90 degrees from another. In embodiments, digital cameras 857 may each be able to move in a canister or rotate in order to capture 360 degrees around a shading object. In embodiments, one or more cameras 857 may also be located on a top portion of a shading object (e.g., located on and/or within a sensor module positioned on top of a center support assembly 107, located on top of an arm/blade 108, or located on a shading fabric 715). [00144] In embodiments, one or more digital cameras 857 may be utilized as security cameras. In embodiments, for example, one or more digital cameras 857 may capture images in an environment in which a shading object is installed and/or located. For example, if a shading object is rotating around a vertical axis (e.g., the shading system (stem assembly 106 and center support assembly 107) is rotating about a base assembly 105), a camera 857 may capture images and/or real-time video. In embodiments, one or more digital cameras may capture images and/or real-time video and may communicate images and/or video to a memory located on a shading object computing device 860 within a center support assembly 107. In embodiments, one or more digital cameras 857 may capture images and/or real-time video of an environment where a shading object or umbrella is located (up to a 360 degree picture) and may communicate images and/or video to a memory located on a motion control PCB 895.
In embodiments, images or real-time video may be communicated and/or streamed to a wireless transceiver in an integrated computing device 860 and/or associated computing device PCB. In embodiments, continuing with this example, images and/or real-time video communicated to a shading object computing device may be stored in a memory (e.g., volatile and/or non-volatile memory) of a shading object computing device 860. In embodiments, continuing with this illustrative embodiment, images and/or real-time video may be communicated via a wireless transceiver and/or wireless hotspot to external computing devices (e.g., application servers, databases, network servers) or other devices on a global communications network (e.g., the World Wide Web and/or the Internet).

[00145] In embodiments, one or more digital cameras may capture an image of an individual within a shading area of the shading object. In embodiments, the digital camera 857 may communicate images and/or videos, with or without sound, to an integrated computing device 860 (or alternatively a motion control PCB). In embodiments, computer-readable and computer-executable instructions may be retrieved from a non-volatile memory, loaded into a volatile memory and executed by a processor to execute or initiate a facial recognition process or method. In embodiments, a computing device facial recognition process may compare a communicated image to existing images stored in a memory (of a shading object computing device) to determine if an individual whose face was captured is an authorized shading object user. If a facial recognition process determines an individual is an authorized user, a shading object computing device 860 may generate commands, instructions and/or signals and communicate the generated commands, instructions, and/or signals to a motion control PCB. In embodiments, in response, a processor on a motion control PCB 895 or a processor on an integrated computing device may retrieve personalized settings, from a memory, for an individual and cause shading object assemblies and/or components to be moved and/or activated in conformance with retrieved personal settings. For example, an individual may have an existing stored audio playlist and, upon facial recognition, a processor may initiate playing of the personal playlist utilizing an audio system and speakers 875. This may occur even if a playlist is on an external device (e.g., a portable music player and/or portable electronic device). As another illustrative example, an individual may have specific settings stored in a memory for stem assemblies 106, base

assemblies 105, center support assemblies 107 (e.g., position upper assembly 112 of a center support assembly 107 at an angle of 20 degrees from a lower assembly 113 of a center support assembly), and/or arm support assemblies 108. As an additional example, an individual may have a personalized setting, stored in a memory, that activates a misting system if a temperature reading and/or humidity reading is above a threshold and an individual is located within a shading area. In embodiments, a motion control PCB 895 may communicate activation instructions, commands, and/or signals to a misting system to address the high humidity situation.
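
By way of a non-limiting illustration, the Python sketch below shows one way such a personalization flow might be organized once a face has been recognized. The recognizer, settings store, and umbrella interfaces, and the example settings, are hypothetical placeholders rather than the actual facial recognition process of the embodiments above.

    # Illustrative sketch only; recognizer, settings store, and interfaces are hypothetical.
    def apply_personal_settings(captured_image, recognizer, settings_store, umbrella):
        """Look up a recognized authorized user and apply stored personal settings."""
        user_id = recognizer.identify(captured_image)    # returns None for unknown faces
        if user_id is None:
            return False                                 # not an authorized shading object user
        settings = settings_store.get(user_id, {})
        if "elevation_angle" in settings:
            umbrella.set_elevation(settings["elevation_angle"])    # e.g., 20 degrees
        if "playlist" in settings:
            umbrella.audio.play(settings["playlist"])              # personal audio playlist
        if settings.get("misting_on_heat") and umbrella.temperature_f() > 95:
            umbrella.misting.activate()                            # personalized misting setting
        return True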

[00146] In embodiments, one or more of the digital cameras 857 may comprise an infrared detector. In embodiments, an infrared detector may comprise one or more infrared light sources and an infrared sensor. In embodiments, an infrared detector may generate a signal indicating that an object is located within an area being monitored or viewed by an infrared detector. In embodiments, if an infrared detector generates a signal indicating that an object (and/or individual) is present, a camera 857 may be activated and begin to capture images and/or video, with or without sound, and transmit captured images and/or video, with or without sound, to a shading object computing device 860. In embodiments, if an infrared detector generates a signal indicating that an object (and/or individual) is present, a lighting assembly (e.g., LED lights) 870 may also be activated and lights may be directed in an area surrounding an intelligent shading object and/or directly to an area where an object is detected. In embodiments, one or more cameras 857 and/or one or more lighting assemblies 870 may be activated, which results in better images and/or video of an area surrounding a shading object being captured and/or communicated to a shading object computing device. This is yet another example of how an intelligent shading object provides additional benefits of not only capturing images of its surrounding area but also being utilized as a security device for an environment in which an intelligent shading object is located.
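
As a non-limiting illustration, the Python sketch below shows one way an infrared-detection event might trigger the lighting and camera behavior described above. The detector, camera, lighting, and storage interfaces, along with the 30-second clip length, are hypothetical placeholders.

    # Illustrative sketch only; interfaces and the clip length are hypothetical.
    def on_infrared_detection(object_present, camera, lighting, computing_device):
        """When the infrared detector reports an object, light the area and record video."""
        if not object_present:
            return
        lighting.turn_on()                      # lighting assembly 870 improves captured images
        clip = camera.capture_video(seconds=30, with_sound=True)    # camera 857
        computing_device.store(clip)            # shading object computing device 860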

[00147] In embodiments, one or more cameras 857 may be thermal imaging cameras. In embodiments, a thermal imaging camera comprises a special lens that may focus on infrared light emitted by all objects within an area surrounding and/or adjacent to an intelligent shading object or umbrella. In embodiments, a focused light may be scanned by a phased array of infrared-detector elements. In

embodiments, one or more detector elements may generate a very detailed temperature pattern, which may be referred to as a thermogram. In embodiments, a detector array may take about one-thirtieth of a second to obtain temperature information to make a thermogram. In embodiments, information may be obtained from a plurality of points in a field of view of a detector array. In embodiments, detector elements from a thermogram may be converted and/or translated into electric impulses and electrical impulses may be sent to a signal-processing unit. In embodiments, a signal-processing unit may be a PCB with a dedicated chip that translates received information (electrical impulses) into thermal images and/or thermal video. In embodiments, a signal-processing unit may communicate thermal images and/or thermal video either to a display (e.g., a shading object display and/or a display on a computing device communicating with an intelligent shading object). In embodiments, a signal-processing unit of a thermal imaging camera 857 may communicate thermal images and/or thermal video to a shading object computing device 860 for analysis, storage and/or retransmission to external computing devices. In embodiments, a thermal image may appear as various colors depending on and/or corresponding to an intensity of an infrared image. In embodiments, a thermal imaging camera allows additional benefits of not having to activate a lighting assembly in order to capture images and/or videos of an area surrounding an intelligent shading umbrella or object. In addition, by not activating a lighting assembly 870, an intruder or moving object may not be aware that a digital camera 857 may be capturing an image or video of an area where an intruder or object is located. In embodiments, an infrared detector may activate a thermal imaging camera 857 upon detection of movement. In embodiments, a thermal imaging camera may activate on its own due to movement of an intruder and/or object, or may be periodically or continuing capturing images and/or video.

[00148] Figure 10 illustrates a shading object or umbrella integrated computing device according to embodiments. In embodiments, an integrated computing device PCB 1000 may comprise a WiFi or wireless LAN transceiver 1010 (which may or may not operate as a wireless hotspot and/or router), a separate wireless hotspot device 1015, one or more audio/video transceivers 1020 (e.g., PAN transceivers), one or more processors 1025, one or more non-volatile memories 1030 and one or more memory components 1035. In embodiments, many of the components may reside on a computing device PCB. In embodiments, a separate PCB may house or have some of the above-listed components (e.g., WiFi transceiver 1010, wireless hotspot device 1015) mounted thereon, and a shading object computing device may comprise non-volatile memory 1030 (e.g., a flash drive, a hard drive, a removable disk drive), a volatile memory 1035 such as RAM, and one or more processors 1025.

[00149] In embodiments, computer-readable and/or computer-executable instructions may be stored in non-volatile memory, fetched by one or more processors 1025, loaded into RAM 1035, and executed by one or more processors 1025 to perform data intensive functions, execute processes such as a healthcare process (e.g., selecting a healthcare option from a dashboard of a mobile application), a security process (e.g., selecting a security option from a dashboard of a mobile application), an energy process or application (e.g., selecting an energy option from a dashboard of a mobile application), a weather application or process (e.g., selecting a weather option from a dashboard of a mobile application), and/or communicating with external devices (e.g., wireless access points, portable electronic devices, servers, networks). In embodiments, an integrated computing device 860 and/or a computing device PCB may consume more power due to higher data throughput and higher utilization time. Having a computing device integrated into an intelligent shading object or umbrella provides a benefit, over prior art shading objects or umbrellas, of allowing an intelligent shading object to run software applications, communicate with data intensive devices, such as cameras and/or an audio system, utilize WiFi or other wireless communication transmissions, operate as a WiFi hotspot (or other wireless communication hub), and communicate with external computing devices to transfer data obtained by the intelligent shading object.

[00150] In embodiments, an integrated computing device 1000 may communicate with application servers, mobile application servers, proxy servers, and/or other computing devices on a global communications network (e.g., the Internet). In embodiments, a shading object computing device may handle data and/or command communications between external devices and a shading object. In embodiments, an integrated computing device 860 may handle intra-shading object communications requiring more extensive processing power and/or higher data transfer rates. In embodiments, a shading object center support assembly 107 may house an integrated computing device. In embodiments, a center support assembly 107 may also house a computing device PCB to which a computing device 860 may be attached and/or connected.

[00151] In embodiments, an integrated computing device 860 may be a Linux-based computing device (e.g., Raspberry PI), although other operating systems and/or other processor types may be utilized. In embodiments, a shading object may comprise one or more transceivers to communicate with wireless access points utilizing a wireless communication protocol. In embodiments, one or more wireless transceivers may communicate voice and/or data communications to an access point, which in turn may communicate received voice and/or data communications to a packet-switched network (e.g., a global communications network such as the Internet, an intranet, or a private network) or a circuit-switched network (such as an existing telecommunications system).

[00152] In embodiments, an integrated computing device may comprise a WiFi (or wireless LAN) transceiver 1010 which may also operate as a hotspot and/or personal wireless access point. In embodiments, an integrated computing device 860 may comprise a separate and/or additional wireless hotspot 1015. In embodiments, a wireless hotspot may operate as a wireless access point providing network and/or Internet access to portable electronic devices (e.g., smartphones, music players) or other electronic devices (personal computers and/or laptops) in public locations where other wireless access points are not located (or are being utilized for different purposes). If a computing device 860 comprises a wireless hotspot 1015 (or a wireless transceiver 1010 is operating as a hotspot), wireless communication devices (e.g., laptops, tablets, smartphones) may utilize a shading object as a communications hub. This may be beneficial in remote locations where no wireless access points are located, or in locations where wireless data or voice communications have been interrupted. In addition, if a shading object computing device and thus a shading object includes a wireless hotspot, image or video streaming, face-timing, application downloads, or other data intensive functions and/or applications may execute and be completed in a shorter amount of time than when using a PAN transceiver 865.

[00153] In embodiments, an integrated computing device 860 may store and/or execute shading object or umbrella application software, which may be referred to as SMARTSHADE and/or SHADECRAFT application software. In embodiments, shading object or umbrella application software may be run and/or executed on a variety of computing devices including a computing device integrated within a shading object or umbrella. In embodiments, for example, shading object or umbrella application software may include computer-readable instructions being stored in non-volatile memories of a shading object computing device, a portable electronic device (e.g., a smart phone and/or a tablet), an application server, and/or a web application server, all of which interact and communicate with each other. In embodiments, computer-readable instructions may be retrieved from memories (e.g., non-volatile memories) of these above-identified computing devices, loaded into volatile memories and executed by processors in the shading object computing device, portable electronic device, application server, and/or mobile application server. In embodiments, a user interface (and/or graphical user interface) for a shading object software application may be presented on a portable electronic device, although other computing devices could also execute instructions and present a graphical user interface (e.g., dashboard) to an individual. In embodiments, shading object application software may generate and/or display a dashboard with different application (e.g., process) selections (e.g., weather, health, storage, energy, and/or security processes and/or applications). In embodiments, shading object application software may control operation of a shading object, communicate with and receive communications from shading object assemblies and/or components, analyze information obtained by assemblies and/or components of a shading object or umbrella, integrate with existing home and/or commercial software systems, store personal data generated by the shading object, and communicate with external devices.

[00154] In embodiments, a portable electronic device may also comprise a mobile application stored in a non-volatile memory. In embodiments, a mobile application may be referred to as a SHADECRAFT or a SMARTSHADE mobile application. In embodiments, a mobile application (mobile app) may comprise instructions stored in a non-volatile memory of a portable electronic device, which can be executed by a processor of a portable electronic device to perform specific functionality. In embodiments, this functionality may be controlling of, interacting with, and/or communicating with a shading object. In embodiments, mobile apps may provide users with services similar to those accessed on personal computers and may be individual software units with limited or specific function. In embodiments, applications may be available for download from mobile application stores, such as Apple's App Store. In embodiments, a mobile app may be known as an app, a Web app, an online app, an iPhone app or a smartphone app.

[00155] In embodiments, a mobile app may communicate with a mobile application server and/or also an application server. In embodiments, an application server may be a backend server. In embodiments, application servers may consist of components, such as web server connectors, computer programming languages, runtime libraries, database connectors, and administration software code, which may be utilized to deploy, configure, manage, and connect these components on a web host. In embodiments, an application server may run and/or execute behind a web server (e.g., an Apache or Microsoft IIS web server) and may run and/or execute in front of an SQL database (e.g., a PostgreSQL database, a MySQL database, or an Oracle database). In embodiments, web software applications may be executable computer instructions which run and/or execute on top of application servers, and are written in computer programming language(s) an application server may support. In embodiments, web software applications may call runtime libraries and/or components an application server may offer. In embodiments, an application server may be referred to as a SMARTSHADE application server and/or a SHADECRAFT application server.

[00156] In embodiments, a mobile app server may be utilized in between a mobile app and an application server. In embodiments, a mobile application server may be mobile middleware software that interfaces with back-end systems (e.g., application servers) to allow the mobile applications to communicate and interface with the application servers. In embodiments, a mobile app server may bridge a gap from existing infrastructure (e.g., application servers and/or networks and/or databases) to portable electronic devices (e.g., mobile devices). In embodiments, mobile application servers may take care of security, data management and other off-line requirements in order to minimize a load on application servers. In embodiments, a mobile application server may be referred to as a SHADECRAFT and/or SMARTSHADE mobile application server.

[00157] In embodiments, a SMARTSHADE and/or SHADECRAFT application software may comprise one or more application components and/or modules which may provide a user and/or individual with different features and/or functionality. For example, in embodiments, a SMARTSHADE and/or SHADECRAFT application software or system may comprise a personal care component and/or module, a shading object operation component and/or module, a shading object accessory commerce component, an e-commerce component and/or module, and a security / monitoring component and/or module (e.g., like a connection to an Internet of Things). In embodiments, a SMARTSHADE and/or SHADECRAFT application software or system may also provide storage or access to storage for an individual's personal information, preferences, digital products (e.g., movies, pictures, and/or music), and/or security information. In embodiments, SMARTSHADE and/or SHADECRAFT application software may refer to software (e.g., computer-readable instructions) executing on a portable electronic device (e.g., a mobile device such as a smart phone), an integrated computing device (in an intelligent shading object), an application server, a cloud server, and/or a mobile application server. In embodiments, different portions, components, and/or modules of the SHADECRAFT application software may be located and executing on different devices and a user may be interacting with one or more of the devices.

[00158] In embodiments, SMARTSHADE and/or SHADECRAFT application software (e.g., mobile application software) may comprise one or more application components and/or modules which may provide a user and/or individual with different features and/or functionality. For example, in embodiments, a

SMARTSHADE and/or SHADECRAFT application software or system may comprise a personal care component and/or module; a shading object, umbrella, or shading charging system operation component and/or module; a shading object, umbrella or shading charging system accessory commerce component or module; a shading object, umbrella or shading charging system e-commerce component and/or module; and a shading object, umbrella or shading charging system security / monitoring component and/or module (e.g. , which may include a connection to an Internet of Things). In embodiments, a SMARTSHADE and/or SHADECRAFT application software or system (e.g. , mobile application software) may also provide storage or access to storage for an individual's personal information, preferences, device settings, digital products (e.g. , movies, pictures, and/or music), and/or security information.

[00159] In embodiments, SMARTSHADE and/or SHADECRAFT application software may refer to software (e.g., computer-readable instructions) being executed by a processor on one or more of a mobile computing device (e.g., a smart phone or a tablet), an integrated computing device in an intelligent shading object, umbrella or shading charging system, an application server, a cloud server, and/or a mobile application server, or any combination thereof. In embodiments, different portions, components, and/or modules of the SMARTSHADE application software may be located and executing on different devices and/or systems (e.g., a mobile computing device; a shading object, umbrella, or shading charging system integrated computing device; an application server; a cloud server; a mobile application server), and a user may be interacting and/or interfacing with one or more of the devices.

[00160] In embodiments, SMARTSHADE mobile application software (e.g. , computer-readable instructions executed by a processor of a mobile

communications device) may control operations of a shading object, intelligent umbrella and/or shading charging system. In embodiments, SMARTSHADE mobile application software may receive input and communicate messages, instructions, commands and/or signals directly and/or indirectly to a shading object, intelligent umbrella and/or shading charging system via wireless communication technologies.

[00161] In embodiments, a mobile computing device (e.g., smart phone) may communicate messages, instructions, commands and/or signals wirelessly directly to an intelligent shading object, umbrella, and/or shading charging system via a PAN (e.g., Bluetooth) transceiver and/or WiFi transceiver. In other words, in embodiments, messages, instructions, commands and/or signals from the mobile communications device may be communicated directly to a corresponding PAN transceiver and/or WiFi transceiver on an intelligent shading object, umbrella, and/or an intelligent shading charging system.

[00162] In embodiments, a mobile computing device (e.g., smart phone) may communicate instructions, messages, commands and/or signals to an application server and/or a mobile application server, which in turn may communicate instructions, messages, commands and/or signals to an intelligent shading object, umbrella and/or intelligent shading charging system. In embodiments,

the SMARTSHADE mobile application may receive input regarding rotation of an object, umbrella and/or shading charging system about an azimuth axis, and may communicate messages, instructions, commands and/or signals to the umbrella to cause the umbrella to rotate a support assembly (e.g., central support assembly) and/or a stem assembly about an azimuth axis as is illustrated at least by reference number 140 in Figure 1B.
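As a hedged illustration of the azimuth-rotation command described in this paragraph, the following sketch composes a rotation message and shows both a direct (PAN/WiFi) path and an application-server path; the message fields, function names, and server URL are hypothetical and not part of the disclosed embodiments.

```python
# Illustrative sketch of composing an azimuth-rotation command from a mobile
# application. Message fields and transport helpers are assumptions only.
import json

def build_rotation_command(azimuth_degrees, assembly="stem"):
    """Build a rotation command message for the intelligent umbrella."""
    return {
        "command": "rotate_azimuth",
        "target_assembly": assembly,     # e.g., stem or center support assembly
        "degrees": float(azimuth_degrees),
    }

def send_direct(message):
    """Stand-in for a direct PAN (e.g., Bluetooth) or WiFi transmission."""
    payload = json.dumps(message).encode("utf-8")
    print("direct send:", payload)

def send_via_server(message, server_url="https://example-mobile-app-server/commands"):
    """Stand-in for relaying the command through a mobile application server."""
    print("relayed to", server_url, ":", json.dumps(message))

if __name__ == "__main__":
    cmd = build_rotation_command(140)
    send_direct(cmd)       # direct path to the umbrella's transceiver
    send_via_server(cmd)   # indirect path through an application server
```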

[00163] In embodiments, as described above and below, input may be received via a touchscreen (e.g., a gesture, swipe, tap and/or other movement), via a camera (e.g., gesture recognition), via a user's voice, a keyboard, a mouse, a trackball, or other physical mobile computing device input device. Similarly, SMARTSHADE mobile application software may receive input regarding the following features and may communicate messages, instructions, commands and/or signals directly and/or indirectly to the umbrella, which results in the umbrella performing, initiating and/or activating the selected component, assembly and/or function:

a. Rotate a lower support assembly (and remainder of an intelligent shading charging system) within a base assembly about an azimuth axis, as illustrated at least by reference number 188 in Figure 1C.
b. Rotate an upper support assembly with respect to a lower support assembly about an elevation axis (or tilting axis), as is illustrated at least by reference number 160 in Figure 1B and similarly by reference number 192 in Figure 1C.
c. Deploy and/or retract arm support assemblies and/or arms/blades, as illustrated at least in Figures 6A, 6B and 7.
d. Activate / deactivate one or more solar panels or solar arrays.
e. Activate / deactivate one or more cameras.
f. Place shading objects, intelligent umbrellas, and shading charging systems into low-power mode.
g. Place a shading object, intelligent umbrella, and shading charging system into emergency power mode.
h. Activate / deactivate one or more selected sensors in a sensor assembly and/or other sensors in systems.
i. Activate / deactivate a proximity detector, motion detector, tilt detector and/or obstacle detector.
j. Activate voice recognition of a shading object, intelligent umbrella and/or shading charging system.
k. Activate / deactivate one or more Bluetooth transceivers.
l. Activate / deactivate one or more WiFi transceivers.
m. Activate / deactivate a cooling system.
n. Activate / deactivate one or more lighting assemblies and/or a lighting system.
o. Activate / deactivate one or more audio transceivers.
p. Activate / deactivate infrared transceivers for remote control.
q. Activate / deactivate DC or USB charger assemblies.
r. Activate / deactivate wind sensors and/or wind turbines of an intelligent shading object, intelligent umbrella, and/or intelligent shading charging system.
s. Activate / deactivate a GPS transceiver and/or digital compass.
t. Activate / deactivate a power tracking solar controller and/or converter.
u. Place a motion control printed circuit board, other printed circuit boards, selected components and/or assemblies into low power mode and/or emergency power mode.

[00164] In embodiments, a mobile computing device (e.g., smart phone) may communicate messages, instructions, commands and/or signals to an application server and/or a mobile application server, which in turn may communicate instructions, commands and/or signals to an intelligent shading object, umbrella and/or intelligent shading charging system to execute and/or initiate processes, software modules and/or other functionality utilizing components, assemblies and/or devices of the shading object, intelligent umbrella and/or shading charging system. In embodiments, as described above and below, input may be received via a touchscreen (e.g., a gesture, swipe, tap and/or other movement), via a camera (e.g., gesture recognition), via a user's voice, a keyboard, a mouse, a trackball, or other physical mobile communication device input device. Similarly, SMARTSHADE mobile application software may receive input regarding the following features and may communicate messages, instructions, commands and/or signals directly and/or indirectly to the umbrella, which results in the umbrella performing, initiating and/or activating selected submodules, processes, and/or software functions described below, e.g., including but not limited to processes described in Figures 15A-I and 16A-D. As discussed above, this software may be initiated by computer-readable instructions stored in a memory of a mobile computing device and executed by one or more processors of a mobile computing device, which presents a user interface (e.g., graphical user interface) including a dashboard where selections of processes may be made. In embodiments, a user interface is generated via a user interface component.
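The following sketch illustrates, under assumed command names, how dashboard selections such as those listed above may be mapped to messages communicated to an intelligent shading object, umbrella, or shading charging system; it is illustrative only and does not define any actual message format.

```python
# Schematic dispatcher with hypothetical command names, mapping dashboard
# selections (rotate, lighting, low-power mode, etc.) to command messages.
COMMAND_MAP = {
    "rotate_azimuth":      {"command": "rotate_azimuth"},
    "tilt_upper_assembly": {"command": "rotate_elevation"},
    "deploy_arms":         {"command": "deploy_arm_support_assemblies"},
    "lights_on":           {"command": "lighting", "state": "on"},
    "lights_off":          {"command": "lighting", "state": "off"},
    "low_power":           {"command": "power_mode", "mode": "low"},
    "emergency_power":     {"command": "power_mode", "mode": "emergency"},
}

def dispatch(selection, transport, **parameters):
    """Look up a dashboard selection and send the corresponding command."""
    base = COMMAND_MAP.get(selection)
    if base is None:
        raise ValueError(f"unknown dashboard selection: {selection}")
    message = dict(base, **parameters)   # merge any user-supplied parameters
    transport(message)                   # direct or server-relayed transmission
    return message

if __name__ == "__main__":
    dispatch("rotate_azimuth", transport=print, degrees=45)
    dispatch("low_power", transport=print)
```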

[00165] Figure 15A illustrates an automated weather process according to embodiments. In embodiments, when implementing a weather process (e.g., executing a weather process from a dashboard of a mobile application), an integrated computing device 860 may leverage shading object, intelligent umbrella and/or intelligent shading charging system sensors and data from other nearby similar devices, and can communicate and/or transfer weather measurements for a microclimate with unprecedented accuracy and improve an individual's understanding of microclimate weather conditions. In embodiments, weather measurements and/or information may be obtained and/or received 1505 from sensors in a sensor module 750 via a weather variables PCB 810 and/or a shading object motion control PCB 895. In embodiments, stored sensor measurements may include air quality measurements, UV measurements, temperature measurements, humidity measurements, wind measurements, and/or barometer measurements. In embodiments, sensor measurements may be stored 1506 in a memory 1030 and/or 1035 of an integrated computing device 1000. Individuals may be presented with localized and microenvironment weather conditions with unprecedented accuracy due to localization of these microclimate measurements. This is an improvement on existing systems where sensor readings were normally obtained in public places.
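A minimal sketch of receiving 1505 and storing 1506 weather sensor measurements is shown below; the database schema and field names are assumptions chosen for illustration, not a disclosed data format.

```python
# Minimal sketch of steps 1505-1506: receiving weather sensor measurements and
# storing them in a memory of the computing device. Schema is an assumption.
import sqlite3
import time

def open_store():
    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE measurements (
                      ts REAL, air_quality REAL, uv_index REAL,
                      temperature_f REAL, humidity_pct REAL,
                      wind_mph REAL, barometer_inhg REAL)""")
    return db

def store_measurements(db, sample):
    """Persist one set of sensor-module readings (step 1506)."""
    db.execute("INSERT INTO measurements VALUES (?, ?, ?, ?, ?, ?, ?)",
               (time.time(), sample["air_quality"], sample["uv_index"],
                sample["temperature_f"], sample["humidity_pct"],
                sample["wind_mph"], sample["barometer_inhg"]))
    db.commit()

if __name__ == "__main__":
    db = open_store()
    store_measurements(db, {"air_quality": 42.0, "uv_index": 6.5,
                            "temperature_f": 78.0, "humidity_pct": 55.0,
                            "wind_mph": 4.2, "barometer_inhg": 29.92})
    print(db.execute("SELECT COUNT(*) FROM measurements").fetchone()[0], "row(s) stored")
```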

[00166] In embodiments, weather thresholds and/or preferred weather conditions may be established and/or set for intelligent shading objects, intelligent umbrellas, and/or intelligent shading charging systems. In embodiments, an intelligent umbrella system may compare 1507 received sensor weather measurements against threshold and/or predefined threshold weather measurements. If the computing device executing the weather process (or software application) determines that these threshold weather measurements and/or weather conditions have occurred, the weather process may communicate 1508 an alert message identifying that thresholds have been met and/or exceeded. In embodiments, an intelligent umbrella software system may communicate an alert or communication in a message, command, instruction and/or signal to a display device for display to a user and/or to a sound reproduction device (e.g., audio subsystem) for playback to a user. In embodiments, for example, users may also set desired weather and/or environment condition parameters that a user may wish to enjoy and/or wish to avoid. In embodiments, a computing device may store the set and/or established environmental condition parameters in a memory and/or database of a computing device of the intelligent umbrella and/or intelligent charging shading system. For example, in embodiments, a user may establish that he or she wishes to be alerted if a temperature is over 70 degrees (so that a user may enjoy the umbrella or shading charging system) and/or if a temperature exceeds 95 degrees (in order for a misting system to be activated to cool down an environment). In addition, for example, parameters may be set so that he or she may be alerted if a wind speed is over 15 miles per hour and/or if a relative humidity is over 60%. Further, for example, a user may wish to be alerted if an air quality reading has a particulate reading or other measurement determined to be unsafe. Likewise, settings may be established which identify conditions under which the user may wish to start utilizing an intelligent umbrella system. For example, a user may wish to start enjoying an outdoor environment and utilize an intelligent shading umbrella if it is after 9:00 am but before 6:00 pm, a temperature is over 65 degrees Fahrenheit, a humidity reading is under 75%, an air quality measurement indicates air with minimal contaminants, and/or a wind reading is less than 10 miles per hour. For example, in embodiments, if one or more of these parameters are met, as determined by received sensor measurements, an intelligent umbrella and/or shading charging system may generate an alert message, command, instruction, and/or signal to alert a user of dangerous and/or desired conditions.
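The threshold comparison 1507 and alert generation 1508 described above may be sketched as follows; the threshold values mirror the examples in this paragraph, while the field names are assumptions made for illustration.

```python
# Hedged sketch of steps 1507-1508: comparing received measurements against
# user-defined thresholds and emitting alert messages.
THRESHOLDS = {
    "temperature_f": {"enjoy_above": 70.0, "mist_above": 95.0},
    "wind_mph":      {"alert_above": 15.0},
    "humidity_pct":  {"alert_above": 60.0},
}

def evaluate(sample):
    """Return a list of alert strings for any threshold that is met or exceeded."""
    alerts = []
    t = sample.get("temperature_f")
    if t is not None and t > THRESHOLDS["temperature_f"]["mist_above"]:
        alerts.append("temperature above 95F: misting system activation recommended")
    elif t is not None and t > THRESHOLDS["temperature_f"]["enjoy_above"]:
        alerts.append("temperature above 70F: conditions suitable for use")
    if sample.get("wind_mph", 0.0) > THRESHOLDS["wind_mph"]["alert_above"]:
        alerts.append("wind speed above 15 mph")
    if sample.get("humidity_pct", 0.0) > THRESHOLDS["humidity_pct"]["alert_above"]:
        alerts.append("relative humidity above 60%")
    return alerts

if __name__ == "__main__":
    for alert in evaluate({"temperature_f": 97.0, "wind_mph": 18.0, "humidity_pct": 40.0}):
        print("ALERT:", alert)   # would be sent to a display or audio subsystem
```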

[00167] In embodiments, a weather process executing on a computing device of an intelligent umbrella or an intelligent shading charging system may also predict weather conditions for an upcoming period of time. Figure 15B illustrates predicting weather conditions in a weather process according to embodiments. In embodiments, a computing device may communicate 1510 with an access point (or its own hotspot access point), which in turn communicates with a weather-related application server or weather-related mobile application server to obtain weather forecasts for an area. Information regarding weather forecasts (sun intensity, cloud cover, atmospheric conditions, air quality, etc.) for a geographic area where a shading object, intelligent umbrella or intelligent shading charging system is installed may be downloaded 1511 to a memory 1030 or 1035 of a computing device in a shading object, intelligent umbrella or intelligent shading charging system.

[00168] In embodiments, the computing device may integrate 1512 the communicated weather forecasts, date information and/or received sensor measurements. In embodiments, the computing device may generate weather predictions 1513 for the microclimate based on the integrated information. In addition, a weather process may also alert 1514 individuals and/or users as to inclement conditions before these conditions occur.

[00169] Figure 15C illustrates a weather data gathering process on a periodic basis according to embodiments. In embodiments, an intelligent umbrella system and/or shading charging system may receive 1520 weather sensor measurements from sensors in, for example, a sensor module. In embodiments, an intelligent umbrella system may receive weather sensor measurements on a periodic basis from, for example, a sensor module. For example, an intelligent umbrella system may receive weather sensor measurements every minute, hourly, every six hours and/or daily. In embodiments, sensor measurements may be pulled from a sensor module and/or pushed from a sensor module. In embodiments, the plurality of sensor measurements may be stored 1521 in a memory of a computing device. In embodiments, sensor measurements may be stored in database records in a computing device memory. In embodiments, a computing device in an intelligent shading umbrella and/or charging system may generate 1522 reports presenting sensor measurements over a period of time by retrieving measurements and/or time stamps from a database in a computing device. In embodiments, an intelligent umbrella system or intelligent shading charging system may compare received sensor measurements against threshold and/or predefined threshold measurements and generate 1523 an exception report identifying sensor measurements outside of predefined ranges or thresholds. In embodiments, an intelligent umbrella system may analyze received sensor measurements over a period of time and predict 1524 environmental conditions for an area surrounding an umbrella. For example, in embodiments, an intelligent umbrella system may analyze received barometric pressure, temperature and/or humidity measurements for the last 72 hours, and predict temperature and humidity conditions for a future period of time (e.g., 24 hours or 48 hours). In embodiments, an intelligent umbrella system and/or shading charging system may package, group or combine sensor measurements for a period of time and communicate 1525 packaged sensor measurements to a third party server and/or computing device. One important novelty of the intelligent umbrella system and/or intelligent shading charging system is the availability of microclimate information for an area surrounding the devices. In other words, an intelligent umbrella system and/or intelligent shading charging system may capture climate information of yards, public spaces, and other specific environments, whereas current weather systems tend to focus on larger geographic areas such as cities, regions (e.g., the South Bay) and valleys (e.g., the Santa Clarita Valley or San Fernando Valley). In addition, sensor measurements and microclimate information may be obtained for private areas which have previously never been accessed. A problem with current weather forecasts and/or reports is that they cover a general area. In contrast, the present invention may provide microclimate and environmental information corresponding to an intelligent umbrella's exact location or a plurality of intelligent umbrella and/or shading charging system locations. In embodiments, an intelligent umbrella system may communicate reports and/or measurements in a message, command, instruction and/or signal to a display device for display to a user and/or to a sound reproduction device for playback to a user.
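The periodic gathering, report generation 1522, and exception reporting 1523 described above may be sketched as follows; the acceptable ranges and record structure are assumptions for illustration.

```python
# Illustrative sketch of periodic gathering and exception reporting
# (steps 1520-1523); the sampling ranges are assumed values, not disclosed ones.
import statistics

ACCEPTABLE_RANGES = {"temperature_f": (32.0, 100.0), "humidity_pct": (0.0, 75.0)}

def exception_report(samples):
    """Identify stored measurements falling outside predefined ranges."""
    exceptions = []
    for i, sample in enumerate(samples):
        for key, (low, high) in ACCEPTABLE_RANGES.items():
            value = sample.get(key)
            if value is not None and not (low <= value <= high):
                exceptions.append({"index": i, "field": key, "value": value})
    return exceptions

def summarize(samples, field):
    """Simple summary over a period of time for a report (step 1522)."""
    values = [s[field] for s in samples if field in s]
    return {"min": min(values), "max": max(values), "mean": statistics.mean(values)}

if __name__ == "__main__":
    history = [{"temperature_f": 71.0, "humidity_pct": 50.0},
               {"temperature_f": 104.0, "humidity_pct": 80.0},
               {"temperature_f": 68.0, "humidity_pct": 45.0}]
    print(summarize(history, "temperature_f"))
    print(exception_report(history))
```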

[00170] In embodiments, an intelligent umbrella and/or shading charging system may receive communicated sensor measurements and/or solar panel measurements. In embodiments, an intelligent umbrella and/or shading charging system may store communicated sensor measurements and/or solar panel measurements. In embodiments, an intelligent umbrella and/or shading charging system may integrate received sensor measurements and/or solar panel measurements with other application software executing on one or more processors of the computing device of an intelligent umbrella system. For example, the other application software may utilize received sensor measurements and/or solar panel measurements in execution of its functions. For example, a computing device in an intelligent umbrella and/or shading charging system may fetch computer-readable instructions from a memory and execute the computer-readable instructions to execute a calendar process. In embodiments, a calendar process may utilize the received sensor measurements and/or solar panel measurements and integrate sensor measurements and/or solar panel measurements into a calendar process (e.g., solar panel output and/or temperature / humidity readings may be displayed on days in a calendar software application). In embodiments, a user may establish thresholds and/or guideposts, and if received sensor measurements and/or solar panel measurements exceed established thresholds and/or guideposts, a calendar process may generate an alert which may also be displayed within a calendar process/application. In embodiments, a sun exposure process may fetch computer-readable instructions and execute the computer-readable instructions on one or more processors of the intelligent umbrella and/or shading charging system computing device. In embodiments, a sun exposure process may utilize sensor readings and/or solar panel readings and calculate a sun exposure reading for an individual utilizing the intelligent umbrella and/or shading charging system. In embodiments, a sun exposure process may calculate a sun exposure immediately after receiving solar panel measurements and/or sensor measurements. An advantage of this sun exposure index may be that the sun exposure index is based on actual measurements for the individual at a specific location and not an estimate based on an estimated geographic location. In embodiments, a social media application (or process), e.g., Snapchat, may integrate sensor measurements and/or solar panel measurements into its process. For example, a user may set up alerts and/or thresholds to initiate communication by a social media process to a user. In embodiments, a social media process may receive sensor measurements and/or solar panel measurements and integrate these measurements into messages or videos communicated to a social media process user and/or connections/friends of a social media process user. In embodiments, although the descriptions above describe computer-readable instructions executable on a computing device of an intelligent umbrella and/or shading charging system, in other embodiments, the computer-readable instructions may be resident on a wireless electronic device, a third-party computing device, a portable electronic device, and/or an application server, and may be executable by processors on such devices. Continuing with this example embodiment, sensor readings and/or solar panel readings may be communicated to one or more of a wireless electronic device, a third-party computing device, a portable electronic device, and/or an application server, and be integrated with processes executing on those devices.

[00171] In embodiments, computer-readable and computer-executable instructions may be fetched from a non-volatile memory in an integrated computing device, loaded into a volatile memory, and may be executed by a processor in a computing device to recognize an individual's voice and/or to perform a voice recognition process. This may occur in response to a user selecting a voice recognition button or icon on a dashboard of a shading object application software. In embodiments, a shading object central support assembly may also comprise one or more microphones. In embodiments, a shading object fabric, arms/blades and/or shading object arm support assemblies may comprise one or more microphones installed or attached thereto, or integrated within. In embodiments, for example, a user in a physical vicinity of a shading object may speak into a microphone located on or within the shading object, which may capture the user's voice and generate an analog voice signal. In embodiments, an analog-to-digital converter may convert the analog voice signal to a digital signal and transfer the digital voice signal to a shading object computing device. In embodiments, a shading object computing device system may analyze the received digital voice signal, extract commands or information, and generate instructions based on the received digital voice signal. In embodiments, a computing device voice recognition process may recognize a voice command in a communicated voice signal, and then convert a recognized voice command into a text (or digital representation) command. In embodiments, the text command (or other digital representation of the command) may be communicated to other PCBs, subassemblies, and/or components of an intelligent shading object and/or umbrella. In embodiments, if commands are successfully converted and/or executed, a computing device voice recognition process may generate a confirmation audio signal. The computing device 860 may communicate the confirmation audio signal to an audio system 875 and/or speakers 940. In embodiments, these instructions may be communicated to other PCBs, subsystems, subassemblies, and/or components of an intelligent shading object or umbrella in order to comply with or react to voice instructions. For example, an integrated computing device may extract from a received voice signal a command to obtain sensor measurements (e.g., sunlight intensity, ozone, and/or wind measurements or readings). In embodiments, a computing device voice recognition process may provide support for additional commands as compared to a voice recognition engine 815. In embodiments, a computing device voice recognition process may allow for more customization (e.g., additional commands, dialects and/or languages) and be more directed to interacting with an intelligent shading object and/or umbrella. In embodiments, a computing device voice recognition process may integrate with an artificial intelligence voice engine. In embodiments, an artificial intelligence voice engine may be located in an integrated computing device. In embodiments, a computing device voice recognition process may communicate a voice signal to an artificial intelligence voice engine located in an external computing device. In this illustrative embodiment, a computing device 860 may communicate audio signals (including voice commands) to the external computing device, where the artificial intelligence voice engine may translate the audio signal to text or another digital representation, and then communicate the text or digital representation of the audio signal back to the computing device 860 for use by the computing device voice recognition process. In embodiments, a computing device 860 may also comprise a voice synthesis process for generating audio signals (e.g., including commands, prompts or responses) and communicating these audio signals to an audio system 875 and/or one or more speakers. In embodiments, computer-readable and computer-executable instructions may be fetched from a non-volatile memory in an intelligent shading object, loaded into a volatile memory, and may be executed by a processor in a shading object computing device 860 to generate audio signals (e.g., synthesize speech and/or voice) and/or perform a voice synthesis process. In embodiments, a computing device voice synthesis process generates voice commands, responses or alerts and allows an intelligent shading object or umbrella to speak to an individual.
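A simplified sketch of the voice-command handling described in this paragraph is shown below. The recognizer is a stand-in (an actual embodiment would use a speech-recognition or artificial intelligence voice engine), and the command names and target components are hypothetical.

```python
# Simplified sketch of the voice-recognition flow: a recognized voice command
# is converted into a text command, mapped to an instruction for other
# PCBs/assemblies, and a confirmation is synthesized. Names are illustrative.
KNOWN_COMMANDS = {
    "rotate umbrella": {"target": "motion_control_pcb", "command": "rotate_azimuth"},
    "get sensor measurements": {"target": "weather_variables_pcb", "command": "read_sensors"},
    "play my playlist": {"target": "audio_system", "command": "play_playlist"},
}

def recognize(digital_voice_signal):
    """Stand-in recognizer: assumes the signal has already been transcribed."""
    return digital_voice_signal.strip().lower()

def handle_voice_signal(digital_voice_signal, communicate, speak):
    text_command = recognize(digital_voice_signal)
    instruction = KNOWN_COMMANDS.get(text_command)
    if instruction is None:
        speak("Sorry, I did not understand that command.")
        return None
    communicate(instruction)                       # to PCBs/subassemblies
    speak(f"Okay, executing: {text_command}.")     # confirmation audio
    return instruction

if __name__ == "__main__":
    handle_voice_signal("Rotate umbrella", communicate=print, speak=print)
```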

[00172] In embodiments, an integrated computing device system may communicate (e.g., through wireless transceivers, wires, and/or circuit traces) with a shading object movement control PCB 895. In response, a shading object movement control PCB 895 may communicate with a weather variables PCB 810 to obtain sensor measurements from sensors coupled to and/or connected to a weather variables PCB 810. In embodiments, sensors may obtain measurements and may communicate these measurements to a weather variables PCB 810, a shading object control PCB 895, and/or to a shading object computing device 860. In embodiments, obtained measurements may be stored (for later use and/or analysis) in a memory 1030 or 1035 of a shading object computing device 860, may be communicated via a sound system to a user, or may be displayed via a mobile software application.

[00173] In embodiments, in another illustrative example, a user may provide verbal instructions to rotate a shading object, and a shading object computing device 860 may process the voice signal as described above (e.g., employing voice analysis and/or voice recognition) and transmit instructions and/or commands to a first motor controller to cause a first motor to rotate a shading object a specified number of degrees (e.g., about a vertical axis). In embodiments, a user may provide verbal commands to a shading object remotely. For example, a user may provide verbal instructions to a mobile communication device (e.g., a smartphone), which may communicate the voice commands via a wireless communications protocol and/or Bluetooth to an audio/video receiver (e.g., a Bluetooth-enabled receiver) on the shading object. In this example embodiment, an audio/video receiver may communicate the voice commands to a voice-recognition engine 815, which may convert the remotely-transmitted speech and communicate signals to the controller/processor, which may then operate in a manner described above.

[00174] In embodiments, a shading object may comprise a control panel (not shown). In embodiments, a control panel may comprise an input screen and/or a controller. In embodiments, an input screen may be a touch screen and/or a screen receptive to receiving electronic input via a pen. In embodiments, a control panel may present a graphical user interface with menu items to allow a user to control and/or operate many components of a shading object. In embodiments, a shading object may also comprise an organic light-emitting diode (OLED) display 1425 (see Figure 14). In embodiments, an OLED display may be a control panel. In embodiments, an OLED display 1425 may be a diagnostics monitor. In embodiments, an OLED display may display messages from a motion control PCB, a computing device, external computing devices, and/or a portable electronic device.

[00175] In embodiments, computer-readable and computer-executable instructions may be fetched from a non-volatile memory, loaded into a volatile memory, and may be executed by a processor in an integrated computing device to perform a personal health process. In embodiments, the computing device may be in a shading object, intelligent umbrella, and/or intelligent shading charging system. Fig. 15D illustrates execution of a health process by a computing device in an intelligent umbrella or shading charging system according to embodiments. This may occur in response to a user selecting a health button or icon on a dashboard of a shading object, intelligent umbrella or shading charging system application software system. Operation of a health process may comprise sensors (air quality and UV radiation sensors) providing 1530 information to a computing device 860 (directly or through a shading object movement control PCB). In other words, a computing device in a shading object, intelligent umbrella, and/or intelligent shading charging system may receive 1530 communicated sensor measurements from sensors. In embodiments, weather variable sensors (as described above) may generate measurements regarding environmental conditions in and around an intelligent shading object, umbrella or shading charging system. In embodiments, environmental and/or weather sensor measurements may include UV radiation measurements, air quality measurements, wind measurements, temperature measurements, humidity measurements, and/or barometer measurements. In embodiments, weather variable sensors may communicate sensor measurements to a sensor module. In embodiments, a sensor module may process sensor measurements. In embodiments, a sensor module may communicate raw and/or processed sensor measurements to a motion control processor. In embodiments, a motion control processor may communicate sensor measurements to a computing device. In embodiments, a sensor module may communicate raw and/or processed sensor measurements to a computing device.

[00176] In embodiments, the sensor measurements may be stored 1531 in a memory of an integrated computing device, a memory of a cloud server, a memory of an application server, and/or databases of the above-mentioned devices. In embodiments, a user may communicate parameters and/or thresholds to a personal health process database. In embodiments, a personal health process database may store received and/or communicated parameters and/or thresholds for sun exposure (e.g., time, radiation, sun light readings) and/or air quality in a memory. In embodiments, for example, a UV sensor 812 may provide UV index measurements and total UV exposure within a microclimate. In embodiments, an air quality sensor 811 may provide air quality readings within a microclimate of a shading object. In embodiments, a computing device 860 may receive sensor measurements, analyze the communicated measurements, and compare 1532 these measurements against personal health parameters and/or user-defined tolerances. In embodiments, for example, intelligent umbrella or shading charging system software may compare communicated and/or received sensor measurements from a sensor module to stored sun exposure and/or air quality threshold measurements.

[00177] In embodiments, a computing device may generate 1533 alerts based on the comparison. In embodiments, for example, if thresholds are met and/or exceeded by the received sensor measurements, intelligent umbrella or shading charging system software may generate an alert message, command, instruction and/or signal. In embodiments, intelligent umbrella or shading charging system software may communicate the alert message, command, instruction and/or signal to a display device for display to a user (to alert a user to a potentially dangerous environment and/or situation). In embodiments, for example, an intelligent umbrella or shading charging software system may communicate an alert message, command, instruction and/or signal to a sound reproduction device. In embodiments, a display device may be a monitor and/or display which is part of an intelligent shading umbrella, a display on a portable computing device, a computing device display, and/or a display device in wireless and/or wireline communication with the intelligent shading umbrella. In embodiments, a sound reproduction device may be a speaker integrated into and/or a part of an intelligent shading umbrella, a speaker on a portable electronic device, a speaker on a wireless communication device, a speech synthesis device integrated into the intelligent shading umbrella, and/or a speech synthesis device located external to the intelligent shading umbrella.
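A non-limiting sketch of the personal health comparison described above (receiving 1530, storing 1531, comparing 1532, and alerting 1533) follows; the personal parameter names, values, and accumulated-exposure tracking are assumptions made for illustration.

```python
# Sketch of the personal health process: UV and air-quality readings are
# checked against stored personal parameters, and cumulative sun exposure is
# tracked. Threshold values are assumptions, not disclosed figures.
PERSONAL_HEALTH_PARAMETERS = {
    "max_uv_index": 8.0,
    "max_air_quality_index": 150.0,   # e.g., an AQI level considered unhealthy
    "max_daily_sun_minutes": 60.0,
}

def check_health(sample, accumulated_sun_minutes):
    """Compare received sensor measurements against personal parameters."""
    alerts = []
    if sample.get("uv_index", 0.0) > PERSONAL_HEALTH_PARAMETERS["max_uv_index"]:
        alerts.append("UV index exceeds personal tolerance")
    if sample.get("air_quality_index", 0.0) > PERSONAL_HEALTH_PARAMETERS["max_air_quality_index"]:
        alerts.append("air quality worse than personal tolerance")
    if accumulated_sun_minutes > PERSONAL_HEALTH_PARAMETERS["max_daily_sun_minutes"]:
        alerts.append("daily sun exposure limit exceeded")
    return alerts

if __name__ == "__main__":
    readings = {"uv_index": 9.2, "air_quality_index": 80.0}
    for alert in check_health(readings, accumulated_sun_minutes=75.0):
        print("HEALTH ALERT:", alert)   # to a display and/or sound reproduction device
```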

[00178] In embodiments, a personal health process may be integrated with or communicate with third-party vendors. In embodiments, a memory of a computing device may store computer-readable instructions, which when executed by a processor of a computing device, provide information regarding products and/or services related to health and environmental readings and/or measurements received from sensors of an intelligent umbrella and/or shading charging device. In embodiments, an external computing device may store computer-readable instructions, which when executed by one or more processors, provides information regarding products and/or services related to health and environmental sensor readings. In embodiments, an intelligent umbrella system may also analyze 1534 the communicated sensor measurements and compute/determine whether products may be offered to address environmental conditions of an intelligent umbrella and/or shading charging system. For example, if air quality sensor measurements indicate that air quality is poor in a local environment, an intelligent umbrella and/or shading charging system may determine that products such as allergen or dust masks should be offered for purchase. In addition, for example, if readings and/or measurements from a UV radiation sensor indicate that a user has been exposed to sunlight for more than a specified period of time (e.g., an hour during mid-afternoon), intelligent umbrella and/or shading charging system software (and/or health process) may generate product and/or service recommendations. For example, an intelligent umbrella and/or shading charging system software and/or health process may generate a recommendation for purchasing sunscreen (having a particular SPF), and/or aloe vera (if too much time was spent in the sun). In embodiments, an intelligent umbrella or shading charging system software and/or health process may forward, communicate and/or transfer received sensor measurements to an external third party computing system, which may analyze the sensor measurements and provide recommendations back 1535 to a computing device in the intelligent umbrella and/or shading charging system. For example, a third party computing device may include a sun reference exposure application. For example, in embodiments, if continued readings from a UV radiation sensor over a period of time (days and/or weeks) indicate high sun exposure, an intelligent umbrella and/or shading charging system software and/or health process may generate a recommendation to have users visit a dermatologist or another healthcare professional.
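The recommendation step described above (analysis 1534 and recommendations 1535) may be sketched as follows; the products and cut-off values are illustrative assumptions only and do not represent any actual third-party offering.

```python
# Hedged sketch of mapping sensor readings to product/service suggestions.
def recommend(sample, sun_exposure_minutes):
    recommendations = []
    if sample.get("air_quality_index", 0.0) > 150.0:
        recommendations.append("allergen or dust mask")
    if sun_exposure_minutes > 60.0:
        recommendations.append("sunscreen with high SPF")
        recommendations.append("aloe vera gel")
    if sample.get("uv_index", 0.0) > 10.0:
        recommendations.append("consider consulting a dermatologist")
    return recommendations

if __name__ == "__main__":
    print(recommend({"air_quality_index": 180.0, "uv_index": 11.0},
                    sun_exposure_minutes=90.0))
```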

[00179] In embodiments, an intelligent umbrella and/or shading charging software system may communicate a recommendation in a message, command, instruction and/or signal to a display device for display to a user and/or to a sound reproduction device for playback to a user. In embodiments, in response to a recommendation, a user may select to purchase a recommended product and/or service. In embodiments, a computing device in an intelligent umbrella and/or shading charging system may communicate purchase details. In embodiments, a computing device in an intelligent umbrella and/or shading charging system may communicate purchase details to a third party computing device to enable a transaction. For example, purchase details may be communicated to Amazon and/or a skin care manufacturer's commerce web site.

[00180] Figure 15E illustrates a shading object, intelligent umbrella, and/or intelligent shading charging system implementing an energy process according to embodiments. In embodiments, computer-readable and computer-executable instructions may be fetched from a non-volatile memory, loaded into a volatile memory, and may be executed by a processor in a computing device in an intelligent umbrella and/or shading charging system to perform an energy process. In embodiments, for example, execution of an energy process may occur in response to a user selecting an energy button or icon on a dashboard of an intelligent umbrella and/or shading charging system mobile and/or software application. In embodiments, operation and/or execution of an energy process may provide visibility into an energy flow into and out of a shading object, intelligent umbrella and/or shading charging system and can present information regarding the value of utilizing solar energy.

[00181] In embodiments, an intelligent umbrella and/or shading charging system may receive 1540 communicated sensor measurements and/or solar panel measurements from, for example, sensors (or a sensor module) and/or solar panels (and/or solar panel charging assemblies). In embodiments, for example, an integrated computing device may receive, directly or indirectly, solar power readings for one or more shading object, intelligent umbrella and/or intelligent shading charging system solar cells 825 and/or a power tracking solar charger 830. In embodiments, a computing device in an intelligent umbrella or shading charging system may store 1541 communicated sensor measurements and/or solar panel measurements in a memory of a computing device. In embodiments, an energy process may calculate 1542 money saved by utilizing solar power by valuing the power utilized by the shading object at a current utility rate. For example, the automated energy process may calculate these savings based on power utilized by specific hardware PCBs and/or components, such as lights, speakers, and/or cameras. In embodiments, an energy process can transfer and/or communicate 1543 solar power calculations to home automation providers (e.g., Nest), which could enhance an efficiency of these systems (and an availability of these systems) by sharing solar power energy information as well as sensor information with home automation providers. In embodiments, if a shading object is not utilizing solar power, surplus solar power may be transferred back to an energy grid (either via an AC adapter and/or through home automation provider servers).
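The savings calculation 1542 and the sharing of solar figures with home automation providers 1543 may be sketched as follows; the utility rate, panel readings, and message format are assumed example figures, not disclosed values.

```python
# Minimal sketch of the solar savings calculation: solar energy generated or
# consumed by the umbrella is valued at an assumed current utility rate.
UTILITY_RATE_USD_PER_KWH = 0.25   # assumed local rate

def solar_energy_kwh(average_panel_watts, hours):
    return average_panel_watts * hours / 1000.0

def savings_usd(average_panel_watts, hours, rate=UTILITY_RATE_USD_PER_KWH):
    """Dollar value of solar energy used in place of grid power."""
    return solar_energy_kwh(average_panel_watts, hours) * rate

def share_with_home_automation(kwh, send):
    """Stand-in for step 1543: communicating solar figures to a provider."""
    send({"source": "intelligent_umbrella", "solar_kwh": round(kwh, 3)})

if __name__ == "__main__":
    kwh = solar_energy_kwh(average_panel_watts=120.0, hours=6.0)    # 0.72 kWh
    print(f"estimated savings: ${savings_usd(120.0, 6.0):.2f}")
    share_with_home_automation(kwh, send=print)
```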

[00182] In embodiments, an energy tracking process may retrieve stored sensor measurements and/or solar panel measurements for a specified period of time and analyze the sensor measurements and/or solar panel measurements to determine 1544 if environmental changes have occurred. In embodiments, for example, an energy tracking process may determine if air quality in a user's microclimate has been reduced over a period of time. In embodiments, for example, an energy tracking process may determine if an air quality ozone reading in a user's microclimate has been reduced over a period of time, which may result in more harm to an individual and require more vigilant use of an intelligent umbrella and/or intelligent shading charging system. In embodiments, for example, an energy tracking process may determine if a UV radiation reading is increasing or decreasing over time, thus making an environment more or less safe. In embodiments, an intelligent umbrella system may generate 1545 a message, command, instruction, and/or signal to identify if a microclimate around an intelligent umbrella or shading charging system is becoming more green or energy efficient (e.g., ozone is not being reduced; air quality is improving; UV radiation is decreasing). In embodiments, an energy tracking process of an intelligent umbrella or shading charging software system may communicate 1545 such information in a message, command, instruction and/or signal to a display device for display to a user and/or to a sound reproduction device for playback to a user.

[00183] In embodiments, an energy tracking process may store calculated solar energy levels in a memory of a computing device. In embodiments, an energy tracking process of an intelligent umbrella and/or shading charging software system may communicate calculated solar energy level values in a message, command, instruction and/or signal to a display device for display to a user and/or to a sound reproduction device for playback to a user. In embodiments, an energy tracking process of an intelligent umbrella and/or shading charging system may communicate calculated solar energy readings to an external computing device, portable communications device, wireless communication device and/or an application server. For example, an energy process in a computing device of an intelligent umbrella and/or shading charging system may communicate calculated solar energy readings to a utility computing application server to identify solar power energy generation. In embodiments, an energy tracking process of an intelligent umbrella and/or shading charging system may communicate calculated solar power energy readings to other devices in a smart home and/or smart building in order to identify potential power available for use by other devices (e.g., smart and/or other devices) in a smart home, smart building and/or smart city network, if an intelligent umbrella and/or shading charging system may transfer power (e.g., current and/or voltage) to other devices on a smart home, building and/or city network.

[00184] FIG. 15F illustrates energy generation and energy consumption of an energy process in an intelligent umbrella and/or intelligent shading charging assembly according to embodiments. In embodiments, an energy tracking process of an intelligent umbrella system may receive 1550 communicated solar panel measurements. In embodiments, for example, an energy tracking process may receive and/or retrieve solar panel measurements and calculate energy captured by the solar panels. In embodiments, an energy tracking process of an intelligent umbrella and/or shading charging system may receive 1551 power usage measurements from solar panels, a rechargeable battery, DC-to-DC converters, other assemblies and/or components of an intelligent umbrella and/or intelligent shading charging system, and/or a solar power charging assembly.

embodiments, a computing device in an intelligent umbrella and/or shading charging system may store 1552 received solar panel measurements and power usage measurements in a memory of a computing device. In embodiments, an energy tracking process may compare 1553 generated solar panel levels with energy provided to and/or consumed by assemblies of an intelligent shading object/umbrella. In embodiments, an energy tracking process may calculate 1554 additional energy/power available from the intelligent shading

object/umbrella/shading charging system, for utilization by other devices (e.g., other intelligent shading umbrellas and/or electronic devices). In embodiments, one or more of a rechargeable battery, DC-to-DC converter, and/or solar power charging assembly may provide measurements as to power consumed by the devices, components, and/or assemblies on an intelligent umbrella and/or shading charging system coupled to the devices (e.g., battery, converters, charging assembly). In embodiments, an energy tracking process of an intelligent umbrella software system may communicate 1555 1) a comparison of generated solar energy versus energy consumed by assemblies of an intelligent shading

umbrella/shading charging system and/or 2) excess power available for other devices in a message, command, instruction and/or signal to a display device for display to a user and/or to a sound reproduction device for playback to a user. In embodiments, an energy tracking process of an intelligent umbrella may

communicate 1) a comparison of generated solar energy versus energy consumed by assemblies of an intelligent shading umbrella and/or 2) excess power available for other devices to an external computing device, portable communications device, wireless communication device and/or an application server.
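As an illustrative aid only, the following minimal Python sketch shows one way the comparison of generated versus consumed energy described for FIG. 15F could be computed; the measurement sources, field names, and units are assumptions made for this example.

def compare_generation_and_consumption(solar_wh, consumption_wh):
    """Compare energy captured by solar panels against energy consumed by
    umbrella assemblies and report any surplus available for other devices."""
    generated = sum(solar_wh)                 # steps 1550/1553: total captured energy
    consumed = sum(consumption_wh.values())   # steps 1551/1553: total assembly consumption
    surplus = max(generated - consumed, 0.0)  # step 1554: excess power for other devices
    return {
        "generated_wh": generated,
        "consumed_wh": consumed,
        "surplus_wh": surplus,
        "summary": f"Generated {generated:.1f} Wh, consumed {consumed:.1f} Wh, "
                   f"{surplus:.1f} Wh available for other devices",
    }

# Example readings (hypothetical): hourly solar capture and per-assembly usage.
report = compare_generation_and_consumption(
    solar_wh=[12.5, 18.0, 22.3, 19.8],
    consumption_wh={"motors": 6.0, "computing_device": 9.5, "lighting": 4.0},
)
print(report["summary"])  # step 1555: communicate the comparison for display/playback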

[00185] In embodiments, an energy tracking process may retrieve computer-readable instructions from a memory of an intelligent umbrella computing device and execute the computer-readable instructions on one or more processors of the intelligent umbrella or intelligent shading charging system's computing device.

Figure 15G illustrates an energy tracking process for one or more shading objects, intelligent umbrellas and/or shading charging systems according to embodiments. In embodiments, an intelligent umbrella system may receive communicated sensor measurements and/or solar panel measurements from, for example, sensors (or a sensor module) and/or solar panels (or solar panel charging assemblies) in an intelligent umbrella and/or shading charging system. In embodiments, a computing device in an intelligent umbrella and/or charging shading system may receive 1560 communicated sensor measurements from one or more other intelligent umbrellas and/or shading charging systems. In embodiments, a computing device in an intelligent umbrella and/or charging shading system may store communicated sensor measurements and/or solar panel measurements received from its sensor modules and/or solar panels in a memory of a computing device along with sensor measurements and/or solar panel measurements from other intelligent shading umbrellas/shading charging systems, along with values identifying which intelligent shading umbrella or shading charging system provided each measurement. In embodiments, an energy tracking process may generate 1562 a report and/or message identifying intelligent shading umbrellas and/or shading charging systems and corresponding sensor measurements, solar panel measurements, and/or energy consumption values. In embodiments, a computing device may communicate and/or transfer 1563 stored sensor measurements, solar panel measurements, and/or energy consumption values for corresponding intelligent shading umbrellas and/or shading charging systems to a third party computing device, application server, a portable computing device, and/or wireless computing device. In embodiments, a third party computing device, application server, a portable computing device, and/or wireless computing device may generate a report and/or messages identifying sensor measurements, solar panel measurements, and/or energy consumption values and corresponding intelligent shading umbrellas and shading charging systems. In embodiments, a computing device may communicate 1564 a report in a message, command, instruction and/or signal to a display device for display to a user and/or to a sound reproduction device for playback to a user. In embodiments, a report may be presented to a user in a variety of manners. In embodiments, a three-dimensional rotating globe may present locations of intelligent shading objects and/or shading charging systems, and/or sensor measurements, solar panel measurements and/or energy consumption values. In other embodiments, a map may illustrate solar power measurements for a light source (e.g., the sun), and/or a report identifying certain areas and installed intelligent shading umbrellas and/or shading charging systems along with corresponding sensor measurements, solar panel measurements, and/or energy consumption values.
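As an illustrative aid only, the following minimal Python sketch shows one way measurements from multiple umbrellas could be grouped into a report as described for Figure 15G; the record layout and identifiers are assumptions made for this example.

def build_energy_report(records):
    """Group stored measurements by umbrella/charging-system identifier and
    produce report lines pairing each system with its measurements (step 1562)."""
    by_system = {}
    for rec in records:
        by_system.setdefault(rec["system_id"], []).append(rec)
    lines = []
    for system_id, recs in sorted(by_system.items()):
        solar = sum(r.get("solar_wh", 0.0) for r in recs)
        consumed = sum(r.get("consumed_wh", 0.0) for r in recs)
        lines.append(f"{system_id}: generated {solar:.1f} Wh, consumed {consumed:.1f} Wh")
    return "\n".join(lines)

# Example: measurements received locally and from other systems (step 1560).
stored_records = [
    {"system_id": "umbrella-01", "solar_wh": 55.0, "consumed_wh": 20.0},
    {"system_id": "umbrella-02", "solar_wh": 48.5, "consumed_wh": 31.0},
    {"system_id": "umbrella-01", "solar_wh": 60.2, "consumed_wh": 18.5},
]
print(build_energy_report(stored_records))  # step 1564: communicate the report for display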

[00186] Figure 15H illustrates object tracking in an energy process according to embodiments. In embodiments, an intelligent umbrella and/or shading charging system may receive objects 1565 that a user may wish to track in an environment and/or in the sky (e.g., terrestrial objects, satellites, flying objects and/or devices, and/or orbiting objects and/or devices). For example, a user may input such objects and/or devices to be tracked via a touch screen, a keyboard, voice prompts, or other input methods. In embodiments, an object or device tracking process may determine coordinates 1566 for objects or devices to be tracked by utilizing a database stored in a memory of an intelligent umbrella shading object and/or shading charging systems. In embodiments, objects to be tracked may be planets, stars, earth-orbiting man-made objects such as satellites or other orbiting devices, planes, rockets, black holes, etc. In embodiments, the coordinates may be determined by a third party computing device, application server, a portable computing device, and/or wireless computing device. In embodiments, an intelligent umbrella computing device may communicate 1567 tracked object coordinates to a GPS transceiver in an intelligent shading umbrella and/or shading charging system. In embodiments, a GPS transceiver may locate tracked objects utilizing input coordinates. Input coordinates could be GPS and/or other GNSS coordinates or other sky mapping coordinates. Input coordinates could be a tracking object identifier so that when a GPS/GNSS transceiver communicates 1568 signals, the GPS transceiver transmits a device identifier and other

information to GPS/GNSS satellites along with object coordinates. In

embodiments, a GPS transceiver receives 1569 communications (e.g., signals and/or transmissions) back from a tracked object identifying the location of the tracked object and/or device. In embodiments, an object tracking process compares locations (and/or coordinates) received from tracked objects against original coordinates and/or positions and calculates 1570 a movement of tracked objects. In embodiments, multiple objects may be tracked simultaneously by an object tracking process along with a GPS transceiver. In embodiments, an object tracking process may generate 1571 a report and/or map illustrating positions of tracked objects. In embodiments, an object tracking process of an intelligent umbrella or shading charging system software system may communicate calculated tracked object positions in a message, command, instruction and/or signal to a display device for display to a user and/or to a sound reproduction device for playback to a user. In embodiments, an object tracking process may display different types of maps. In embodiments, if an object does not have coordinates, an object tracking process may utilize a camera to capture an image of a sky or other outdoor environment. In embodiments, a certain time may elapse and a camera may capture a second image of a sky or other outdoor environment. In embodiments, an object tracking process may compare locations of tracked objects in a first image and a second image and may generate a report and/or a map comparing locations of a tracked object over time.
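As an illustrative aid only, the following minimal Python sketch shows one way movement between an original and a received position could be calculated (step 1570); the coordinate format and the simple distance approximation are assumptions made for this example and are not the disclosed tracking method.

import math

def movement_between(original, received):
    """Compare a tracked object's original latitude/longitude with coordinates
    received later and calculate its approximate movement in metres."""
    lat0, lon0 = original
    lat1, lon1 = received
    # Equirectangular approximation, adequate for small displacements.
    mean_lat = math.radians((lat0 + lat1) / 2)
    dx = math.radians(lon1 - lon0) * math.cos(mean_lat) * 6_371_000
    dy = math.radians(lat1 - lat0) * 6_371_000
    return math.hypot(dx, dy)

# Example: one tracked object reported at two points in time.
original_position = (34.0522, -118.2437)
reported_position = (34.0600, -118.2500)
print(f"Object moved approximately {movement_between(original_position, reported_position):.0f} m")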

[00187] In embodiments, computer-readable and computer-executable

instructions may be fetched from a non-volatile memory, loaded into a volatile memory, and may be executed by a processor in a computing device to operate a security system and/or to perform a security process. This may occur in response to a user selecting a security button or icon on a dashboard of a shading object mobile and/or software application. Operation of a security process (or security portion of a software application) may be controlled by a shading object computing device. In embodiments, a security process (e.g. , initiated by selection in a shading object mobile application or another shading object software application) may receive communicated live and/or almost real-time video feeds and/or quality of image readings. In embodiments, the video and/or image feeds may be stored in a memory 1030 or 1035 of a shading object computing device, a memory of a cloud server, a memory of an application server, and/or databases. In

embodiments, an integrated computing device 860 can analyze video and/or images and issue safety alerts based on analysis of video and/or images, motion detector activity, and/or over-threshold air quality readings from an air quality sensor. In embodiments, an integrated computing device 860 may communicate video and/or images to an external device, such as an existing home security application server, such as ADT Security, which could enhance ADT's security capabilities, and/or also provide a platform for cross-promotion of security system software. In embodiments, an integrated computing device 860 may execute a home security process and communicate video and/or images, with or without sound, to emergency responders (e.g., police, fire, security responders, FEMA) to aid in dealing with emergency situations.

[00188] Fig. 15I illustrates a backup process for a shading object, an intelligent umbrella and/or shading charging system according to embodiments. In

embodiments, computer-readable and computer-executable instructions may be fetched from a non-volatile memory, loaded into a volatile memory, and may be executed by a processor in a computing device to perform a storage and/or backup process for an intelligent umbrella and/or shading charging system. In

embodiments, a storage and/or backup button or icon on a dashboard of a shading object and/or shading charging mobile and/or software application may be selected. For example, operation of a storage process may comprise an individual selecting a storage and/or backup icon and/or menu item (or providing a voice command for voice recognition). In embodiments, for example, cameras may generate and/or capture videos and/or images, with or without sound, and communicate 1575 the video, sound and/or images to a computing device in an intelligent umbrella, shading object and/or intelligent shading charging system. Although the discussion below is made with respect to images from cameras, the description applies also to receiving, analyzing and/or storing sensor measurements, solar panel measurements, energy produced measurements and energy consumed measurements along with personalized settings of assemblies, components and/or computing devices of an intelligent umbrella and/or shading charging system. In embodiments, the communicated video, sound and/or images may be stored 1576 in a memory of an integrated computing device or an external memory and/or storage device (databases, memories of application servers and/or cloud servers). In embodiments, an integrated computing device may receive 1577 storage and/or video/image/sound settings to determine a composition of images and/or videos to be stored (e.g., video quality, length of storage, FPS). In embodiments, onboard sensor readings from shading object, intelligent umbrellas, and intelligent shading charging system sensors may also be stored. In embodiments, users may also opt in and allow an integrated computing device to transfer 1578 video, sound, and/or images to local emergency service providers with live feeds during emergency events (e.g., fires, earthquakes, and extreme weather). In embodiments,

SMARTSHADE and/or SHADECRAFT application software executing on an integrated computing device may also provide storage or access to storage for an individual's personal information, preferences, and/or digital products. In

embodiments, an individual may need additional storage for personal information and/or data such as photos, emails, messages, videos, etc. In embodiments, a shading object may have settings customized for one or more individuals. [00189] In embodiments, an individual may select a backup menu item on a shading object mobile software application and settings may be backed up into a memory of an application server. In embodiments, if a shading object is replaced, settings can be restored by selecting a restore option and retrieving settings from an application server memory. In embodiments, an individual can also back up information stored on a portable electronic device to application server memory. For example, this information may be photos, emails, messages, videos, etc. In embodiments, home security video or video captured by shading object cameras may be stored in memory components 1030 and 1035 of an integrated computing device. In embodiments, videos may be erased after a certain period of time and/or may be communicated and stored in an off-site storage (e.g., application server and/or database). In embodiments, weather or environmental information along with shade usage, shade component usage statistics, and/or shade component maintenance information may also be communicated from a wireless transceiver to an access point (e.g., a shading object computing device hotspot or an external access point) to an application server and/or database for storage and/or later retrieval. For example, an integrated computing device 860 may keep track of how many times and/or for how long a motor is operated or efficiency of charging of a battery.
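As an illustrative aid only, the following minimal Python sketch models backing up and restoring umbrella settings to an application server as a simple key/value store; the payload fields and storage mechanism are assumptions made for this example.

import json

SERVER_STORE = {}  # stands in for application server memory

def backup_settings(umbrella_id, settings):
    """Serialize settings and store them under the umbrella's identifier."""
    SERVER_STORE[umbrella_id] = json.dumps(settings)

def restore_settings(umbrella_id):
    """Retrieve previously backed-up settings, e.g., after a replacement."""
    raw = SERVER_STORE.get(umbrella_id)
    return json.loads(raw) if raw else None

# Example: back up, then restore after a (hypothetical) umbrella replacement.
backup_settings("umbrella-01", {"upper_assembly_angle": 20, "playlist": "evening"})
print(restore_settings("umbrella-01"))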

[00190] Figure 16A is a flowchart of a facial recognition process according to an embodiment. In embodiments, one or more digital cameras may capture 1601 an image of an individual within a shading area of the shading object. In

embodiments, one or more digital cameras 857 may communicate 1602 images and/or videos, with or without sound, to an integrated computing device 860 (or alternatively a motion control PCB 865). In embodiments, computer-readable and/or computer-executable instructions may be retrieved from a non-volatile memory of an intelligent umbrella computing device, loaded into a volatile memory and executed by a processor to execute or initiate a facial recognition process or method. In embodiments, a computing device facial recognition process may compare 1603 a communicated image to existing images stored in a memory of an intelligent umbrella computing device (e.g., either a volatile or non-volatile memory) to determine if an individual whose face was captured is an authorized intelligent umbrella user. If a facial recognition process determines 1604 an individual is an authorized user, an intelligent umbrella computing device 860 may generate commands, instructions and/or signals and communicate the generated

commands, instructions, and/or signals to a motion control PCB. In embodiments, in response, a processor on a motion control PCB 895 or a processor on an integrated computing device may retrieve 1605 personalized settings, from a memory, for an individual. In embodiments, a motion control PCB 895 may generate 1606 messages, instructions, commands, and/or signals to cause shading object, intelligent umbrella, and/or shading charging system assemblies and/or components to be moved and/or activated in conformance with retrieved personal settings. In embodiments, for example, an individual may have an existing stored audio playlist and upon facial recognition, a processor may initiate playing of a personal playlist, utilizing an audio system and speakers 875 if an individual's image is recognized. This may occur even if a playlist is on an external device (e.g., a portable music player and/or portable electronic device). As another illustrative example, an individual may have specific settings, stored in a memory, for stem assemblies 106, base assemblies 105, center support assemblies 107 (e.g., position an upper assembly 112 of a center support assembly 107 at an angle 20 degrees from a lower assembly 113 of a center support assembly), and/or arm support assemblies 108 (for shading objects or umbrellas) or housing/enclosures 182, lower support assembly 187, upper support assembly 191, hinging assembly 190, and/or arm support assemblies 193 (Fig. 1C). As an additional example, an individual may have a personalized setting, stored in a memory, that activates a misting system if a temperature reading and/or humidity reading is above a threshold, and an individual is located within a shading area. In embodiments, a motion control PCB 895 may communicate activation instructions, commands, and/or signals to a misting system to address the high humidity situation. In embodiments, an intelligent umbrella computing device may generate messages, commands, instructions and/or signals to assemblies and/or components based on retrieved personalized settings.
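As an illustrative aid only, the following minimal Python sketch shows the shape of the compare-and-retrieve steps in Figure 16A; real facial recognition would rely on trained models, whereas here stored face "signatures" are plain feature vectors and the matching threshold is an assumption made for this example.

import math

AUTHORIZED_USERS = {
    # user name -> (stored face signature, personalized settings)
    "alex": ([0.12, 0.80, 0.45], {"upper_assembly_angle": 20, "playlist": "alex-evening"}),
    "sam":  ([0.70, 0.10, 0.33], {"upper_assembly_angle": 35, "misting_threshold_f": 90}),
}

def recognize(captured_signature, threshold=0.15):
    """Compare a captured face signature to stored signatures (step 1603) and
    return the matching user's personalized settings if authorized (steps 1604-1605)."""
    for name, (stored, settings) in AUTHORIZED_USERS.items():
        if math.dist(captured_signature, stored) < threshold:
            return name, settings
    return None, None

user, settings = recognize([0.13, 0.79, 0.46])
if user:
    # step 1606: generate commands to apply the retrieved personal settings
    print(f"Authorized user '{user}' recognized; applying {settings}")
else:
    print("No authorized user recognized")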

[00191] Figure 16B illustrates an infrared detection process according to embodiments. In embodiments, one or more of the digital cameras 857 may comprise an infrared detector. In embodiments, an infrared detector may comprise one or more infrared light sources and an infrared sensor. In embodiments, infrared sensors may detect 1610 a light wavelength in an Infra-Red (IR) spectrum. In embodiments, one or more LED lights (or other light assemblies) may produce and/or generate light at an IR wavelength and an IR sensor may receive and/or analyze an intensity and/or amplitude of a received light after the light is reflected off of an object and/or surface. In embodiments, when an object is close to an IR sensor, light generated from an IR LED (or IR light assembly) may bounce and/or reflect off an object back into an IR light sensor. In embodiments, the resulting jump in IR intensity may be compared against a threshold, and if a threshold is reached (e.g., the return IR intensity is higher than a threshold), then an infrared detector may generate 1611 a signal indicating that an object is located within an area being monitored or viewed by an infrared detector. In embodiments, if an infrared detector generates a signal indicating that an object (and/or individual) is present, a camera 857 may be activated and begin to capture 1612 images, sound and/or video, and communicate 1613 captured images and/or video, with or without sound, to an intelligent umbrella computing device 860. In embodiments, images, sounds and/or video may be communicated to a motion control PCB in an intelligent umbrella. In embodiments, if an infrared detector generates a signal indicating that an object (and/or individual) is present, a lighting assembly (e.g., LED lights) 870 may also be activated 1614 and lights of a lighting assembly may be directed in an area surrounding an intelligent shading object and/or directly to an area where an object is detected. In embodiments, one or more cameras 857 and/or one or more lighting assemblies 870 may be activated, which results in better images and/or video of an area surrounding a shading object being captured and/or communicated to a shading object computing device. This is yet another example of how an intelligent shading object provides additional benefits of not only capturing images of its surrounding area but also being utilized as a security device for an environment in which an intelligent shading object is located. In

embodiments, one or more cameras may be activated and/or a lighting assembly may not be activated so that the one or more cameras may capture images and/or video without an individual who has been detected knowing that images and/or video is being captured. In embodiments, one or more cameras may be activated for a predetermined period of time (e.g. , 30 seconds, a minute, five minutes).

Continuing with this embodiment, captured images, video and/or sound for a predetermined time duration may be communicated to an intelligent umbrella computing device. In embodiments, communicated images may be stored 1615 in a memory of an intelligent umbrella computing device and/or streamed via a wireless transceiver to external computing devices (e.g., mobile phones, application servers, laptops, tablets, etc.). In embodiments, storing and/or streaming 1616 of communicated images may occur automatically. For example, if a motion detector detects an object's movement, one or more cameras may capture images and/or video for 30 seconds and may automatically communicate captured images and/or video to an intelligent umbrella computing device, which in turn may stream, via a wireless transceiver, captured images and/or video to an external computing device, e.g., an owner's smart phone.
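As an illustrative aid only, the following minimal Python sketch shows the threshold comparison described for Figure 16B; the sensor readings, threshold value, and device hooks are assumptions made for this example.

DETECTION_THRESHOLD = 200  # assumed return-IR intensity above which an object is considered present

def on_ir_reading(ir_intensity):
    """Compare returned IR intensity against a threshold (steps 1610-1611) and,
    when exceeded, activate the camera and lighting assembly (steps 1612-1614)."""
    if ir_intensity > DETECTION_THRESHOLD:
        activate_camera()             # begin capturing images/video
        activate_lighting_assembly()  # illuminate the area where an object was detected
        return True
    return False

def activate_camera():
    print("camera 857 activated: capturing images/video")

def activate_lighting_assembly():
    print("lighting assembly 870 activated")

# Example: reflected IR jumps well above ambient when an object is close.
for reading in (118, 125, 310):
    detected = on_ir_reading(reading)
    print(f"reading={reading}, object detected={detected}")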

[00192] Fig. 16C illustrates a thermal detection process according to

embodiments. In embodiments, an infrared detector and/or motion detector may activate 1620 a thermal imaging camera 857 upon detection of movement. In embodiments, a thermal imaging camera may activate 1621 on its own due to movement of an intruder and/or object, or may periodically or continually capture images and/or video. In embodiments, a signal-processing unit of a thermal imaging camera 857 may communicate 1622 thermal images and/or thermal video to a shading object computing device 860 for analysis and/or storage. In embodiments, an intelligent umbrella shading device may retransmit 1623 thermal images and/or video to external computing devices. In embodiments, thermal images may be communicated via, for example, a wireless transceiver, to a portable computing device, a smart phone, a tablet, an application server and/or another computing device. In embodiments, a thermal image may appear as various colors depending on and/or corresponding to an intensity of an infrared image. In embodiments, a thermal imaging camera provides the additional benefit of not having to activate a lighting assembly in order to capture images and/or videos of an area surrounding an intelligent shading umbrella or object. In addition, by not activating a lighting assembly 870, an intruder or moving object may not be aware that a digital camera 857 may be capturing an image or video of an area where an intruder or object is located.

[00193] In embodiments, computer-readable and/or computer-executable instructions may be fetched from a non-volatile memory, loaded into a volatile memory, and may be executed by a processor in a computing device to perform an intelligent umbrella and/or shading charger security process. In embodiments, a security process may be initiated and/or executed by selection of a security button and/or icon on a dashboard corresponding to a security process. In embodiments, operation of a security process (or security portion of a software application) may be controlled by an intelligent umbrella computing device and/or a user operating an intelligent umbrella computing device. In embodiments, a security process may be initiated, controlled, or executed on a portable computing device, a wireless communications device (e.g., a smartphone), a tablet, a laptop computer, a server, an application server, or combination thereof, utilizing computer-readable

instructions that are loaded into one or more memories and executed by one or more controllers and/or processors. In embodiments, a security process may be initiated and/or executed automatically 1) at a certain predefined time; 2) under certain predefined conditions (e.g., it is evening and an individual will not be home for a period of time); and 3) upon a trigger received from a connected and/or coupled device (e.g., a motion detector, infrared detector, or an external device (e.g., a NEST home security system)). [00194] FIGURE 16D illustrates a security process for an intelligent umbrella and/or intelligent shading charging system according to embodiments. In embodiments, a security process may be implemented in a home environment and/or building environment (where shading objects and/or intelligent umbrellas are installed) or an outside area (where, for example, intelligent shading charging systems are installed). In embodiments, a security process may be executed with respect to a single shading object, intelligent umbrella and/or shading charging system, or multiple shading objects, intelligent umbrellas and/or shading charging systems. In embodiments, parameters, thresholds, and/or settings may be created and/or established 1640 to identify dangerous conditions or occurrences (e.g., alerts may be established for dangerous sensor or weather conditions). In embodiments, parameters, thresholds, and/or settings may be created and/or established 1641 based, at least in part, on motion detection, proximity detection, infrared detection, public emergencies and/or user-defined emergencies. In other words, users can set parameters for triggering alarm and emergency modes based on things like motion detector readings, on-board sensor readings, or seismic activity. In addition, users may set parameters to receive alerts on dangerous on-board sensor readings and could share these alerts with loved ones, emergency service providers, and other providers such as ADT, as well as other home automation manufacturers such as Nest. In embodiments, a computing device may receive 1642 alert messages, out-of-tolerance readings, emergency notifications, motion or proximity detection readings, and/or seismic readings. In embodiments, based on the receipt of the above-mentioned readings, messages and/or notifications, a computing device may transmit signals, commands, and/or messages to activate 1643 a camera. In addition, a computing device may also transmit signals, commands, and/or messages to activate and turn on a sound reproduction device (e.g., speakers), a display device, a lighting assembly, and/or a wireless transceiver. In embodiments, messages, signals, and/or commands may be transmitted to shut down one or more assemblies and/or components in the above-identified devices.
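As an illustrative aid only, the following minimal Python sketch shows how user-established parameters (steps 1640-1641) might be evaluated against received readings (step 1642) to choose actions (step 1643); the parameter names, threshold values, and actions are assumptions made for this example.

SECURITY_PARAMETERS = {          # user-established thresholds and triggers
    "wind_speed_mph": 40,
    "air_quality_index": 200,
    "seismic_magnitude": 4.0,
    "motion_detected": True,
}

def evaluate_readings(readings):
    """Compare received readings against established parameters and return
    the list of actions the computing device should trigger."""
    triggered = []
    if readings.get("motion_detected") and SECURITY_PARAMETERS["motion_detected"]:
        triggered.append("activate camera and lighting assembly")
    if readings.get("wind_speed_mph", 0) > SECURITY_PARAMETERS["wind_speed_mph"]:
        triggered.append("retract arms and enter emergency mode")
    if readings.get("air_quality_index", 0) > SECURITY_PARAMETERS["air_quality_index"]:
        triggered.append("send air quality alert to user")
    if readings.get("seismic_magnitude", 0) > SECURITY_PARAMETERS["seismic_magnitude"]:
        triggered.append("notify emergency contacts")
    return triggered

print(evaluate_readings({"motion_detected": True, "wind_speed_mph": 47}))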

[00195] In embodiments, an intelligent umbrella and/or shading charging system security process (e.g., initiated and/or executed as discussed above) may receive 1644 communicated live and/or almost real-time image, sound, and/or video feeds. In embodiments, an intelligent umbrella security process may also receive a quality value of video and/or images. In embodiments, a video quality value may be, e.g., 360p, 720p, and/or 1080p. In embodiments, an image quality value may be pixels per inch (72 ppi, 240 ppi, 360 ppi and/or 720 ppi). In embodiments, communicated video and/or image feeds may be stored 1645 in a memory 1030 or 1035 of a computing device of an intelligent umbrella and/or shading charging system. In embodiments, communicated video and/or images may be stored in a memory of a cloud server, a memory of an application server, and/or databases, and/or combinations thereof. [00196] In embodiments, an integrated computing device 860 can analyze video and/or images and issue safety alerts 1646 based on analysis of video and/or images, motion detector activity, and/or over-threshold air quality readings from an air quality sensor. In embodiments, an integrated computing device 860 may communicate video, sound and/or images to an external device, such as an existing security application server (e.g., ADT Security), which could enhance ADT's security capabilities, and/or also provide a platform for cross-promotion of security system software. In embodiments, an integrated computing device 860 may execute a security process and communicate video and/or images, with or without sound, to emergency responders (e.g., police, fire, security responders, FEMA) to aid in dealing with emergency situations. In embodiments, a computing device may communicate messages, signals, commands and/or instructions to assemblies and/or components of an intelligent umbrella and/or shading charging system to place the umbrella and/or shading charging system into an alarm and/or emergency mode. In other words, all electrical components may be shut down, the arms/blades and/or arm support assemblies may be retracted, and/or transceivers may be shut down. [00197] Sun and other environmental and/or weather conditions may damage an individual's skin or impact an individual's health. In embodiments, SMARTSHADE application software may provide a user with medical monitoring features and/or functionality. In embodiments, for example, a shading object system may detect a user is within a shading area. In embodiments, a shading object system may activate a camera to capture an image of the individual. In embodiments, a captured image may be compared to images stored in a memory of a shading object system to identify if an individual is known by a shading object system. In embodiments, facial recognition may be performed on the image to assist in identifying an individual. Continuing with this illustrative example, if an individual is not known and/or recognized, characteristics of an individual's image may be stored in a memory of a shading object system for future reference. In

embodiments, characteristics may include hair color, hair length and/or scalp coverage, skin color and/or skin tone, number of freckles, presence of moles and/or lesions. In embodiments, characteristics may comprise medical history such as respiratory illnesses (e.g., asthma), skin cancer, heart conditions, etc. In embodiments where an individual is recognized, a shading object computing device 860 may retrieve a user's characteristics and/or measurements. In embodiments, a shading object computing device may retrieve and/or capture environmental conditions. For example, a shading object computing device may retrieve an air quality measurement, an ozone measurement, a sunlight intensity measurement, a humidity measurement, and/or a temperature measurement. In embodiments, a shading object computing device may analyze the retrieved individual

characteristics and/or the received environmental conditions and provide

recommendations to an individual as potential actions. For example, if an air quality measurement is low or poor and an individual has asthma, a shading object computing device 860 may provide recommendations for an individual to make sure they have their asthma medication and/or limit their time in the environment. As another illustrative example, if an individual's characteristics indicate that an individual and/or an individual's family has a history of skin cancer, a local time is between 10:00 am and 3:00 pm (the period of highest sunlight intensity), and there is no cloud cover, a shading object computing device may generate recommendations such as requesting that a user stay within a shading area and/or apply sunscreen. In addition, a shading object computing device may analyze the individual's image, identify that a user is sunburned, and may recommend that an individual apply aloe or skin conditioner to a sunburn and/or stay within a shading area.
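As an illustrative aid only, the following minimal Python sketch combines stored characteristics with environmental readings to produce recommendations of the kind described above; the characteristic fields, measurement names, and advisory rules are assumptions made for this example.

def health_recommendations(characteristics, environment):
    """Combine an individual's stored characteristics with current environmental
    readings and return recommendation strings."""
    advice = []
    if "asthma" in characteristics.get("medical_history", []) and \
            environment.get("air_quality_index", 0) > 150:
        advice.append("Air quality is poor: keep asthma medication nearby and limit time outside.")
    midday = 10 <= environment.get("local_hour", 0) <= 15
    if "skin cancer" in characteristics.get("medical_history", []) and midday and \
            environment.get("cloud_cover_pct", 100) < 20:
        advice.append("High sun exposure risk: stay within the shading area and apply sunscreen.")
    if characteristics.get("sunburn_detected"):
        advice.append("Sunburn detected: apply aloe or skin conditioner and remain shaded.")
    return advice

individual = {"medical_history": ["asthma", "skin cancer"], "sunburn_detected": False}
conditions = {"air_quality_index": 180, "local_hour": 13, "cloud_cover_pct": 5}
for line in health_recommendations(individual, conditions):
    print(line)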

[00198] In embodiments, a shading object computing device 860 may also capture images of an individual and transfer these images (either still images and/or video images) to a third party provider. In embodiments, a third party provider may be a medical professional (e.g., a dermatologist, a surgeon, or a general practitioner). In embodiments, a medical professional may analyze an image and/or videos and provide an individual with feedback related to an image. For example, a shading object system camera 857 may capture an image of a mole on an individual's chest and/or back. A medical professional may provide a preliminary evaluation of an individual and provide a recommendation to a user for future actions. In embodiments, a shading object system camera 857 may provide a video of an individual's movement after, for example, a surgery. In embodiments, images and/or videos may be provided in real time, e.g., similar to a Snapchat and/or FaceTime session. In embodiments, images may be communicated from a shading object camera 857 through a wireless transceiver 1010 or 865 to an access point and onto a global communications network such as the Internet. In embodiments, images and/or videos may be communicated through a mobile application server (middleware) to an application server (e.g., a SMARTSHADE and/or

SHADECRAFT application server). In embodiments, images and/or videos may be communicated through the Internet to a medical professional's web server, for example.

[00199] In embodiments, user behavior characteristics may be desired by many organizations. In embodiments, a shading object and its multiple components may generate data which may be desirable to third parties. For example, such data may include obtained weather information, air quality readings, UV readings, wind readings, and user selections in a software application and/or other shading object or umbrella features. In embodiments, this raw information may be licensed to third parties as real-time or near real-time user information. In embodiments, access to different types and/or scope of data may be a basis for different subscription models. In embodiments, data on in-app purchases (via e-commerce features) may provide insight into decisions that individual shading object owners make (e.g., drivers behind consumer spending patterns). In embodiments, third parties may be able to purchase ad-space on Shadecraft devices and/or assemblies. In embodiments, utilizing sensor data obtained from a shading object, a third party could deliver targeted ads based on region, climate, user behavior, as well as other metrics. In embodiments, in-app purchasing ability may give advertisers data on conversion rates and revenue, making ad space more valuable because ad success may be tracked and/or refined. In embodiments, revenue sharing models may also increase ad revenue and incentivize commercial customers to utilize shading objects and/or shading object application software.

[00202] In embodiments, SMARTSHADE and/or SHADECRAFT application software (the software being instructions loaded into a memory of a smartphone, a mobile application server and/or an application server) may be referred to as a shading object system. In embodiments, a shading object system may control operations of a shading object utilizing the SMARTSHADE and/or SHADECRAFT application software. For example, a portable electronic device may present a menu of options for controlling one or more shading object components. In embodiments, an individual may select a menu item of a shading object mobile app, which may result in commands and/or instructions being transmitted to different components of a shading object and actions being performed. In embodiments, shading object components, after receiving commands and/or instructions, may generate measurements. In embodiments, a shading object may communicate these measurements to a shading object computing device 860. In embodiments, measurements may be stored in a memory and/or displayed on a monitor 1425 of a shading object computing device. In embodiments, a shading object component may communicate an acknowledgement message and/or a status indicator to a shading computing device system, which may be stored in a memory and/or displayed on a shading object computing device. In embodiments, a shading computing device system may control operation of multiple shading objects. For example, an individual may control operation of any of the motors in a number of shading objects via a shading object mobile application on a portable electronic device (e.g., may cause a stem assembly and a central support assembly to rotate around a base assembly). For example, a shading object control process may request measurements from one or more sensors (e.g., sunlight sensors, air quality sensors, tilt sensors). In addition, a shading object control process may activate and/or operate a camera.

[00203] In embodiments, a shading object computing device and/or mobile app may allow individuals to purchase, replace and/or return shading object

accessories. In embodiments, a shading object computing device and/or mobile app may present a user with various accessories for purchase. For example, an individual may be able to purchase shading object arms/blades, shading fabric, batteries or solar cells for a shading object. In embodiments, a shading object computing device and/or mobile application may also present a menu item allowing individuals to connect to the Internet and purchase items from other e-commerce web sites.

[00204] In embodiments, a shading object computing device and mobile app may allow individuals to diagnose problems with shading object operation. In

embodiments, an individual may initiate diagnostics for a shading object by selection of a menu item in a mobile device application. In embodiments, commands, instructions and/or signals may be communicated to components of a shading object. Measurements and/or signals may be received back from components and if these measurements and/or signals exceed a threshold, a shading object computing device and/or mobile application may generate an error condition and/or message. In embodiments, this error condition and/or message may be communicated to a display 1425. For example, diagnostics may be run on any one of the first, second and/or third motors. In addition, diagnostics may be run on any one of the shading object sensors (e.g., environmental sensors, tilt sensor, motion or proximity sensors). [00205] In embodiments, an intelligent shading object or umbrella may be a device on an Internet of Things (IoT). In embodiments, an Internet of Things (IoT) may be a network of physical objects such as sensors, devices, vehicles, buildings, and other electronic devices. These objects may comprise items embedded

with electronics, software, sensors, and network connectivity, which enables these physical objects to collect and exchange data with each other and/or with servers connected via a global communications network (e.g., the Internet). In

embodiments, the IoT may sense and/or control objects across existing wireless communication network infrastructure and global communications network infrastructure. In embodiments, integration of devices via the IoT may create opportunities for more direct integration of a physical world into computer-based systems, which may result in improved efficiency, accuracy and economic benefit. In addition, when the IoT is augmented with sensors and actuators, the IoT may be integrated or enabled with a more general class of cyber-physical systems, e.g., smart grids, smart homes, intelligent transportation and smart cities. In

embodiments, a device in the IoT, for example, may be uniquely identifiable through its embedded computing system but is able to interoperate within the

existing Internet infrastructure. If a shading object is integrated into the IoT, for example, a shading object may be part of a smart home and/or smart office. For example, a shading object enabled with IoT capability, because it may incorporate cameras, may be able to communicate with or be integrated into a home or office security system. Further, if an individual has a smart home, an individual may be able to control operation of, or communicate with an intelligent shading object or umbrella as part of an existing smart home software application (either via a smart phone, mobile communication device, tablet, and/or computer). In addition, an intelligent shading object, if part of the IoT, may be able to interface with, communicate with and interact with an existing home security system. Likewise, an intelligent shading object may be able to be an additional sound reproducer (e.g., via speaker(s)) for a home audio and/or video system that is also on the IoT. In addition, an intelligent shading object may be able to integrate itself with an electronic calendar (stored on a computing device) and become part of a

notification or alarm system because it will identify when upcoming meetings are occurring. In embodiments, an intelligent shading computing device may utilize artificial intelligence to determine which music to play from a portable electronic device. In embodiments, a memory of an intelligent shading object may have user playlist information, e.g., genres played during certain timeframes, favorites, and songs played at specific times. In embodiments, an integrated computing device 860 may receive a request to play music and may select a playlist of music based on a user's preferences and/or usage factors. After a playlist is selected, a shading object computing device 860 may stream selected music from an individual's portable electronic device through a wireless network transceiver and to a sound reproduction system.

[00206] In embodiments, a shading object computing device 860 may have computer-readable instructions, stored in a non-volatile memory, which when executed by a processor, may execute an artificial intelligence process and may provide artificial intelligence functionality. For example, a shading object computing device 860 may receive measurements from environmental sensors, as described above, analyze the measurements, and make recommendations to users regarding sun exposure, heat exposure, and/or hydration. For example, a shading object computing system 860 may receive and analyze temperature measurements and sun intensity measurements, and based on the analysis, provide a

recommendation to a shading object user regarding how long the user should be out in the environment or when an individual should hydrate if in the environment. In addition, an individual can input health risk factors, and a shading object computing device 860 may also consider health risk factors when making a recommendation. For example, if a temperature is high and humidity is high, and a user has a heart condition, a shading object computing device system may recommend that a user only spend 30 minutes under a shading object and that during this time, the individual should drink eight ounces of water. [00207] In embodiments, an integrated computing device 860 may also recommend shading object positioning throughout a day based on weather forecasting and/or sun tracking. In embodiments, a shading object computing device may have stored previous positions of different portions of a shading object (e.g., rotation angle of a stem assembly, angle of an upper assembly 112 with respect to a lower assembly 113 of a central support assembly), and may provide a recommendation of a starting shade position based on previous positions of different portions of a shading object. In addition, a shading object computing device 860 may also consider current environmental factors when making recommendations of a shading object starting position and/or positions throughout a day. In addition, a computing device 860 may consider environmental factors and/or sensor readings and provide a recommendation of when sunburn may occur if 1) no sunscreen is used; 2) sunscreen with a specific sun protection factor (SPF) is used; and/or 3) sunscreen is used in a partly cloudy environment.
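As an illustrative aid only, the following minimal Python sketch shows one way an exposure and hydration recommendation of the kind described above could be produced; the heat-index style rule, thresholds, and health-risk adjustment are assumptions made for this example, not the disclosed recommendation logic.

def exposure_recommendation(temperature_f, humidity_pct, heart_condition=False):
    """Suggest a maximum outdoor duration and hydration interval based on
    temperature, humidity, and an optional user-supplied health risk factor."""
    if temperature_f > 95 or (temperature_f > 85 and humidity_pct > 70):
        minutes, hydrate_every = 30, 20
    elif temperature_f > 85:
        minutes, hydrate_every = 60, 30
    else:
        minutes, hydrate_every = 120, 60
    if heart_condition:
        minutes = min(minutes, 30)  # more conservative limit for at-risk users
    return (f"Limit time outside to about {minutes} minutes and drink water "
            f"roughly every {hydrate_every} minutes.")

print(exposure_recommendation(temperature_f=92, humidity_pct=75, heart_condition=True))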

[00208] In embodiments, a computing device 860 integrated into a shading object or umbrella may communicate with or interface with an external artificial intelligence system, such as the Amazon Alexa system or the Google Now system. In embodiments, a user may speak into a microphone located on or integrated within a shading object central support assembly 107 (for example) and ask questions or make requests. These voice signals are converted by the shading object computing device 860 and/or a voice recognition engine or module 815, as discussed previously, and communicated to an external artificial intelligence system (Amazon Alexa and/or Google Now) via a wireless transceiver, a PAN transceiver, and/or a wireless hotspot. In embodiments, a shading object computing device 860 may also comprise an artificial intelligence engine, which may be located on a computing device PCB and perform similar functions to an external artificial intelligence engine (such as Amazon Alexa and/or Google Now). In embodiments, an external artificial intelligence engine may respond to requests, transfer requests to other application servers for processing, and/or perform analysis based on a user request. After an action has been performed and responses and/or confirmations obtained, the external artificial intelligence engine may communicate the

responses, answers, and/or confirmations to a shading object computing device. An integrated computing device may provide the responses, answers, and/or confirmations to an individual via a sound reproducing apparatus (e.g., speakers) and/or a visual display apparatus (display, monitor, and/or screen).

[00209] In embodiments, a shading computing device may also detect obstacles in a shading area of the shading object. In embodiments, an obstacle may be in a path or orbit of where a shading object may be moving (e.g., a person may be located in an area where shading arm supports are to be deployed and/or a lamp or other object may be in an area where an upper assembly of the central support assembly is being moved in response to a command). In embodiments, a shading object computing device 860 may receive an image or images from one or more shading object cameras. In embodiments, a shading object computing device 860 may analyze the captured images and determine if a person and/or object (e.g., an obstacle) is in a path of travel of one or more shading object components. If a shading object computing system determines an obstacle is present, a notification may be communicated to an individual. In embodiments, a notification is sent to a sound system, and an alarm and/or voice warning may be sent out over a shading object speaker. In embodiments, a notification may be sent to a control panel and/or a portable electronic device and a communicated notification message may be displayed to a user. In embodiments, a shading object computing device may communicate commands, instructions and/or signals to controllers and/or controller PCBs to cause motors (e.g., a first, second or third motor) to stop movement, or to redirect movement away from a located obstacle. In embodiments, a shading object computing device 860 may continue to communicate notifications and/or commands, instructions and/or signals until an obstacle moves away from an area of concern (or shading area) or is removed from an area of concern (or shading area). In embodiments, a shading object computing device may also receive notifications, commands, instructions and/or signals from proximity sensors and/or motion sensors, and identify if an obstacle is in a movement path of one or more of a shading object's assemblies and/or components. If a shading object computing device 860 identifies an obstacle, then, as discussed above, notifications may be sent to portable electronic devices and/or sound systems, and commands, instructions, and/or signals may be communicated to controllers and/or controller PCBs for motors in order to stop a motor's operation and/or redirect a direction of an assembly's movement path.
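As an illustrative aid only, the following minimal Python sketch shows how proximity readings might be checked against a planned movement path to decide whether to stop a motor and notify a user; the sensor inputs, planned-path model, and command format are assumptions made for this example.

def check_for_obstacles(planned_path, proximity_readings, clearance_m=0.5):
    """Return motor commands: stop and notify when a proximity reading indicates
    an obstacle closer than the required clearance along the planned path."""
    commands = []
    for segment in planned_path:
        distance = proximity_readings.get(segment, float("inf"))
        if distance < clearance_m:
            commands.append(("stop_motor", segment))
            commands.append(("notify_user", f"Obstacle detected near {segment} ({distance:.2f} m)"))
    if not commands:
        commands.append(("proceed", "path clear"))
    return commands

# Example: an upper-assembly tilt is planned while a person stands near the arms.
for command in check_for_obstacles(
        planned_path=["upper_assembly_tilt", "arm_deployment"],
        proximity_readings={"arm_deployment": 0.3}):
    print(command)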

[00210] In embodiments, multiple shading objects may be coupled together. In embodiments, by coupling multiple shading objects together mechanically and/or electrically, an individual may be able to operate and control intelligent shading objects or umbrellas in unison (in other words, same or similar commands, instructions, and/or signals may be sent to multiple shading objects by a single control computing device). In addition, if solar cells are generating excess power, e.g., more than is necessary for a single rechargeable battery, excess power may be transferred to a rechargeable battery in another shading object coupled to an original shading object. In embodiments, if there is excess power generated by solar cells in a number of coupled shading objects and other local shading objects may not utilize the power, a shading object may transfer and/or relay excess power to an electricity grid and an individual may receive discounts and/or credits for any power delivered back to a grid. In embodiments, a portable electronic device, through a shading object mobile application, may control multiple coupled shading objects. In embodiments, a laptop or other computing device may control multiple coupled shading objects. In embodiments, multiple shading objects may communicate with each other via a personal area network. In embodiments, multiple shading objects may communicate with each other via wireless LAN transceivers. [00211] In embodiments, a cable comprising data, control and power lines may be connected and/or attached between shading objects. In embodiments, a cable may be housed in a base assembly 105 and may extend to a power connector on another shading object. In embodiments, a cable may be housed in a stem assembly 106 and/or a center support assembly 107 and may extend to a power connector on another shading object.

[00212] In embodiments, a shading object may comprise an automatic button and a manual button. In embodiments, if a manual button is depressed and/or selected, a shading object may need to be operated in a manual fashion. In embodiments, a shading object may comprise a shutoff button or actuator. In embodiments, if an emergency situation occurs and a shading object needs to be deactivated and/or retracted, then an individual can press the shutoff button or actuator. For example, if high winds occur, a fire is in the area, or all wireless communications are cut off, an individual can immediately deactivate and/or shut down a shading object.

[00213] In embodiments, a base assembly may also comprise a base motor controller PCB, a base motor, a drive assembly and/or wheels. In embodiments, a base assembly may move to track movement of the sun, wind conditions, and/or an individual's commands. In embodiments, a shading object movement control PCB may send commands, instructions, and/or signals to a base assembly identifying desired movements of a base assembly. In embodiments, a shading computing device system (including a SMARTSHADE and/or SHADECRAFT application) or a desktop computer application may transmit commands, instructions, and/or signals to a base assembly identifying desired movements of a base assembly. In embodiments, a base motor controller PCB may receive commands, instructions, and/or signals and may communicate commands and/or signals to a base motor. In embodiments, a base motor may receive commands and/or signals, which may result in rotation of a motor shaft. In embodiments, a motor shaft may be

connected, coupled, or indirectly coupled (through gearing assemblies or other similar assemblies) to one or more drive assemblies. In embodiments, a drive assembly may be one or more axles, where one or more axles may be connected to wheels. In embodiments, for example, a base assembly may receive

commands, instructions and/or signals to rotate in a counterclockwise direction approximately 15 degrees. In embodiments, for example, a motor output shaft would rotate one or more drive assemblies to rotate a base assembly approximately 15 degrees. In embodiments, a base assembly may comprise more than one motor and/or more than one drive assembly. In this illustrative embodiment, each of the motors may be controlled independently from one another and may result in a wider range of movements and more complex movements.
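As an illustrative aid only, the following minimal Python sketch shows how a requested base rotation (such as the 15-degree counterclockwise example above) could be translated into a motor command; the step-per-degree ratio and command format are assumptions made for this example, not parameters from the disclosure.

STEPS_PER_DEGREE = 10  # assumed gearing between motor steps and base rotation

def rotation_command(degrees, direction="counterclockwise"):
    """Translate a requested base rotation into a command a base motor
    controller PCB could act on."""
    steps = round(abs(degrees) * STEPS_PER_DEGREE)
    sign = -1 if direction == "counterclockwise" else 1
    return {"motor": "base", "steps": sign * steps, "requested_degrees": degrees}

# Example: the rotation described in the text above.
print(rotation_command(15, "counterclockwise"))  # {'motor': 'base', 'steps': -150, ...}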

[00214] In embodiments, a shading object may also comprise a wind turbine 866. In embodiments, one or more wind turbines 866 may be installed and/or positioned on a shading fabric 715, shading object arms/blades 109, and/or arm support assemblies 108. In embodiments, one or more wind turbines may be installed and/or positioned on a central support assembly. In embodiments, one or more wind turbines 866 may comprise one or more propeller-like blades, which are turned by energy of the wind. In embodiments, one or more blades may be moved about a rotor, which is connected to a shaft, causing rotation of a shaft. Rotation of a shaft may spin a generator which results in generation of electricity (e.g., voltage and/or current). In embodiments, a wind turbine generator 866 may be connected to a rechargeable battery 820 and may supply power to recharge a battery. In embodiments, a wind turbine generator 866 may be connected to other shading object components and provide power (e.g., voltage and/or current) to other shading object components.

[00215] In embodiments, a shading object stem assembly 106 may be comprised of stainless steel. In embodiments, a shading object stem may be comprised of a plastic and/or a composite material, or a combination of the materials listed above. In embodiments, a shading object stem assembly 106, a base assembly 105, and/or a center support assembly 107 may be comprised of and/or constructed from a biodegradable material. In embodiments, a shading object stem assembly 106 may be tubular with a hollow inside except for shelves, ledges, and/or supporting assemblies. In embodiments, a shading object stem assembly 106 may have a coated inside surface. In embodiments, a shading object stem assembly 106 may have a circular circumference or a square circumference. In embodiments, a shading object stem assembly 106 may be a separate physical structure from a shading object center support assembly 107. In embodiments, a shading object stem assembly and a shading object or umbrella center support assembly may be one physical structure. In embodiments, for example, a shading object stem assembly 106 and a shading object center support assembly 107 may be comprised of one extruded material (e.g., a single tubular structure of, for example, stainless steel).

[00216] In embodiments, a shading object center support assembly 107 may be comprised of stainless steel. In embodiments, a shading object center support assembly 107 may be comprised of a metal, plastic and/or a composite material, or a combination thereof. In embodiments, a shading object center support assembly 107 may be comprised of wood, steel, aluminum or fiberglass. In embodiments, a shading object center support assembly may be a tubular structure, e.g., may have a circular or an oval circumference. In embodiments, a shading object center support assembly 107 may be a rectangular or triangular structure with a hollow interior. In embodiments, a hollow interior of a shading object center support assembly 107 may have a shelf or other structures for holding or attaching assemblies, PCBs, and/or electrical and/or mechanical components. In embodiments, for example, components, PCBs, and/or motors may be attached or connected to an interior wall of a shading object center support assembly.

[00217] In embodiments, a plurality of arms/blades 109 and/or arm support assemblies 108 may be composed of materials such as plastics, plastic composites, fabric, metals, woods, composites, or any combination thereof. In an example embodiment, arms/blades 109 and/or arm support assemblies 108 may be made of a flexible material. In an alternative example embodiment, arms/blades 109 and/or arm support assemblies 108 may be made of a stiffer material.

[00218] In embodiments, a shading object center support assembly 107 may also include a light sensor (not shown). In embodiments, a shading object center support assembly 107 may also include an illumination or projection source that can project light and/or videos onto surfaces of a shading object, arms/blades, and/or shading fabric.

[00219] In embodiments, a center support assembly 107 may comprise an audio transceiver 865 and/or speakers 875. An audio device, such as an iPhone, a digital music player, or the like, may be electronically coupled to the audio transceiver 865, which may transmit audio signals to and/or receive audio signals from the audio device. In an embodiment, an audio transceiver 865 may receive audio signals and transfer audio signals to the speakers 875 so that the speakers may reproduce and play sound for shading object users to hear. In an embodiment, audio signals may be transmitted wirelessly between the audio device and the audio transceiver 865, and/or between the audio transceiver 865 and the speaker 875.

[00220] FIG. 13 is a block diagram of multiple components within a shading object. The shading object system 1300 includes user interfaces such as a keypad 1302, a display 1304 (e.g., an LCD display), and/or a touchscreen 1306. In an embodiment, user interfaces may be part of a control panel which may be used to input instructions to an intelligent shading object. For example, a user could use the touchscreen to enter instructions or commands to cause a shading object to open or deploy shading elements, play music, project light onto surfaces, adjust shading elements to move a shading area, provide misting and/or fog in a shading area, and other similar actions. In an embodiment, a touchscreen may be on a computing device (e.g., which may be a personal computer, a laptop, a network computer, a tablet, and/or a smart phone). In an embodiment, a control panel (including one or more of the keypad 1302, display 1304, and touchscreen 1306) may be mounted to a support frame of the shading object or may be integrated into a remote control device that communicates with controllers or processors in an intelligent shading object 1300 via a wireless or wired communication protocol.

[00221] In an embodiment, as illustrated in Fig. 13, a shading object system 1300 may include a processor 1308, a clock 1301, a memory subsystem 1307, and/or glue logic 1376. Glue logic 1376 may allow different components within an intelligent shading object system to interface with each other. For example, glue logic 1376 may allow a processor 1308, a memory subsystem 1307 and/or a clock 1301 to interface with one another. In an embodiment, a processor 1308 interfaces with many components of an intelligent shading object 1300. As a non-limiting example, a processor 1308 may directly, or indirectly, interface with a touch screen 1306, a display 1304, the keypad 1302, an audio amplifier 1380, a stepper motor interface 1370, and a transceiver 1310 for receiving Global Navigation Satellite System, Bluetooth and WiFi signals, a battery management system 1336, as well as many other components.
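
One way (of many) that control-panel selections described in paragraph [00220] could be mapped to shading-object actions is a simple dispatch table, sketched below. The action names are placeholders chosen for illustration; they are not part of the disclosure.

    def open_shading_elements():
        print("Deploying shading elements")

    def play_music():
        print("Starting audio playback through the speakers")

    def project_light():
        print("Projecting light/video onto the shading fabric")

    def start_misting():
        print("Turning on the cooling and misting system")

    # Map touchscreen / keypad selections to actions the processor can trigger.
    CONTROL_PANEL_ACTIONS = {
        "open": open_shading_elements,
        "music": play_music,
        "project": project_light,
        "mist": start_misting,
    }

    def handle_control_panel_input(selection):
        action = CONTROL_PANEL_ACTIONS.get(selection)
        if action is None:
            print(f"Unknown selection: {selection!r}")
        else:
            action()

    handle_control_panel_input("open")
    handle_control_panel_input("mist")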

[00222] In an embodiment, a memory subsystem 1307 may comprise memory such as FLASH ROM, RAM, and/or SDRAM. In an embodiment, FLASH ROM and/or SDRAM may be utilized to store software and instructions, which, when executed by a processor or processors 1308 and/or controllers, may cause an intelligent shading object system to perform operations and receive and/or transmit information. In an embodiment, FLASH ROM may be updated with new software and/or instructions. In an embodiment, RAM or SDRAM of a memory subsystem 1307 may be utilized as memory that is used by a processor 1308 to execute programs and perform software operations. In an embodiment, a clock 1301 may provide a timing reference for a processor 1308.

[00223] In an embodiment, a shading object system 1300 may also comprise a transceiver 1310 for receiving information from outside systems such as Global Positioning Satellites, Bluetooth-enabled computing devices, and/or WiFi-enabled computing devices. In an embodiment, a transceiver 1310 may comprise an antenna 1311, a Bluetooth transceiver 1313, a GNSS transceiver 1312, and/or a WiFi transceiver 1314. In an embodiment, a GNSS transceiver 1312 may utilize the antenna 1311 to receive GPS signals from GPS satellites and gather

positioning information for an intelligent shading object system 1300. In an embodiment, positioning information may allow an intelligent shading object system to receive weather (e.g., temperature, humidity, wind) information, and/or predict environmental information by receiving predictions from an almanac and/or other weather forecasting system. In an embodiment, positioning information may also allow a subsystem to understand the potential intensity and strength of the sun in the location where an intelligent shading object resides. For example, if positioning information indicates the shading object system 1300 is located in the Mojave Desert in California, then positioning information lets the shading object system know that the intensity of the sun is higher in the Mojave Desert than in a northern portion of Alaska.
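
The following sketch illustrates, under invented latitude bands, how positioning information could be turned into a coarse estimate of expected sun intensity. A real system would combine GNSS position with almanac and weather-forecast data; the thresholds here are assumptions made only for this example.

    def estimated_sun_intensity(latitude_deg):
        """Very coarse mapping from latitude to expected sun intensity."""
        abs_lat = abs(latitude_deg)
        if abs_lat < 40:
            return "high"      # e.g., the Mojave Desert (~35 degrees N)
        if abs_lat < 60:
            return "moderate"
        return "low"           # e.g., northern Alaska

    # Example positions (approximate) from the text.
    for name, lat in [("Mojave Desert", 35.0), ("northern Alaska", 68.0)]:
        print(f"{name}: expected sun intensity is {estimated_sun_intensity(lat)}")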

[00224] In an embodiment, a Bluetooth transceiver 1313 may utilize an antenna 1311 to receive and/or transmit signals and/or information to other Bluetooth-enabled devices. For example, in an embodiment, a user may utilize a mobile phone with Bluetooth capabilities to control operation of an intelligent shading object system and/or to stream audio and/or video to an intelligent shading object system 1300 for playing via speakers 1381 and/or headphones 1382 (after passing through a processor or controller 1308 and an audio amplifier 1380). In addition, in an embodiment, a WiFi transceiver 1314 may utilize an antenna 1311 to receive and/or transmit signals and/or information to other electronic devices having WiFi capabilities. For example, a user may utilize a mobile phone with WiFi capabilities to control operation of a shading object system and/or to stream audio and/or video to an intelligent shading object system 1300 for playing via speakers 1381 and/or headphones 1382. In addition, a WiFi transceiver 1314 and/or Bluetooth

transceiver 1313 may be utilized to communicate with a light or video projector (not shown) (e.g. , transmit video and/or audio signals to the projection device) which may project video and/or light onto a plurality of shading elements of a shading subsystem 1300. Communications with a speaker 1381 and/or headphones 1382 and/or projector may occur after a transceiver 1310 has sent signals through a processor / controller 1308, and/or an amplifier 1380 (for audio signals).

[00225] In an embodiment, an intelligent shading object system 1300 may also include a power subsystem. In an embodiment, a power subsystem may include an AC power adapter 1327, DC power devices 1328 (e.g. , car chargers), solar photovoltaic panels 1330, a rechargeable battery 1335 (such as a Lithium-Polymer Rechargeable Battery (LIPO)), and a battery management system 1336. In an embodiment, an AC power adapter 1327 may receive power from an AC power source 1326, which may also include a fuse. In an embodiment, an AC power adapter may provide power to a system power supply 1337 and/or battery 1335. Similarly, in an embodiment, a DC charger 1328 (which may include a fuse), may provide voltage and/or current to a system power supply 1337 and/or a

rechargeable battery 1335. In an embodiment, an overvoltage protection circuit 1329 may protect a system power supply 1337 and/or a battery 1335 from overvoltage spikes when providing voltage and current to other components in an intelligent shading object system.

[00226] In an embodiment, solar photovoltaic panels 1330 may provide voltage and current to a system power supply 1337 and/or a rechargeable battery 1335. In an illustrative embodiment, solar photovoltaic panels 1330 may be coupled to an overvoltage protection module 1329 to protect a shading object system from overvoltage conditions. In addition, in an embodiment, solar photovoltaic panels 1330 may be coupled or connected to a power storage system before transferring voltage to a system power supply 1337 and/or a rechargeable battery 1335. In an embodiment, a battery management subsystem 1336 may provide DC power to a shading object system 1300. A battery management subsystem 1336 may include a fuel gauge module 1337 to identify how much power is in a shading object system 1300. This information may be provided to a processor 1308 and then displayed on an LCD display 1304 and/or touch screen 1306. In an illustrative embodiment, a battery management system 1336 may also include a battery protection circuit 1339 to protect the battery from overvoltage, overcurrent, undervoltage and/or undercurrent conditions. In an embodiment, a battery management system 1336 may also include a battery charger 1340, which may recharge a battery 1335. In an illustrative embodiment, an AC adapter 1327 may provide voltage and current to a system power supply 1337. In an embodiment, a system power supply 1337 may provide voltage and current to the components of the shading object system 1300. In addition, a system power supply 1337 may provide voltage and current to a battery charger 1340, which in turn may provide power to a rechargeable battery 1335.

[00227] In an embodiment, a shading object system may also include a communications interface. The communications interface may include a USB 2.0 or 3.0 device 1324 and a USB or other serial interface module 1325. In an illustrative embodiment, a USB 2.0 or 3.0 device 1324 and/or the serial interface module 1325 may communicate with a processor 1308 and/or a battery management subsystem 1336.
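
Referring back to the fuel gauge described in paragraph [00226], the hedged sketch below shows how a charge reading might flow from the battery management system to the display. The percentage threshold, the return value, and the function names are assumptions for illustration only.

    def read_fuel_gauge():
        """Stand-in for querying the fuel gauge module over its bus interface."""
        return 0.42  # fraction of charge remaining (illustrative value)

    def format_for_display(charge_fraction):
        percent = round(charge_fraction * 100)
        if percent < 20:
            status = "LOW - connect AC adapter or await solar charging"
        else:
            status = "OK"
        return f"Battery: {percent}% ({status})"

    # The processor reads the gauge and pushes the string to the LCD/touchscreen.
    print(format_for_display(read_fuel_gauge()))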

[00228] In an embodiment, an intelligent shading object system may also include an analog signal conditioning subsystem. In an embodiment, an intelligent shading object system (and the analog signal conditioning subsystem) may include a plurality of sensors 1320, reference signal modules 1321, signal conditioning modules 1323 and an analog-to-digital converter 1322. In an embodiment, sensors 1320 may receive analog signals and transmit the analog signals to signal conditioning modules 1323, which receive and process or condition the signals. In an embodiment, signals may then be transferred by signal conditioning modules 1323 to an A-to-D converter 1322. In an embodiment, a signal reference module 1321 may be a nonvolatile memory that stores signal values to which received signal values may be compared in order to determine if threshold conditions are met. In an

embodiment, this may allow the shading object subsystem to understand if normal conditions are being experienced, or if an intelligent shading object subsystem is in abnormal conditions (e.g., high humidity, high movement, high wind, etc.). In embodiments, any of the sensors described above may communicate

measurements to a signal conditioning module. In embodiments, a signal conditioning module may compare these signals to reference signals supplied by and/or stored in a memory. In embodiments, a result of a comparison and/or the sensor measurements may be converted to digital signals by an A-to-D converter and then transferred to a controller/processor. The controller and/or processor may determine what actions to take based on the sensor inputs. For example, in embodiments, a processor or controller 1308 may analyze the received

measurements and may determine that a dangerous or out-of-tolerance condition is occurring. In an embodiment, a processor 1308 may then transmit a signal to a motor interface module 1370 indicating that a shading object system should be placed in a rest or closed position (e.g., in a position where shading elements are not deployed) and/or moved in a certain fashion.
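
A minimal sketch of the sensor-to-decision path described in paragraph [00228] follows. The reference thresholds, sensor names, and the park_shading_elements helper are illustrative assumptions and not taken from the disclosure.

    # Values a signal reference module might hold for threshold comparison.
    REFERENCE_THRESHOLDS = {
        "wind_speed_mps": 12.0,
        "humidity_pct": 90.0,
        "tilt_deg": 10.0,
    }

    def condition(raw_value, offset=0.0, gain=1.0):
        """Placeholder for analog signal conditioning before A-to-D conversion."""
        return (raw_value + offset) * gain

    def park_shading_elements():
        # Stand-in for commanding the motor interface to move the shading
        # elements to a rest or closed position.
        print("Out-of-tolerance condition: parking shading elements")

    def evaluate_sensors(readings):
        for sensor, raw in readings.items():
            value = condition(raw)
            limit = REFERENCE_THRESHOLDS.get(sensor)
            if limit is not None and value > limit:
                park_shading_elements()
                return
        print("All sensor readings within normal conditions")

    evaluate_sensors({"wind_speed_mps": 14.3, "humidity_pct": 55.0, "tilt_deg": 2.0})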

[00229] In an embodiment, a plurality of sensors may also include a thermistor (for measuring wind speed in the shading object environment), a 3D gyro or tilt sensor (for measuring wind resistance in the shading object environment), a 3D accelerometer sensor (for measuring wind resistance and base stability), a humidity sensor (for measuring humidity in the shading object environment), a temperature sensor (for measuring temperature in the shading object environment), and/or a light sensor (for measuring sun intensity and/or sun intensity direction).

[00230] In an embodiment, an intelligent shading object system 1300 may monitor wind and other potentially dangerous weather conditions on a periodic, continuous, or as-directed basis. In an embodiment, a thermistor, 3D gyro or tilt sensor, and/or 3D accelerometer sensor may receive readings and generate signals indicative of the environment, including wind conditions, where an intelligent shading object resides. In an embodiment, wind condition and other stability measurements received via sensors may be conditioned by a signal conditioning module 1323, compared to reference signals supplied by a signal reference module 1321, converted to digital signals by an A-to-D converter 1322 and transferred to a controller or processor 1308. In an embodiment, a processor or controller 1308 may analyze the received wind condition or other measurements and may determine that a dangerous or out-of-tolerance condition is occurring. In an embodiment, a processor 1308 may then transmit a signal to a stepper motor interface module 1370 indicating that a shading object system should be placed in a rest or closed position (e.g., in a position where shading elements are not deployed). In an embodiment, a stepper motor interface module 1370 may transmit a signal, command, and/or instructions to a shade parking mechanism module 1351. In an embodiment, a shade parking mechanism 1351 may cause the shading elements and/or other components of an intelligent shading system to be placed in a closed or rest position where the stability of the shading system is not an issue.

[00231] In an embodiment, a processor 1308 may instead determine that one or more shading elements may be moved in a specific direction and/or placed in a specific orientation so as to avoid a dangerous or out-of-tolerance condition. In an illustrative embodiment, a processor 1308 may transmit a signal, command, and/or provide instructions to a stepper motor interface 1370, which may communicate with a power amplifier 1371, which may transmit a signal to an azimuth stepper motor 1372. In an illustrative embodiment, an azimuth stepper motor 1372 may move a shading object system in a horizontal manner (in this case to move shading elements (or other components of a shading object system) away from a dangerous or out-of-tolerance weather condition).

[00232] In an embodiment, a humidity sensor and/or a temperature sensor of a plurality of sensors 1320 may generate signals indicative of humidity and/or temperature readings in an environment in which a shading system is installed and/or located. In an embodiment, a signal having values indicative of the temperature and humidity may be conditioned by a signal conditioning module 1323, compared to reference signals from a reference module 1321, converted to a digital signal by the A-to-D converter 1322 and transferred to a processor and/or controller 1308. In an embodiment, a processor or controller 1308 may analyze received temperature and/or humidity signals and/or readings, and determine whether to turn on a cooling and fog system 1351. In an embodiment, a controller or processor 1308 may transmit a signal to a cooling logic module/air conditioning module 1350 regarding received temperature and/or humidity signals. In an embodiment, a cooling module 1350 transmits signals, commands, and/or instructions which may cause a cooling and misting system 1351 to turn on and provide fog and/or mist in an intelligent shading object. In an embodiment, a cooling and misting system 1351 may also include a fan. In an embodiment, a controller 1308 and/or cooling logic module 1350 may determine the intensity and duration of the misting or fogging in the environment.
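
Purely as an illustration of the temperature/humidity decision described in paragraph [00232], the sketch below selects a misting intensity and duration. The cutoff values are invented for the example; a real cooling logic module would tune these.

    def misting_plan(temperature_c, humidity_pct):
        """Decide whether to mist and, if so, how intensely and for how long."""
        if temperature_c < 27 or humidity_pct > 80:
            return None  # cool enough, or already humid: no misting needed
        intensity = "high" if temperature_c > 35 else "low"
        duration_min = 10 if intensity == "high" else 5
        return {"intensity": intensity, "duration_min": duration_min}

    plan = misting_plan(temperature_c=38.0, humidity_pct=25.0)
    if plan:
        print(f"Turning on cooling/misting system: {plan}")
    else:
        print("Misting not required")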

[00233] In an embodiment, an intelligent shading object (and/or sections of an intelligent shading object) may be controlled by a number of electronic devices. For example, in this context, sections may comprise a shading element (or shading elements), a first motor and controller, a second motor and controller, a support unit, a storage unit, a misting system, a Bluetooth unit, a power and/or battery

management system, a projection unit, and/or a base unit. In an embodiment, a controlling electronic device may be a remote control, a laptop computer, a cellular phone, a smart phone, a tablet, a digital music player and/or other computing devices. In an embodiment, one electronic device (or computing device) may control most functions, and in other embodiments, one or more electronic devices (or computing devices) may control different sections of a shading object. This may be beneficial when an electronic device becomes non-operational (e.g., loses power or is out of range) or when only a specific electronic device may be operated within a specific environment.

[00234] In an embodiment, a shading object may comprise a wireless digital music player. In an illustrative embodiment, a wireless digital music player may comprise a Bluetooth MP3 player. In an embodiment, a wireless digital music player may be separate and/or independent of a controller or processor. In an embodiment, a controller or processor and a wireless digital music player may be integrated on a chip. In an embodiment, a wireless digital music player may pair with a digital audio music source. In an embodiment, a digital music player may establish a communication session with a digital audio music source. In an embodiment, a digital audio music source may be a smartphone and/or a computing device. After a wireless digital music player has established a communication session with a digital audio music source, a digital audio music source may stream digital audio to a wireless digital music player. A wireless digital music player may receive digital audio and transmit received digital audio to a speaker for playback. In an

embodiment, a wireless digital music player may have an integrated speaker, or alternatively, a speaker assembly may be located on a support stand, a base assembly, or a shading element of a shading object. In an embodiment, a wireless digital music player may comprise a user interface for controlling operation, such as up/down volume, pause, and turning the device on/off. In an embodiment, a shading object may comprise a control panel for communicating with and/or controlling operation of a wireless digital music player, such as up/down volume, pause, and turning the device on/off. In an embodiment, a digital audio music source (e.g., smartphone) may comprise a user interface for communicating with and/or controlling operation of a wireless digital music player. In embodiments, intelligent shading charging systems may have one or more digital cameras installed thereon or integrated within. In embodiments, digital cameras may provide images, video, and/or audio from locations where intelligent shading charging systems are located and/or installed. By having multipurpose intelligent shading charging systems located in various places around outdoor locations, owners or patrons of outdoor locations may monitor and/or view locations as well as events occurring at locations. In

embodiments, this is an improvement over current shading systems and/or charging systems because the intelligent shading charging systems may be utilized for security purposes at event locations and/or outdoor locations.

[00235] In embodiments, computer-readable and computer-executable

instructions may be fetched from a non-volatile memory, loaded into a volatile memory, and may be executed by a processor in a computing device to operate a security system and/or to perform an outdoor location security process (if an intelligent shading charging system is utilized in an outdoor location such as the ones described above). This may occur in response to a user selecting a home security button or icon on a dashboard of a shading object, intelligent umbrella, or intelligent shading charging system mobile and/or software application. Operation of a security process for public outdoor locations (e.g., building plazas, parking lots, outdoor sports fields and/or event fields), or of a security portion/process of a software application, may be controlled by a computing device. In embodiments, a security process (e.g., initiated by selection in a mobile application or another software application) may receive communicated live and/or almost real-time video, image and/or sound feeds and may also set the quality of image readings. In embodiments, video, sound and/or image feeds, data, or information may be stored in a memory 1030 or 1035 of an intelligent shading charging system computing device, a memory of a cloud server, a memory of an application server, and/or databases. In embodiments, an integrated computing device 860 can analyze video, sound, and/or images and issue safety alerts based on analysis of video and/or images, motion detector activity, and/or over-threshold air quality readings from an air quality sensor. In embodiments, an integrated computing device 860 may communicate captured video, sound and/or images to an external device, such as an existing security application server of public outdoor locations, such as ADT Security or other building or outdoor location security operations, which could enhance ADT's security capabilities, and/or also provide a platform for cross-promotion of security system software. In embodiments, an integrated computing device 860 may execute an outdoor environment security process and

communicate video and/or images, with or without sound, to emergency

responders (e.g., police, fire, security responders, FEMA) to aid in dealing with emergency situations. In embodiments, in emergency situations, digital cameras may communicate video and/or images, with or without sound, to emergency responders (e.g., police, fire, security responders, FEMA) to aid in dealing with emergency situations.

[00236] For example, as illustrated in Fig. 13A, an outdoor parking lot in a plaza between a number of buildings 1305 may have one or more intelligent shading charging systems installed. In Figure 13A, eight intelligent shading charging systems 1320 may be located around a parking area 1310. In embodiments, electric vehicles may recharge their electric vehicle batteries at any of the intelligent shading charging systems 1320. In addition, intelligent shading charging systems may comprise cameras to provide images, video, and/or sound of the parking area 1310. In embodiments, a placement of intelligent shading charging systems 1320 may allow visual coverage of most, or a substantial portion, of a parking area. In embodiments, as described previously, images, videos, and/or sound may be transmitted to a computing device within an intelligent shading charging system 1320 and/or external computing devices (e.g., laptops, tablets, smartphones, servers, application servers, cloud servers, etc.). These images may be utilized by security providers as visual recognition of what is happening in an environment. In embodiments, intelligent shading charging systems may record and/or capture images, video and/or sound in real-time or substantially close to real-time. In embodiments, cameras in intelligent shading charging systems may be activated by motion detection, proximity detection, or by remote activation, as discussed above, by an integrated computing device and/or external computing device.

[00237] For example, as illustrated in Figure 13B, one or more intelligent shading charging systems may be installed in an outdoor event area, such as an outdoor sports complex (e.g., for soccer, lacrosse, baseball, football, etc.) or an outdoor event location (e.g., movie theatre, concert, fair, online game playing festival). In embodiments, a computing device and mobile app may allow individuals to diagnose problems with shading object, intelligent umbrella, and intelligent shading charging system operation. In Figure 13B, eight intelligent shading charging systems 1320 may be located in, around or adjacent to a parking area 1340. In embodiments, electric vehicles may recharge their electric vehicle batteries at any of the intelligent shading charging systems 1320. In addition, in embodiments, intelligent shading charging systems may comprise cameras to provide images, video, and/or sound of the parking area 1340 and/or fields 1345 where events and/or sporting events are happening and/or taking place. In embodiments, an intelligent shading charging system 1320 may have one or more cameras facing a parking area and capturing images, video, and/or sound for security purposes and one or more cameras facing and/or directed to athletic fields and/or event fields 1345 to capture images, videos, and/or sound of events occurring on the athletic and/or event fields 1345 (which may be utilized to keep track of sporting events or other events). In embodiments, such as an embodiment illustrated in Fig.
13B, a parking area 1340 may be located on both ends of athletic and/or event fields to capture more images, videos, and/or sounds of events occurring on such fields. In embodiments, high quality and/or high speed camera equipment may be utilized to capture higher quality images, video and/or sound of events occurring on playing fields. In embodiments, one or more cameras may be directed to a scoreboard or other score keeping device 1350 to allow monitoring of scores on athletic fields and/or information regarding events on the event or athletic fields (e.g. , times of events, sponsorships, and/or emergency situations). In embodiments, a placement of intelligent shading charging systems 1320 may allow visual coverage of most, or a substantial portion of a parking area. In

embodiments, as described previously, images, videos, and/or sound may be transmitted to a computing device within an intelligent shading charging system 1320 and/or external computing devices (e.g., laptops, tablets, smartphones, servers, application servers, cloud servers, etc.). These images, videos, and/or sounds may be utilized by security providers as visual recognition of what is happening in an environment, by parents or other family members wishing to monitor events occurring on athletic and/or event fields, or by third parties for security purposes. In embodiments, such as illustrated in Fig. 13B, coverage (e.g., image and video coverage) of a large number of playing fields and/or event fields may be obtained by placement of a plurality of intelligent shading charging systems 1320 at locations around an area that provide viewing angles of athletic and event fields 1345 and proximity to parking areas 1340 (to allow electric vehicles to have their batteries recharged).
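
One way the camera activation and feed-forwarding behavior described in paragraphs [00235] through [00237] could be organized is sketched below. The event types, recipients, file naming, and the notify helper are hypothetical placeholders, not details from the disclosure.

    import datetime

    def notify(recipient, message):
        # Stand-in for sending video/images (with or without sound) to an
        # external security application server or to emergency responders.
        print(f"[{datetime.datetime.now():%H:%M:%S}] to {recipient}: {message}")

    def on_camera_event(event_type, camera_id):
        """React to motion detection, proximity detection, or remote activation."""
        stamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
        clip = f"clip_{camera_id}_{stamp}.mp4"  # placeholder file name
        print(f"Camera {camera_id} recording {clip} (trigger: {event_type})")
        if event_type in ("motion", "proximity"):
            notify("security provider", f"Activity detected near camera {camera_id}")
        elif event_type == "emergency":
            notify("emergency responders", f"Live feed available from camera {camera_id}")

    on_camera_event("motion", camera_id=3)
    on_camera_event("emergency", camera_id=7)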

[00238] FIGURES 14A and 14B are, respectively, a block diagram and a flow diagram of a shading object according to embodiments. In embodiments, a shading object 1400 comprises a microcontroller 896, a GPS solar tracking module 805, a micro climate data module 810, and a voice recognition module 815. In embodiments, a shading object includes a Bluetooth transceiver 865, a class D amplifier and stereo speakers 875, an AC adapter 835, arrays of solar panels 825, a LiIon/LiPo rechargeable battery 820, a solar MPPT LiIon/LiPo Charger or Charging Assembly 830, and DC-to-DC converters 1295. In embodiments, a shading object comprises an obstacle detection module 850 and a wind sensor thermistor 817. In embodiments, a microcontroller 896 may be coupled to an azimuth driver or motor controller 880, an elevation driver or motor controller 885, and an extender driver or motor controller 890, each of which is coupled to a respective DC brushed motor 212, 121 and 610. In embodiments, one or more of the DC brushed motors 212, 121 and 610 are coupled and/or connected to an encoder feedback quadrature and absolute module 1421. In embodiments, an encoder feedback quadrature and absolute module 1421 provides positioning and/or location information about how far a DC brushed motor 212 and/or gearbox assemblies or linear actuators have moved in response to commands, instructions, and/or signals from, for example, the azimuth driver 880. This location and/or position information may be fed back to a microcontroller or processor 896, and the microcontroller/processor 896 may adjust the commands, instructions and/or signals directed to, for example, the azimuth driver 880.

[00239] In embodiments, a shading object and/or umbrella may comprise a high efficiency LED driver 1115 and LED lights, a system volt and current sense module and/or circuit 1435, an emergency shutdown switch 1430, a display (e.g., an OLED display) 1425, a mist generator system 1420, and/or a USB power source. In embodiments, a user may depress an emergency shutdown switch 1430 to kill or stop operations of a shading object. In embodiments, an emergency shutdown switch and/or an on/off switch may be pressed or depressed to resume and/or restart operation. This allows an operator and/or individual to stop movement and/or operation of a shading object in emergency situations, such as when electrical and/or mechanical components and/or computing systems are not operating.
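
The encoder-feedback adjustment described in paragraph [00238] resembles a simple closed-loop position controller. The proportional-correction sketch below is an illustrative stand-in under assumed behavior (a simulated motor that under-travels slightly); it is not the circuit actually disclosed.

    def read_encoder():
        """Placeholder for the encoder feedback quadrature/absolute module."""
        return read_encoder.position

    read_encoder.position = 0.0  # degrees moved so far (simulated state)

    def drive_azimuth_motor(command_deg):
        # Stand-in for the azimuth driver commanding the DC brushed motor;
        # the simulated motor under-travels by 10% so feedback matters.
        read_encoder.position += command_deg * 0.9

    def move_to(target_deg, tolerance_deg=0.5, max_iterations=20):
        """Proportional correction loop using encoder feedback."""
        for _ in range(max_iterations):
            error = target_deg - read_encoder()
            if abs(error) <= tolerance_deg:
                break
            drive_azimuth_motor(error)  # microcontroller adjusts its command
        print(f"Final position: {read_encoder():.2f} deg (target {target_deg} deg)")

    move_to(45.0)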

[00240] In embodiments, a shading object and/or umbrella may comprise a system volt & current sense circuit 1435 to determine if a shading object is operating outside recommended settings, which may result in dangerous

operations. If an out-of-threshold condition is detected by a volt and current sense circuit 1435, a shading object controller may send a shutdown or minimize-operation command, instruction and/or signal. This feature may be beneficial if a power source is experiencing spikes and/or surges and may protect components and/or assemblies of a shading object. In addition, a volt and current sense circuit 1435 may sense if components and/or assemblies are drawing too much power (and thus causing dangerous conditions) and may cause commands to be sent from the motion control PCB 895 to stop and/or minimize operations. In addition, a voltage and current sense circuit 1435 may communicate, e.g., via the motion control PCB 895 or directly, alert commands, signals, instructions and/or messages to a sound reproduction system (amplifier and/or speaker 875) and/or a display device (e.g., OLED display 1425).
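
A hedged sketch of the out-of-threshold check performed by the volt and current sense circuit follows; the numeric limits and the returned action names are assumptions introduced for illustration.

    VOLTAGE_LIMITS = (10.5, 14.5)   # volts; illustrative operating window
    CURRENT_LIMIT = 8.0             # amps; illustrative maximum draw

    def check_power_rails(voltage, current):
        """Return an action the motion control PCB might take, if any."""
        low, high = VOLTAGE_LIMITS
        if voltage < low or voltage > high:
            return "shutdown"          # spikes/surges: protect assemblies
        if current > CURRENT_LIMIT:
            return "minimize"          # excessive draw: reduce operations
        return None

    action = check_power_rails(voltage=15.2, current=3.1)
    if action:
        print(f"Out-of-threshold condition detected; issuing '{action}' command")
        print("Alerting user via speaker and OLED display")
    else:
        print("Power rails within recommended settings")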

[00241] In embodiments, an AC adapter 835 and one or more arrays of solar panels 825 may connect and/or plug in to a charging assembly 830. In

embodiments, a charging assembly 830 may comprise an MPPT LiIon/LiPo

Charging Assembly or Charger. In embodiments, a charging assembly 830 may provide power to and/or charge a rechargeable battery. In embodiments, a rechargeable battery 820 may be a LiIon/LiPo rechargeable battery 820. In embodiments, an AC adapter 835 and one or more arrays of solar panels 825 may charge a rechargeable battery 820 (either directly or indirectly). In some

circumstances, a power draw (e.g., a voltage and/or current draw) may be too great for only one of the AC adapter 835 or one or more arrays of solar panels 825 to provide power. For example, if one or more assemblies of the intelligent shading object is moving, a large amount of current is needed to power the motor and/or assemblies, and neither the AC adapter nor the array of solar panels alone may provide this power. In embodiments, a charging assembly 830 may provide power to one or more DC-to-DC converters 1295. In embodiments, a rechargeable battery may provide power to one or more DC-to-DC converters 1295. In embodiments, DC-to-DC converters 1295 may provide power (e.g., voltage and/or current) to other assemblies and/or components in the intelligent shading object or umbrella. For example, a DC-to-DC converter 1295 may provide power to a motion control PCB 895, any of the motor assemblies, a computing device 860, and/or a sensor module 805 housing telemetry sensors and/or weather variable sensors. In embodiments, some other components may be self-powered, e.g., include and/or integrate batteries. In embodiments, an intelligent shading object may also include power storage components, e.g., capacitors. In embodiments with power storage components, an AC adapter and/or one or more solar arrays may provide power to power storage components and the power storage components may provide power to a rechargeable battery 820. In embodiments, an AC adapter 835 and/or arrays of solar panels 825 may provide power to a rechargeable battery 820, and a rechargeable battery 820 may provide power to power storage components. Continuing with this illustrative embodiment, power storage components may be coupled and/or connected to DC-to-DC converters 1295 to provide power to intelligent shading object assemblies and components. This provides the benefit of an intelligent shading object being able to compensate for high current flow during operations without having to deal with charge/discharge cycles of a rechargeable battery. In embodiments, a charging assembly 830 may monitor power input (e.g., amount of current flow) from a power source (e.g., an AC adapter and/or one or more arrays of solar cells). In embodiments, a charging assembly may communicate a value and/or measurement (in response to a request or command asking for the current level) indicating an amount of charge remaining in a rechargeable battery 820 (e.g., a current level). In embodiments, a charging assembly 830 may also monitor solar panel array output and/or efficiency as well as AC power quality.

[00242] Some discussions may be focused on single shading objects, intelligent umbrellas, and/or intelligent shading charging systems. However, descriptions included herein may be applicable to multiple shading objects, intelligent umbrellas and/or intelligent shading charging systems. In addition, while discussions may be directed to a software application or process executing on a computing device of a shading object, intelligent umbrella and/or intelligent shading charging system and controlling one shading object, intelligent umbrella and/or intelligent shading charging system, the descriptions also apply to controlling and/or communicating with multiple shading objects, intelligent umbrellas and/or intelligent charging systems.
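
Referring back to the power path described in paragraph [00241], the sketch below illustrates one way available sources might be combined when a motor assembly briefly draws more power than any single source can supply. The wattage figures and the priority ordering (solar array, then AC adapter, then battery/storage) are illustrative assumptions.

    def allocate_power(demand_w, sources):
        """Draw from each available source in priority order until demand is met.

        `sources` is an ordered list of (name, available_watts) pairs.
        """
        plan = []
        remaining = demand_w
        for name, available in sources:
            if remaining <= 0:
                break
            draw = min(available, remaining)
            plan.append((name, draw))
            remaining -= draw
        if remaining > 0:
            plan.append(("unmet", remaining))  # would require deferring the operation
        return plan

    # Example: a motor assembly briefly needs 180 W while it moves the umbrella.
    sources = [("solar array", 60), ("AC adapter", 90), ("battery/storage", 100)]
    for name, watts in allocate_power(180, sources):
        print(f"{name}: {watts} W")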

[00243] A computing device may be a server, a computer, a laptop computer, a mobile computing device, and/or a tablet. A computing device may, for example, include a desktop computer or a portable device, such as a cellular telephone, a smart phone, a display pager, a radio frequency (RF) device, an infrared (IR) device, a Personal Digital Assistant (PDA), a handheld computer, a tablet computer, a laptop computer, a set top box, a wearable computer, an integrated device combining various features, such as features of the foregoing devices, or the like.

[00244] Internal architecture of a computing device includes one or more

processors (also referred to herein as CPUs), which interface with at least one computer bus. Also interfacing with computer bus are persistent storage

medium/media, network interface, memory, e.g., random access memory (RAM), run-time transient memory, read only memory (ROM), etc., media disk drive interface, an interface for a drive that can read and/or write to media including removable media such as floppy, CD-ROM, DVD, etc., media, display interface as interface for a monitor or other display device, keyboard interface as interface for a keyboard, mouse, trackball and/or pointing device, and other interfaces not shown individually, such as parallel and serial port interfaces, a universal serial bus (USB) interface, and the like.

[00245] Memory, in a computing device and/or an intelligent shading object system, interfaces with computer bus so as to provide information stored in memory to processor during execution of software programs such as an operating system, application programs, device drivers, and software modules that comprise program code or logic, and/or computer-executable process steps, incorporating functionality described herein, e.g., one or more of process flows described herein. CPU first loads computer-executable process steps or logic from storage, e.g., memory 1004, storage medium/media, removable media drive, and/or other storage device. CPU can then execute the stored process steps in order to execute the loaded

computer-executable process steps. Stored data, e.g., data stored by a storage device, can be accessed by CPU during the execution of computer-executable process steps.

[00246] Persistent storage medium/media is a computer readable storage

medium(s) that can be used to store software and data, e.g., an operating system and one or more application programs, in a computing device or storage subsystem of an intelligent shading object. Persistent storage medium/media may also be used to store device drivers, such as one or more of a digital camera driver, monitor driver, printer driver, scanner driver, or other device drivers, web pages, content files, metadata, playlists and other files. Persistent storage medium/media 1006 can further include program modules/program logic in accordance with embodiments described herein and data files used to implement one or more embodiments of the present disclosure.

[00247] A computing device or a processor or controller may include or may execute a variety of operating systems, including a personal computer operating system, such as Windows, iOS or Linux, or a mobile operating system, such as iOS, Android, or Windows Mobile, or the like. A computing device, or a processor or controller in an intelligent shading controller, may include or may execute a variety of possible applications, such as software applications enabling communication with other devices, such as communicating one or more messages via email, short message service (SMS), or multimedia message service (MMS), including via a network, such as a social network, including, for example,

Facebook, LinkedIn, Twitter, Flickr, or Google+, to provide only a few possible examples. A computing device or a processor or controller in an intelligent shading object may also include or execute an application to communicate content, such as, for example, textual content, multimedia content, or the like. A computing device or a processor or controller in an intelligent shading object may also include or execute an application to perform a variety of possible tasks, such as browsing, searching, and playing various forms of content, including locally stored or streamed content. The foregoing is provided to illustrate that claimed subject matter is intended to include a wide range of possible features or capabilities. A computing device or a processor or controller in an intelligent shading object may also include imaging software applications for capturing, processing, modifying and transmitting image files utilizing the optical device (e.g., camera, scanner, optical reader) within a mobile computing device.

[00248] A network link typically provides information communication using transmission media through one or more networks to other devices that use or process the information. For example, a network link may provide a connection through a network (LAN, WAN, Internet, packet-based or circuit-switched network) to a server, which may be operated by a third party housing and/or hosting service. For example, the server may be the server described in detail above. The server hosts a process that provides services in response to information received over the network, for example, application, database or storage services. It is contemplated that the components of the system can be deployed in various configurations within other computer systems, e.g., host and server.

[00249] For the purposes of this disclosure a computer readable medium stores computer data, which data can include computer program code that is executable by a computer, in machine readable form. By way of example, and not limitation, a computer readable medium may comprise computer readable storage media, for tangible or fixed storage of data, or communication media for transient

interpretation of code-containing signals. Computer readable storage media, as used herein, refers to physical or tangible storage (as opposed to signals) and includes without limitation volatile and non-volatile, removable and non-removable media implemented in any method or technology for the tangible storage of information such as computer-readable instructions, data structures, program modules or other data. Computer readable storage media includes, but is not limited to, RAM, ROM, EPROM, EEPROM, flash memory or other solid state memory technology, CD-ROM, DVD, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other physical or material medium which can be used to tangibly store the desired information or data or instructions and which can be accessed by a computer or processor.

[00250] For the purposes of this disclosure a system or module is a software, hardware, or firmware (or combinations thereof), process or functionality, or component thereof, that performs or facilitates the processes, features, and/or functions described herein (with or without human interaction or augmentation). A module can include sub-modules. Software components of a module may be stored on a computer readable medium. Modules may be integral to one or more servers, or be loaded and executed by one or more servers. One or more modules may be grouped into an engine or an application.

[00251] Those skilled in the art will recognize that the methods and systems of the present disclosure may be implemented in many manners and as such are not to be limited by the foregoing exemplary embodiments and examples. In other words, functional elements may be performed by single or multiple components, in various combinations of hardware and software or firmware, and individual functions may be distributed among software applications at either the client or server or both. In this regard, any number of the features of the different

embodiments described herein may be combined into single or multiple

embodiments, and alternate embodiments having fewer than, or more than, all of the features described herein are possible. Functionality may also be, in whole or in part, distributed among multiple components, in manners now known or to become known. Thus, myriad software/hardware/firmware combinations are possible in achieving the functions, features, interfaces and preferences described herein. Moreover, the scope of the present disclosure covers conventionally known manners for carrying out the described features and functions and interfaces, as well as those variations and modifications that may be made to the hardware or software or firmware components described herein as would be understood by those skilled in the art now and hereafter.

[00252] While certain exemplary techniques have been described and shown herein using various methods and systems, it should be understood by those skilled in the art that various other modifications may be made, and equivalents may be substituted, without departing from claimed subject matter. Additionally, many modifications may be made to adapt a particular situation to the teachings of claimed subject matter without departing from the central concept described herein. Therefore, it is intended that claimed subject matter not be limited to the particular examples disclosed, but that such claimed subject matter may also include all implementations falling within the scope of the appended claims, and equivalents thereof.